WorldWideScience

Sample records for quantification impacts environnemental

  1. Plutonium in the environment - bibliographic study and quantification; Impacts environnemental et sanitaire des isotopes du plutonium, etude bibliographique et quantification

    Energy Technology Data Exchange (ETDEWEB)

    Guetat, Ph; Monfort, M; Ansoborlo, E [CEA Marcoule, Dir. de l' Energie Nucleaire, 30 (France); Bion, L; Moulin, V; Reiller, P; Vercouter, Th [CEA Saclay, Dir. de l' Energie Nucleaire, 91 - Gif sur Yvette (France); Boucher, L; Jourdain, F; Van Dorpe, F [CEA Cadarache, Dir. de l' Energie Nucleaire, 13 - Saint Paul lez Durance (France); Comte, A; Flury Heard, A; Fritsch, P; Menetrier, F [CEA Fontenay-aux-Roses, Dir. des Sciences du Vivant, 92 (France)

    2008-07-01

    This document deals with the different isotopes of plutonium. It summarizes the main features of plutonium behaviour from sources inside installations to the environment and man, and reports the current knowledge about the different parameters used in the models for environmental and radiological impact assessment. The objective is to gather scientific information useful to decision-makers in case of accident or for regulation purposes. It first gives the main radiological and chemical characteristics needed to understand transfers between compartments. It then reports information on normal and accidental historical sources and present releases. The next part deals with transfer parameters inside installations and in the environment. The parameters that influence plutonium behaviour are examined, inside installations (physico-chemical forms and events that lead to releases) and outside in the environment, for deposition on soils and transfer to plants and animal products. A full chapter is dedicated to the presentation of typical assessments, for each isotope and for mixtures, and the correspondence between activity, mass and dose reference levels is presented and discussed. Transfer and behaviour in man and effects on health are finally presented. (author)

  2. Radioactive iodine and environmental and sanitary effects - bibliographic study and quantification; Iodes radioactifs et impacts environnemental et sanitaire - etude bibliographique et quantification

    Energy Technology Data Exchange (ETDEWEB)

    Guetat, Ph.; Armand, P.; Monfort, M.; Fritsch, P. [CEA Bruyeres-le-Chatel, 91 (France); Flury Herard, A. [CEA, Dir. des Sciences du Vivant, 75 - Paris (France); Menetrier, F. [CEA Fontenay-aux-Roses, Dir. des Sciences du Vivant, 92 (France); Bion, L. [CEA Saclay, Dir. de l' Energie Nucleaire (DEN), 91 - Gif sur Yvette (France); Schoech, C.; Masset, S. [Societe EX-IN - Expertise et Ingenierie, 92 - Le Plessis-Robinson (France)

    2004-07-01

    This document is intended for a general audience. It reviews the different parameters needed to evaluate the potential impact of radioactive releases, from the emission to the public. Its objectives are to evaluate the importance of the different exposure pathways and to assess the efficiency of the possible interventions for the general public. The main conclusions are summarised hereafter. The radioactive decay chains have to be taken into account to evaluate the iodine source term in nuclear plants in the case of fission accidents. The physico-chemical forms of iodine are important for determining the released activity and the activity deposited on the soil. The isotopes to be taken into account are mainly iodine-131 for radiological assessments, iodine-133 for nuclear reactor accidents, and the tellurium-132/iodine-132 chain when no particulate filtration exists. Iodine-129 in French reprocessing plants cannot lead to significant accidents. The dominant exposure pathways for inorganic iodine are related to the consumption of contaminated food products (vegetables, milk). The iodine transfer to goat and sheep milk is greater than that to cow milk. Meat production from herbivores grazing in the field is the most sensitive pathway, and the benefit of rapidly removing herbivores from pasture appears relatively clearly. Banning the consumption of local contaminated food products (vegetables and meat) may reduce the impact due to iodine-131 by about a factor of thirteen. The younger the population, the greater the thyroid radiosensitivity and its variability within the population. Oral administration of stable iodine limits transfers to maternal milk and the foetal thyroid. Ingestion of stable iodine is complementary to the consumption ban on local contaminated food products; the earlier the ingestion, the greater the efficiency. A release of 0.1 TBq of iodine-131 at a low height calls for only limited and local actions, whereas a release of 10 TBq calls for direct and immediate protection

  3. Automobile air-conditioning: its energy and environmental impact; La climatisation automobile impact energetique et environnemental

    Energy Technology Data Exchange (ETDEWEB)

    Barbusse, St.; Gagnepain, L.

    2003-05-01

    Over the last three decades, automobile manufacturers have made a lot of progress in specific fuel consumption and engine emissions of pollutants. Yet the impact of these improvements on vehicle consumption has been limited by increased dynamic performance (maximum speed, torque), increased safety (power steering and power brakes) and increased comfort (noise and vibration reduction, electric windows and thermal comfort). Because of this, real CO{sub 2}-emission levels of vehicles are still high in a context where road transport is a major factor in the balance of greenhouse gas emissions, and thus in complying with the international climate convention. Although European, Japanese and Korean manufacturers signed an important agreement with the European Commission to voluntarily reduce CO{sub 2} emissions from their vehicles, with a sales-weighted average emission goal of 140 grams per km on the MVEG approval cycle by 2008, it has to be noted that the European procedures for measuring fuel consumption and CO{sub 2} emissions do not take accessories into account, especially air-conditioning (A/C). The wide dissemination of this equipment, recognized as a big energy consumer and as using a refrigerant with a high global warming potential, led ADEME to implement a set of assessments of A/C's energy and environmental impact. These assessments include studies of vehicle equipment rates, analyses of the impact on fuel consumption as well as on regulated pollutant emissions in the exhaust, a characterization of refrigerant leakage levels and an estimate of greenhouse gas emissions for all air-conditioned vehicles. This leaflet summarizes the results of these actions. All of these studies and additional data are presented in greater detail in the document 'Automobile Air-conditioning' (ADEME reference no. 4985). (author)

  4. ENVIRONMENTAL IMPACT ASSESSMENT: How can the impact of transgenic oilseed rape on honey bees be studied?; EVALUATION DE L’IMPACT ENVIRONNEMENTAL : Comment étudier l’impact de colzas transgéniques sur les abeilles ?

    Directory of Open Access Journals (Sweden)

    Pierre Jacqueline

    2000-07-01

    Assessing the environmental impact of genetically modified plants implies, in the case of melliferous (bee-foraged) plants, taking their interactions with honey bees into account. Two types of interaction must be considered. On the one hand, the harmlessness of these plants to the bee must be verified. The bee plays an essential economic and ecological role, as a producer of honey and as a pollinator of many wild and cultivated plants. These activities rest on the bee's ability to identify and regularly visit plants likely to provide it with food in the form of nectar, which it turns into honey and which serves as its carbohydrate supply, and of pollen, its protein source. Any modification occurring in these plants can disturb the behaviour or physiology of bees and affect their honey yield or pollination efficiency. Such disturbances may stem either from direct effects linked to the presence of the transgene product (the protein encoded by the gene of interest introduced into the plant), or from indirect effects due to secondary changes in the plant's physiology associated with the introduction of the gene (pleiotropic effects). On the other hand, the bee's potential role as a vector of transgene dissemination must be taken into account. Moving from flower to flower, and in parallel with wind-borne pollen transport, the bee can help transfer the transgene via pollen and bring about intraspecific, or even interspecific, fertilization. Yet, particularly in the case of herbicide-resistance genes, the trait of interest should be strictly confined to the transgenic plants, avoiding interspecific crosses with

  5. Environmental impact of fats and their derivatives, formulated or not: biodegradability and ecotoxicity; Impact environnemental des corps gras et de leurs dérivés formulés ou non : biodégradabilité et écotoxicité

    Directory of Open Access Journals (Sweden)

    Bouillon Vincent

    2003-09-01

    The risks to the environment from the release of a given chemical depend essentially on the potential and duration of the product's exposure to the environment and on its toxicity. Fats and their derivatives are used in very diverse fields such as lubricants, solvents, surfactants and detergents. The environmental impact of these compounds can be demonstrated by standardized biodegradability and toxicity tests. Biodegradability is evaluated through various measurements: disappearance of the substance, evolution of the biochemical oxygen demand (BOD), oxygen consumption, carbon dioxide production and evolution of the gas composition around the biodegradation process. Toxicity is evaluated on various organisms such as bacteria, algae, crustaceans, fish and mammals. All of these measurements, explained in this article, are all the more important in that they feed into classifications, regulations and ecolabels.

  6. Monitoring of the radiological environmental impact of the AREVA site of Tricastin; Suivi de l'impact radiologique environnemental des activites du site AREVA du Tricastin

    Energy Technology Data Exchange (ETDEWEB)

    Mercat, C.; Brun, F.; Florens, P.; Petit, J. [AREVA NC Pierrelatte, Direction surete environnement du site du Tricastin, 26 (France); Garnier, F. [EURODIF Production, Direction qualite securite surete environnement, 26 (France); Devin, P. [AREVA NC Pierrelatte, Direction surete, sante, securite, environnement, 26 (France)

    2010-06-15

    Set up at the beginning of the site's operations in 1962, the monitoring of the radiological environmental impact of the AREVA site of Tricastin has evolved over time to meet more specifically the multiple objectives of environmental monitoring: to demonstrate compliance with the commitments required by the authorities, to be able to detect a dysfunction in the observed levels, to enable the assessment of the impacts of industrial activities, to ensure the balance between environmental quality and the use made of it by the local population, and to inform the public of the radiological state of the environment. Thousands of data points have been acquired on the radioactivity of all environmental compartments as well as on the functioning of local ecosystems. Today, the Network of Environmental Monitoring of AREVA Tricastin goes beyond the requirements of routine monitoring to provide innovative solutions for monitoring radioactivity (especially uranium) in the environment. (author)

  7. Energy systems. Tome 3: advanced cycles, low environmental impact innovative systems; Systeme energetiques, TOME 3: cycles avances, systemes innovants a faible impact environnemental

    Energy Technology Data Exchange (ETDEWEB)

    Gicquel, R

    2009-07-01

    This third tome about energy systems completes the two previous ones by presenting advanced thermodynamic cycles, in particular those with a low environmental impact, and by dealing with two other questions linked to the study of systems operating under changing regimes: the time management of energy, with the use of thermal and pneumatic storage systems and the time simulation (hourly, for instance) of systems (solar energy systems in particular); and the technological dimensioning and non-nominal regime operation studies. Because this last topic is particularly complex, new functionalities have been implemented, mainly by using the external classes mechanism, which allows the user to freely personalize his models. This tome is illustrated with about 50 examples of cycles modelled with the Thermoptim software. Contents: foreword; 1 - generic external classes; 2 - advanced gas turbine cycles; 3 - evaporation-concentration, mechanical steam compression, desalination, hot gas drying; 4 - cryogenic cycles; 5 - electrochemical converters; 6 - global warming, CO{sub 2} capture and sequestration; 7 - future nuclear reactors (coupled to Hirn and Brayton cycles); 8 - thermodynamic solar cycles; 10 - pneumatic and thermal storage; 11 - calculation of thermodynamic solar facilities; 12 - problem of technological dimensioning and non-nominal regime; 13 - exchanger modeling and parameterizing for dimensioning and the non-nominal regime; 14 - modeling and parameterizing of volumetric compressors; 15 - modeling and parameterizing of turbo-compressors and turbines; 16 - identification methodology of component parameters; 17 - case studies. (J.S.)

  8. Environmental Impact of Deep Gas Oil Desulfurization; Impact environnemental d'une désulfuration poussée des gazoles

    Directory of Open Access Journals (Sweden)

    Armengol C.

    2006-11-01

    In the space of a decade, diesel has made spectacular advances in the French and European automobile markets and could account, in 1995, for half of all private-vehicle registrations in France and a quarter in Western Europe. This situation inevitably raises problems: environmental problems, since the diesel engine is a larger source of nitrogen oxide and particulate emissions than the catalyst-equipped gasoline engine, but also problems for the refining industry, which in France is no longer able to meet the demand for diesel fuel. Moreover, from 1 October 1996, the sulfur content of road diesel must not exceed 0.05%, in accordance with the new European specifications. The prospect of producing highly desulfurized fuels will directly affect the refinery's hydrogen balance, and therefore its own fuel consumption and CO2 emissions. The objective of this study is to measure the environmental impact of reducing the sulfur content of diesel fuels from 0.3 to 0.05%. The assessment covers the entire energy chain, from oil extraction to fuel combustion in the engine. Gains and losses in terms of local and global pollution are evaluated according to the source of the hydrogen used (partial oxidation of vacuum residues or coal, steam reforming of natural gas or naphtha, electrolysis) and the nature of the feed treated (straight-run gas oil or light cycle oil) during hydrodesulfurization.

  9. Techno-economic analysis and environmental impact assessment of direct solar cooking in Morocco; Analyse technico-économique et évaluation de l’impact environnemental de la cuisson solaire directe au Maroc

    Directory of Open Access Journals (Sweden)

    Ndiaga MBODJI

    2017-09-01

    The objective of this study is to present a design methodology, carry out an economic analysis and evaluate the environmental impact of direct solar cooking systems in Morocco. To satisfy the energy needs of a five-person household, consuming a 3 kg meal at noon with a cooking time of 2.5 hours, a parabolic concentrator with a diameter of 1.4 m (useful area of 1.6 m²) is required. At the household level, the economic analysis revealed that the payback period of a direct solar cooker compared to butane varies from 4 to 10 years, depending on the rate of public subsidy. Where firewood is used, the payback period varies from 0.6 to 10 years, depending on the stove performance and the firewood price. At the national level, a 50% subsidy of direct solar cookers with a penetration rate of 50% in rural areas requires a budget of 1.61 billion dirhams (1 $US = 10 dirhams). This investment will allow the government to save 185 million dirhams a year in reduced butane subsidies, which corresponds to a payback period of about 8.7 years and a total profit of 1.45 billion dirhams over the cookers' 15-year lifetime. On the ecological side, the area of forest saved would be about 10,000 ha/year, and the amount of CO2 emissions avoided would be 1.08 Mt/year.
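
    The quoted national payback can be checked directly from the figures in the abstract; a minimal sketch (the simple undiscounted payback formula is our assumption about the authors' method):

```python
# Hypothetical check of the national-level figures quoted above.
subsidy_budget_mad = 1.61e9    # 50% solar-cooker subsidy programme, in dirhams
annual_savings_mad = 185e6     # yearly reduction in butane subsidies, in dirhams

payback_years = subsidy_budget_mad / annual_savings_mad
print(f"payback = {payback_years:.1f} years")   # -> 8.7, as stated in the abstract
```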

  10. ENVIRONMENTAL IMPACT ASSESSMENT: Assessing the impacts of the flow of transgenes conferring tolerance to different broad-spectrum herbicides; EVALUATION DE L’IMPACT ENVIRONNEMENTAL : Evaluation des impacts du flux de transgènes de tolérance à différents herbicides à large spectre

    Directory of Open Access Journals (Sweden)

    Astoin Marie-Florence

    2000-07-01

    This text is drawn from the report 'Introduction of genetically modified oilseed rape varieties tolerant to different herbicides: assessment of the agro-environmental impacts and proposed management scenarios', produced by Cetiom within the framework of the moratorium on genetically modified oilseed rape varieties. The authors reserve the right to amend this text before the final publication of the report.

  11. Environmental impact quantification and correlation between site ...

    African Journals Online (AJOL)

    The aim of this work was to quantify the most significant impact from the polluted environment and to review the correlation between pollution indicators and the content and structures of Tanacetum vulgare L. (tansy). Heavy metals such as mercury, lead, cadmium, chromium and nickel are considered pollution indicators.

  12. Nuclear and energies no. 57. Japan, another glance. The environmental and radiological impact. The international impact. The illusion of renewable energies in Japan; Nucleaire et energies no. 57. Le Japon, un autre regard. L'impact environnemental et radiologique. L'impact international. L'illusion des energies renouvelables au Japon

    Energy Technology Data Exchange (ETDEWEB)

    Lenail, B.

    2011-07-15

    The contributions of this publication first address the Japanese local context (organization, mentality, cultural background, modes of thinking and action), and then the environmental and radiological impact of the Fukushima accident, notably in comparison with Chernobyl (contamination is much more localized, sometimes higher; a larger affected population but quicker and more efficient protection measures; more severe consequences due to population displacement). The third article discusses the international impact of the accident: known or foreseeable consequences for nuclear programs, discussion of safety strengthening and governance, evolution of public opinion, possible consequences for climate negotiations. The last article proposes an overview of the current situation of Japan, which must mobilize all available energy resources to cope with its difficulties in electricity supply.

  13. Environmental and social impact study of the construction of a thermal gas plant at Kribi and a power transmission line between Kribi and Edea; Etude d'impact environnemental et social du projet de construction d'une centrale thermique a gaz a Kribi et d'une ligne de transport d'energie entre Kribi et Edea

    Energy Technology Data Exchange (ETDEWEB)

    Ndemanou, R. [Societe Africaine d' Expertise, Yaounde (Cameroon). Div. Environnement

    2009-04-15

    In recent years, the electric utility AES SONEL in Cameroon has focused on constructing a natural gas thermal power plant at Kribi to meet the growing electricity demand in the country. An environmental and social impact study has been conducted in accordance with Cameroon law to ensure that proper procedures are followed during construction of the project, which will involve eviction and resettlement of people. The methodology used for the impact studies took into account the justification for the project, field missions, identification of regulatory agencies, consultations, acquired data, potential impacts, and the management of a social and environmental plan of action. Environmental impacts were identified in terms of air quality, water quality, surface and groundwater quality, noise factors, traffic, soil and land utilization, fauna and flora, and landscape and visual impact. The social impact was evaluated through questionnaires and informal consultations. Several concerns were evaluated, notably the expropriation of affected persons; loss of cultural relics; immigration of potential workers into the area; and the economics and infrastructure of social services. The project was accepted by the population of Cameroon following public hearings and examination of the environmental and social impact study reports issued by the Interministerial Committee on the Environment. Financing has been successfully secured to date, and the start of the project is imminent. 4 figs.

  14. Quantification of environmental impacts of various energy technologies. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Selfors, A. [ed.]

    1994-10-01

    This report discusses problems related to the economic assessment of environmental impacts and abatement measures in connection with energy projects. Attention is called to the necessity of assessing environmental impacts both in the form of reduced economic welfare and in the form of costs of abatement measures to reduce the impact. In recent years, several methods for valuing environmental impacts have been developed, but the project shows that few empirical studies have been carried out. The final report indicates that some important factors are very difficult to evaluate. In addition, the environmental impacts of energy development in Norway vary considerably from project to project. This makes it difficult to obtain a good basis for comparing environmental impacts caused by different technologies, for instance hydroelectric power versus gas power, or wind versus hydroelectric power. It might be feasible, however, to carry out more detailed economic assessments of the environmental impacts of specific projects. 33 refs., 1 fig., 4 tabs.

  15. Environmental Management; Management Environnemental

    CERN Document Server

    Guthapfel, C

    1999-01-01

    Like any industry, CERN is responsible for the waste it produces, which is why the TFM/MS section of the ST division has developed a system for managing this waste. The Laboratory's activities lead it to produce nearly 3,000 tonnes of waste per year, corresponding to that of both a local community and an industrial site. Two classes of waste are distinguished: ordinary industrial waste (Déchets Industriels Banals) and special industrial waste (Déchets Industriels Spéciaux). Within each class, every type of waste is characterized by its mode of production, its collection and its disposal channels. The site's geographical situation, straddling the Franco-Swiss border, entails specific procedures and higher costs. As the environment is an integral part of the Organization's concerns, the service in charge of this activity has set itself the mission, taking into account the wishes of the various managers and actors involved, of optimizing its system...

  16. Nuclear and energy. Special issue on the Fukushima power plant; Nucleaire et energies numero special consacre a la centrale de Fukushima n. 57 juillet 2011 le Japon, un autre regard; l'impact environnemental et radiologique; l'impact international; l'illusion des energies renouvelables au Japon

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    This issue analyses the first consequences of the Fukushima accident at the world level, i.e. impacts which are either already noticeable or predictable. A first article proposes a portrait of Japan (its historical relationship with nature, its cultural education, the role of its bureaucracy, the Japanese business and political worlds) and describes the institutional organization of nuclear safety. It also presents the different companies involved in nuclear energy production. The second article discusses and comments on the environmental and radiological impact of the accident (protection of the inhabitants, environment monitoring, comparison with Chernobyl, main steps of degradation of the reactors, releases into the sea, total release assessment, soil contamination, food contamination, radiation protection). A third article discusses the international impact, notably for existing or projected power plants in different countries, in terms of public opinion, and with respect to negotiations on climate. The fourth article discusses the reactions of different countries possessing nuclear reactors. The last article questions the replacement of the lost production (that of Fukushima, and perhaps of another power plant) by renewable energies.

  17. Quantification of the Impact of Roadway Conditions on Emissions

    Science.gov (United States)

    2017-11-01

    The scope of this project involved developing a methodology to quantify the impact of road conditions on emissions and providing guidance to assist TxDOT in improving maintenance strategies to reduce gas emissions. The research quantified vehicle ...

  18. Ecological impacts of alien species: quantification, scope, caveats and recommendations

    Czech Academy of Sciences Publication Activity Database

    Kumschick, S.; Gaertner, M.; Vila, M.; Essl, F.; Jeschke, J.M.; Pyšek, Petr; Ricciardi, A.; Bacher, S.; Blackburn, T. M.; Dick, J. T. A.; Evans, T.; Hulme, P. E.; Kühn, I.; Mrugala, A.; Pergl, Jan; Rabitsch, W.; Richardson, D. M.; Sendek, A.; Winter, M.

    2015-01-01

    Vol. 65, No. 1 (2015), pp. 55-63. ISSN 0006-3568. R&D Projects: GA ČR(CZ) GAP504/11/1028; GA ČR GB14-36079G. Grant - others: AV ČR(CZ) AP1002. Program: Akademická prémie - Praemium Academiae. Institutional support: RVO:67985939. Keywords: biological invasions * impact * prediction. Subject RIV: EH - Ecology, Behaviour. Impact factor: 4.294, year: 2015

  19. OMEGA Thau: an environmental management and microbiological pollution warning tool for the Thau basin; OMEGA Thau : outil de management environnemental et de gestion de l'avertissement des pollutions microbiologiques du bassin de Thau

    OpenAIRE

    Brocard, Gilles; Derolez, Valerie; Serais, Ophelie; Fiandrino, Annie; Lequette, Camille; Lescoulier, Christophe; Benedetti, Murielle; Couton, Prunelle; Marty, Delphine

    2010-01-01

    The OMEGA Thau project (Outil de Management Environnemental et de Gestion de l'Avertissement de la lagune de Thau) is a research and development programme, with the Syndicat Mixte du Bassin de Thau as contracting authority, bringing together scientists, local authorities and communities, and shellfish-farming professionals. It aims to build a decision-support tool for managers to guide public investment across the watershed, in order to obtain a quality...

  20. The impact of reconstruction method on the quantification of DaTSCAN images

    Energy Technology Data Exchange (ETDEWEB)

    Dickson, John C.; Erlandsson, Kjell; Hutton, Brian F. [UCLH NHS Foundation Trust and University College London, Institute of Nuclear Medicine, London (United Kingdom); Tossici-Bolt, Livia [Southampton University Hospitals NHS Trust, Department of Medical Physics, Southampton (United Kingdom); Sera, Terez [University of Szeged, Department of Nuclear Medicine and Euromedic Szeged, Szeged (Hungary); Varrone, Andrea [Psychiatry Section and Stockholm Brain Institute, Karolinska Institute, Department of Clinical Neuroscience, Stockholm (Sweden); Tatsch, Klaus [EANM/European Network of Excellence for Brain Imaging, Vienna (Austria)

    2010-01-15

    Reconstruction of DaTSCAN brain studies using OS-EM iterative reconstruction offers better image quality and more accurate quantification than filtered back-projection. However, reconstruction must proceed for a sufficient number of iterations to achieve stable and accurate data. This study assessed the impact of the number of iterations on the image quantification, comparing the results of the iterative reconstruction with filtered back-projection data. A striatal phantom filled with {sup 123}I using striatal to background ratios between 2:1 and 10:1 was imaged on five different gamma camera systems. Data from each system were reconstructed using OS-EM (which included depth-independent resolution recovery) with various combinations of iterations and subsets to achieve up to 200 EM-equivalent iterations and with filtered back-projection. Using volume of interest analysis, the relationships between image reconstruction strategy and quantification of striatal uptake were assessed. For phantom filling ratios of 5:1 or less, significant convergence of measured ratios occurred close to 100 EM-equivalent iterations, whereas for higher filling ratios, measured uptake ratios did not display a convergence pattern. Assessment of the count concentrations used to derive the measured uptake ratio showed that nonconvergence of low background count concentrations caused peaking in higher measured uptake ratios. Compared to filtered back-projection, OS-EM displayed larger uptake ratios because of the resolution recovery applied in the iterative algorithm. The number of EM-equivalent iterations used in OS-EM reconstruction influences the quantification of DaTSCAN studies because of incomplete convergence and possible bias in areas of low activity due to the nonnegativity constraint in OS-EM reconstruction. Nevertheless, OS-EM using 100 EM-equivalent iterations provides the best linear discriminatory measure to quantify the uptake in DaTSCAN studies. (orig.)
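
    Two quantities this abstract relies on can be made explicit; a minimal sketch under assumed, standard definitions (not code from the study):

```python
# Assumed (standard) definitions, not code from the study.

def em_equivalent_iterations(osem_iterations: int, subsets: int) -> int:
    # One OS-EM iteration over all subsets approximates `subsets` full EM
    # updates, so 10 iterations x 10 subsets ~ 100 EM-equivalent iterations.
    return osem_iterations * subsets

def uptake_ratio(striatal_conc: float, background_conc: float) -> float:
    # Specific striatal uptake ratio from VOI count concentrations
    # (assumed form of the paper's "measured uptake ratio").
    return (striatal_conc - background_conc) / background_conc

print(em_equivalent_iterations(10, 10))   # 100
print(uptake_ratio(6.0, 1.2))             # 4.0, i.e. a 5:1 phantom filling ratio
```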

  1. Improving the energy self-sufficiency and environmental impact of a sunflower crushing plant by jointly installing a dehulling unit and a hull-fired boiler; Amélioration de l’autonomie énergétique et de l’impact environnemental d’une unité de trituration de tournesol par l’implantation conjointe d’un atelier de décorticage et d’une chaudière à coques

    Directory of Open Access Journals (Sweden)

    Tostain Sylvain

    2012-11-01

    Fibre-rich sunflower hulls have long been regarded as having a remarkable calorific value (5,000 kWh/t DM), very close to that of wood. Rising energy costs, emerging environmental concerns, and the fitness for use of sunflower-derived products have led to a growing interest in dehulling sunflower seeds prior to crushing, and in burning the hulls in biomass boilers to yield process steam on site. This has been made possible by prominent improvements in boiler technology. The torsional-chamber technology performs well in the full combustion of sunflower hulls, allowing high efficiency, great flexibility, and limited emission of pollutants. Yet fumes may still have to be post-treated to ensure compliance with stringent restrictions on dust emissions. Being a robust and versatile technology, the torsional chamber can cope with a feedstock quality that varies to a certain extent. The general design of a crushing plant fitted with a dehulling unit is impacted dramatically and becomes very sensitive to variations in the hullability of the incoming seeds. Hull content and seed size are correlated positively with hullability; moisture, density and oil content are correlated negatively. Hullability is affected mostly by environmental effects, with cultivars responsible to a lesser extent. Thus, hullability is influenced by upstream practices in plant breeding, field management, and grain-elevator management. Success in an efficient hulling strategy depends not only on the use of relevant technologies in processing plants, but also on knowledge of seed and meal customer needs, as well as on concerted actions at various levels along the sunflower chain.

  2. Identification and quantification of the hydrological impacts of imperviousness in urban catchments: a review.

    Science.gov (United States)

    Jacobson, Carol R

    2011-06-01

    Urbanisation produces numerous changes in the natural environments it replaces. The impacts include habitat fragmentation and changes to both the quality and quantity of the stormwater runoff, and result in changes to hydrological systems. This review integrates research in relatively diverse areas to examine how the impacts of urban imperviousness on hydrological systems can be quantified and modelled. It examines the nature of reported impacts of urbanisation on hydrological systems over four decades, including the effects of changes in imperviousness within catchments, and some inconsistencies in studies of the impacts of urbanisation. The distribution of imperviousness within urban areas is important in understanding the impacts of urbanisation and quantification requires detailed characterisation of urban areas. As a result most mapping of urban areas uses remote sensing techniques and this review examines a range of techniques using medium and high resolution imagery, including spectral unmixing. The third section examines the ways in which scientists and hydrological and environmental engineers model and quantify water flows in urban areas, the nature of hydrological models and methods for their calibration. The final section examines additional factors which influence the impact of impervious surfaces and some uncertainties that exist in current knowledge.

  3. Socio-environmental impact of marine sand mining on the ...; Impact socio-environnemental de l'exploitation du sable marin sur le ...

    African Journals Online (AJOL)

    Marine sand mining in this area began and continues to grow as a result of the commune's runaway urbanization, linked to the erection of ... In addition, the uncontrolled exploitation increased coastal erosion and does little to improve the conditions of economic and social life of the people who indulge ...

  4. Impact of polymeric membrane filtration of oil sands process water on organic compounds quantification.

    Science.gov (United States)

    Moustafa, Ahmed M A; Kim, Eun-Sik; Alpatova, Alla; Sun, Nian; Smith, Scott; Kang, Seoktae; Gamal El-Din, Mohamed

    2014-01-01

    The interaction between organic fractions in oil sands process-affected water (OSPW) and three polymeric membranes with varying hydrophilicity (nylon, polyvinylidene fluoride and polytetrafluoroethylene) at different pHs was studied to evaluate the impact of filtration on the quantification of acid-extractable fraction (AEF) and naphthenic acids (NAs). Four functional groups predominated in OSPW (amine, phosphoryl, carboxyl and hydroxyl) as indicated by the linear programming method. The nylon membranes were the most hydrophilic and exhibited the lowest AEF removal at pH of 8.7. However, the adsorption of AEF on the membranes increased as the pH of OSPW decreased due to hydrophobic interactions between the membrane surfaces and the protonated molecules. The use of ultra pressure liquid chromatography-high resolution mass spectrometry (UPLC/HRMS) showed insignificant adsorption of NAs on the tested membranes at pH 8.7. However, 26±2.4% adsorption of NAs was observed at pH 5.3 following the protonation of NAs species. For the nylon membrane, excessive carboxylic acids in the commercial NAs caused the formation of negatively charged assisted hydrogen bonds, resulting in increased adsorption at pH 8.2 (25%) as compared to OSPW (0%). The use of membranes for filtration of soluble compounds from complex oily wastewaters before quantification analysis of AEF and NAs should be examined prior to application.

  5. Impact factors and the optimal parameter of acoustic structure quantification in the assessment of liver fibrosis.

    Science.gov (United States)

    Huang, Yang; Liu, Guang-Jian; Liao, Bing; Huang, Guang-Liang; Liang, Jin-Yu; Zhou, Lu-Yao; Wang, Fen; Li, Wei; Xie, Xiao-Yan; Wang, Wei; Lu, Ming-De

    2015-09-01

    The aims of the present study are to assess the impact factors in acoustic structure quantification (ASQ) ultrasound and to find the optimal parameter for the assessment of liver fibrosis. Twenty healthy volunteers underwent ASQ examinations to evaluate impact factors in ASQ image acquisition and analysis. An additional 113 patients with liver diseases underwent standardized ASQ examinations, and the results were compared with histologic staging of liver fibrosis. We found that the right liver displayed lower values of ASQ parameters than the left (p = 0.000-0.021). Receive gain had no significant impact except at gain 70 (p = 0.193-1.000). With regard to the diameter of vessels included in the regions of interest, the group ≤2.0 mm differed significantly from the group 2.1-5.0 mm (p = 0.000-0.033) and the group >5.0 mm (p = 0.000-0.062). However, region-of-interest size (p = 0.438-1.000) and depth (p = 0.072-0.764) had no statistical impact. Good intra- and inter-operator reproducibility was found in both image acquisition and offline image analysis. In the liver fibrosis study, the focal disturbance ratio had the highest correlation with histologic fibrosis stage (r = 0.67, p < [...]), making it the optimal parameter for the assessment of liver fibrosis.

  6. Plasticity Detection and Quantification in Monopile Support Structures Due to Axial Impact Loading

    Directory of Open Access Journals (Sweden)

    Meijers P.C.

    2018-01-01

    Recent developments in the construction of offshore wind turbines have created the need for a method to detect whether a monopile foundation is plastically deformed during the installation procedure. Since measurements at the pile head are difficult to perform, a method based on measurements at a certain distance below the pile head is proposed in this work for quantification of the amount of plasticity. By considering a one-dimensional rod model with an elastic-perfectly plastic constitutive relation, it is shown that the occurrence of plastic deformation caused by an impact load can be detected from these measurements. Furthermore, this plastic deformation can be quantified from the same measurement with the help of an energy balance. The effectiveness of the proposed method is demonstrated via a numerical example.
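
    The elastic-perfectly plastic constitutive relation named above has a compact 1-D return-mapping form; a minimal sketch (the steel-like parameters are illustrative assumptions, not the paper's values):

```python
import numpy as np

# Minimal 1-D elastic-perfectly plastic stress update (return mapping).
E = 210e9         # Young's modulus [Pa], steel-like assumption
SIGMA_Y = 355e6   # yield stress [Pa], steel-like assumption

def stress_update(strain: float, plastic_strain: float):
    """Return (stress, updated plastic strain, yielded?) for a total strain."""
    trial = E * (strain - plastic_strain)     # elastic trial stress
    if abs(trial) <= SIGMA_Y:
        return trial, plastic_strain, False   # purely elastic step
    stress = np.sign(trial) * SIGMA_Y         # project onto the yield surface
    plastic_strain = strain - stress / E      # excess strain becomes permanent
    return stress, plastic_strain, True

print(stress_update(0.003, 0.0))   # ~(355 MPa, 0.0013 permanent strain, True)
```

    A nonzero residual plastic strain after the impact wave has passed is exactly the signature the proposed energy-balance method aims to detect and quantify.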

  7. Quantification and sensory studies of character impact odorants of different soybean lecithins.

    Science.gov (United States)

    Stephan, A; Steinhart, H

    1999-10-01

    Fifty-four potent odorants in standardized, hydrolyzed, and deoiled and hydrolyzed soybean lecithins were quantified by high-resolution gas chromatography/mass spectrometry (HRGC/MS). The characterization of their aroma impact was performed by calculation of nasal (n) and retronasal (r) odor activity values (OAVs). For this, the nasal and retronasal recognition thresholds of 18 odor-active compounds were determined in vegetable oil. The following compounds showed the highest nOAVs: 2,3-diethyl-5-methylpyrazine, methylpropanal, acetic acid, pentanoic acid, 2-ethyl-3,5-dimethylpyrazine, pentylpyridine, (Z)-1,5-octadien-3-one, 2-methylbutanal, and beta-damascenone. In addition to the compounds above, 1-octen-3-one, 1-nonen-3-one, and 3-methyl-2,4-nonanedione showed potent rOAVs. The results of quantification and OAV calculation were confirmed by a model mixture of 25 impact odorants, which yielded a sensory profile highly similar to that of the original soybean lecithin. The sensory importance of pyrazines and free acids increased through enzymatic hydrolysis and decreased through the process of deoiling. The impact of unsaturated ketones on the lecithin aroma was not changed by either process.
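
    The OAV arithmetic behind the nasal/retronasal rankings is simply concentration over threshold; a minimal sketch (the values below are placeholders, not the paper's data):

```python
# OAV = concentration / recognition threshold in the matrix (vegetable oil in
# the study); placeholder numbers, not the paper's measurements.

def odor_activity_value(concentration: float, threshold: float) -> float:
    """Both arguments in the same units (e.g. micrograms per kilogram of oil)."""
    return concentration / threshold

nOAV = odor_activity_value(12.0, 0.5)   # nasal: uses the nasal threshold
rOAV = odor_activity_value(12.0, 1.5)   # retronasal: uses the retronasal threshold
print(nOAV, rOAV)   # compounds with OAV > 1 are treated as aroma-active
```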

  8. Impact of the Definition of Peak Standardized Uptake Value on Quantification of Treatment Response

    Science.gov (United States)

    Vanderhoek, Matt; Perlman, Scott B.; Jeraj, Robert

    2012-01-01

    PET-based treatment response assessment typically measures the change in maximum standardized uptake value (SUVmax), which is adversely affected by noise. Peak SUV (SUVpeak) has been recommended as a more robust alternative, but its associated region of interest (ROIpeak) is not uniquely defined. We investigated the impact of different ROIpeak definitions on quantification of SUVpeak and tumor response. Methods: Seventeen patients with solid malignancies were treated with a multitargeted receptor tyrosine kinase inhibitor, resulting in a variety of responses. Using the cellular proliferation marker 3′-deoxy-3′-18F-fluorothymidine (18F-FLT), whole-body PET/CT scans were acquired at baseline and during treatment. 18F-FLT-avid lesions (~2/patient) were segmented on PET images, and tumor response was assessed via the relative change in SUVpeak. For each tumor, 24 different SUVpeaks were determined by changing ROIpeak shape (circles vs. spheres), size (7.5-20 mm), and location (centered on SUVmax vs. placed in the highest-uptake region), encompassing different definitions from the literature. Within each tumor, variations in the 24 SUVpeaks and tumor responses were measured using the coefficient of variation (CV), standard deviation (SD), and range. For each ROIpeak definition, a population-average SUVpeak and tumor response were determined over all tumors. Results: A substantial variation in both SUVpeak and tumor response resulted from changing the ROIpeak definition. The variable ROIpeak definition led to an intratumor SUVpeak variation ranging from 49% above to 46% below the mean (CV, 17%) and an intratumor SUVpeak response variation ranging from 49% above to 35% below the mean (SD, 9%). The variable ROIpeak definition led to a population average SUVpeak variation ranging from 24% above to 28% below the mean (CV, 14%) and a population average SUVpeak response variation ranging from only 3% above to 3% below the mean (SD, 2%). The size of ROIpeak caused more
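
    The ROIpeak ambiguity is easy to make concrete; a sketch of two of the conventions compared (isotropic voxels, the sphere sizes and the scipy-based mean filter are our assumptions, not the study's implementation):

```python
import numpy as np
from scipy import ndimage

def suv_peak(suv, diameter_mm=12.0, voxel_mm=4.0, centred_on_max=True):
    """SUVpeak of a 3-D SUV volume under two ROIpeak conventions."""
    r = diameter_mm / (2.0 * voxel_mm)              # sphere radius in voxels
    n = int(r)
    zz, yy, xx = np.ogrid[-n:n + 1, -n:n + 1, -n:n + 1]
    kernel = (xx**2 + yy**2 + zz**2 <= r**2).astype(float)
    kernel /= kernel.sum()                          # spherical mean filter
    local_mean = ndimage.convolve(suv, kernel, mode="nearest")
    if centred_on_max:                              # sphere centred on SUVmax
        return local_mean[np.unravel_index(np.argmax(suv), suv.shape)]
    return float(local_mean.max())                  # highest-uptake placement
```

    On noisy data the two branches generally disagree, which is one source of the variation the study reports.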

  9. The impact of respiratory motion on tumor quantification and delineation in static PET/CT imaging

    International Nuclear Information System (INIS)

    Liu Chi; Pierce II, Larry A; Alessio, Adam M; Kinahan, Paul E

    2009-01-01

    Our aim is to investigate the impact of respiratory motion on tumor quantification and delineation in static PET/CT imaging using a population of patient respiratory traces. A total of 1295 respiratory traces acquired during whole body PET/CT imaging were classified into three types according to the qualitative shape of their signal histograms. Each trace was scaled to three diaphragm motion amplitudes (6 mm, 11 mm and 16 mm) to drive a whole body PET/CT computer simulation that was validated with a physical phantom experiment. Three lung lesions and one liver lesion were simulated with diameters of 1 cm and 2 cm. PET data were reconstructed using the OS-EM algorithm with attenuation correction using CT images at the end-expiration phase and respiratory-averaged CT. The errors of the lesion maximum standardized uptake values (SUVmax) and lesion volumes between motion-free and motion-blurred PET/CT images were measured and analyzed. For respiration with 11 mm diaphragm motion and larger quiescent period fraction, respiratory motion can cause a mean lesion SUVmax underestimation of 28% and a mean lesion volume overestimation of 130% in PET/CT images with 1 cm lesions. The errors of lesion SUVmax and volume are larger for patient traces with larger motion amplitudes. Smaller lesions are more sensitive to respiratory motion than larger lesions for the same motion amplitude. Patient respiratory traces with relatively larger quiescent period fraction yield results less subject to respiratory motion than traces with long-term amplitude variability. Mismatched attenuation correction due to respiratory motion can cause SUVmax overestimation for lesions in the lower lung region close to the liver dome. Using respiratory-averaged CT for attenuation correction yields smaller mismatch errors than those using end-expiration CT. Respiratory motion can have a significant impact on static oncological PET/CT imaging where SUV and/or volume measurements are important. The impact
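
    The direction of both reported errors can be reproduced with a toy 1-D blur; a sketch (a Gaussian motion kernel and a 50%-of-max delineation threshold are our assumptions, not the paper's simulation pipeline):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

dx = 0.1                                   # grid spacing [mm]
x = np.arange(-30.0, 30.0, dx)
lesion = np.exp(-x**2 / (2 * 5.0**2))      # ~1 cm-scale lesion profile
blurred = gaussian_filter1d(lesion, sigma=11.0 / 2.355 / dx)  # ~11 mm FWHM motion

for name, p in (("static", lesion), ("blurred", blurred)):
    extent = (p > 0.5 * p.max()).sum() * dx        # delineated extent [mm]
    print(f"{name}: max = {p.max():.2f}, extent = {extent:.1f} mm")
# the blurred peak drops to ~0.73 while the delineated extent grows by ~1/3,
# i.e. SUVmax is depressed and threshold-based volume is inflated
```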

  10. Land grabbing: a preliminary quantification of economic impacts on rural livelihoods.

    Science.gov (United States)

    Davis, Kyle F; D'Odorico, Paolo; Rulli, Maria Cristina

    2014-01-01

    Global demands on agricultural land are increasing due to population growth, dietary changes and the use of biofuels. Their effect on food security is to reduce humans' ability to cope with the uncertainties of global climate change. In light of the 2008 food crisis, to secure reliable future access to sufficient agricultural land, many nations and corporations have begun purchasing large tracts of land in the global South, a phenomenon deemed "land grabbing" by popular media. Because land investors frequently export crops without providing adequate employment, this represents an effective income loss for local communities. We study 28 countries targeted by large-scale land acquisitions [comprising 87 % of reported cases and 27 million hectares (ha)] and estimate the effects of such investments on local communities' incomes. We find that this phenomenon can potentially affect the incomes of ~12 million people globally with implications for food security, poverty levels and urbanization. While it is important to note that our study incorporates a number of assumptions and limitations, it provides a much needed initial quantification of the economic impacts of large-scale land acquisitions on rural livelihoods.

  11. Uncertainty Quantification of the Reverse Taylor Impact Test and Localized Asynchronous Space-Time Algorithm

    Science.gov (United States)

    Subber, Waad; Salvadori, Alberto; Lee, Sangmin; Matous, Karel

    2017-06-01

    The reverse Taylor impact is a common experiment to investigate the dynamic response of materials at high strain rates. To better understand the physical phenomena and to provide a platform for code validation and Uncertainty Quantification (UQ), a co-designed simulation and experimental paradigm is investigated. For validation under uncertainty, quantities of interest (QOIs) within subregions of the computational domain are introduced. For such simulations, where regions of interest can be identified, the computational cost of UQ can be reduced by confining the random variability within these regions of interest. This observation inspired us to develop an asynchronous space and time computational algorithm with localized UQ. In the region of interest, high-resolution space and time discretization schemes are used for a stochastic model. Away from the region of interest, low spatial and temporal resolutions are allowed for a stochastic model with a low-dimensional representation of uncertainty. The model is exercised on linear elastodynamics and shows potential for reducing the UQ computational cost. Although we consider wave propagation in solids, the proposed framework is general and can be used for fluid flow problems as well. Department of Energy, National Nuclear Security Administration (PSAAP-II).

  12. Quantification of Accelerometer Derived Impacts Associated With Competitive Games in National Collegiate Athletic Association Division I College Football Players.

    Science.gov (United States)

    Wellman, Aaron D; Coad, Sam C; Goulet, Grant C; McLellan, Christopher P

    2017-02-01

    Wellman, AD, Coad, SC, Goulet, GC, and McLellan, CP. Quantification of accelerometer derived impacts associated with competitive games in National Collegiate Athletic Association division I college football players. J Strength Cond Res 31(2): 330-338, 2017 - The aims of the present study were to (a) examine positional impact profiles of National Collegiate Athletic Association (NCAA) Division I college football players using global positioning system (GPS) and integrated accelerometry (IA) technology and (b) determine whether positional differences in impact profiles during competition exist within offensive and defensive teams. Thirty-three NCAA Division I Football Bowl Subdivision players were monitored using GPS and IA (GPSports) during 12 regular-season games throughout the 2014 season. Individual player data sets (n = 294) were divided into offensive and defensive teams, and positional subgroups. The intensity, number, and distribution of impact forces experienced by players during competition were recorded. Positional differences were found for the distribution of impacts within offensive and defensive teams. Wide receivers sustained more very light and light-to-moderate (5-6.5 G force) impacts than other position groups, whereas the running backs were involved in more severe (>10 G force) impacts than all offensive position groups, with the exception of the quarterbacks (p ≤ 0.05). The defensive back and linebacker groups were subject to more very light (5.0-6.0 G force) impacts, and the defensive tackle group sustained more heavy and very heavy (7.1-10 G force) impacts than other defensive positions (p ≤ 0.05). Data from the present study provide novel quantification of positional impact profiles related to the physical demands of college football games and highlight the need for position-specific monitoring and training in the preparation for the impact loads experienced during NCAA Division I football competition.
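
    Binning per-impact peak accelerations into zones like those quoted above is straightforward; a minimal sketch (the zone edges are a simplified reading of the abstract, which quotes slightly different bands per position group, not GPSports' own definitions):

```python
import numpy as np

# Illustrative impact zones in G-force; edges are assumptions.
ZONES = [("very light / light-moderate", 5.0, 6.5),
         ("moderate", 6.5, 7.1),
         ("heavy / very heavy", 7.1, 10.0),
         ("severe", 10.0, np.inf)]

def impact_profile(peak_g):
    """Count impacts per zone from per-impact peak accelerations in G."""
    peak_g = np.asarray(peak_g, dtype=float)
    return {label: int(((peak_g >= lo) & (peak_g < hi)).sum())
            for label, lo, hi in ZONES}

print(impact_profile([5.2, 6.8, 9.4, 11.3]))   # one impact in each zone
```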

  13. Impact of acid atmospheric deposition on soils : quantification of chemical and hydrologic processes

    NARCIS (Netherlands)

    Grinsven, van J.J.M.

    1988-01-01

    Atmospheric deposition of SOx, NOx and NHx will cause major changes in the chemical composition of solutions in acid soils, which may affect the biological functions of the soil. This thesis deals with the quantification of soil acidification by means of chemical

  14. Environmental Impact of Munition and Propellant Disposal (Impact Environnemental de l’Elimination des Munitions et des Combustibles)

    Science.gov (United States)

    2010-02-01

    [Table-of-contents fragments: '... the Future of Demil'; 'Poster Session'; 'The Situation in Lithuania: The Studies on the Explosive Contamination, Their Toxic...'; 'Annex A - Presentations and Documents Supporting Capability Assessments'; 'Annex B - Presentations, Papers/Posters and Videos from the Sofia Meeting'; 'Annex C...'] ... on the environment. The meeting included participants from NATO and from partners including Russia and Georgia. The sessions were

  15. The impact of targeting repetitive BamHI-W sequences on the sensitivity and precision of EBV DNA quantification.

    Directory of Open Access Journals (Sweden)

    Armen Sanosyan

    Viral load monitoring and early Epstein-Barr virus (EBV) DNA detection are essential in routine laboratory testing, especially in the preemptive management of Post-transplant Lymphoproliferative Disorder. Targeting the repetitive BamHI-W sequence was shown to increase the sensitivity of EBV DNA quantification, but the variability of BamHI-W reiterations was suggested to be a source of quantification bias. We aimed to assess the extent of variability associated with BamHI-W PCR and its impact on the sensitivity of EBV DNA quantification using the 1st WHO international standard, EBV strains and clinical samples. Repetitive BamHI-W and single LMP2 sequences were amplified by in-house qPCRs and the BXLF-1 sequence by a commercial assay (EBV R-gene™, BioMerieux). Linearity and limits of detection of the in-house methods were assessed. The impact of repeated versus single target sequences on EBV DNA quantification precision was tested on B95.8 and Raji cell lines, possessing 11 and 7 copies of the BamHI-W sequence, respectively, and on clinical samples. BamHI-W qPCR demonstrated a lower limit of detection compared to LMP2 qPCR (2.33 log10 versus 3.08 log10 IU/mL; P = 0.0002). BamHI-W qPCR underestimated the EBV DNA load on the Raji strain, which contains fewer BamHI-W copies than the WHO standard derived from the B95.8 EBV strain (mean bias: -0.21 log10; 95% CI, -0.54 to 0.12). Comparison of BamHI-W qPCR versus LMP2 and BXLF-1 qPCR showed acceptable variability between EBV DNA levels in clinical samples, with the mean bias being within 0.5 log10 IU/mL EBV DNA, whereas better quantitative concordance was observed between the LMP2 and BXLF-1 assays. Targeting BamHI-W resulted in higher sensitivity compared to LMP2, but the variable reiterations of the BamHI-W segment are associated with higher quantification variability. BamHI-W can be considered for clinical and therapeutic monitoring to detect early EBV DNA and dynamic changes in viral load.

  16. The impact of targeting repetitive BamHI-W sequences on the sensitivity and precision of EBV DNA quantification.

    Science.gov (United States)

    Sanosyan, Armen; Fayd'herbe de Maudave, Alexis; Bollore, Karine; Zimmermann, Valérie; Foulongne, Vincent; Van de Perre, Philippe; Tuaillon, Edouard

    2017-01-01

    Viral load monitoring and early Epstein-Barr virus (EBV) DNA detection are essential in routine laboratory testing, especially in the preemptive management of Post-transplant Lymphoproliferative Disorder. Targeting the repetitive BamHI-W sequence was shown to increase the sensitivity of EBV DNA quantification, but the variability of BamHI-W reiterations was suggested to be a source of quantification bias. We aimed to assess the extent of variability associated with BamHI-W PCR and its impact on the sensitivity of EBV DNA quantification using the 1st WHO international standard, EBV strains and clinical samples. Repetitive BamHI-W and single LMP2 sequences were amplified by in-house qPCRs and the BXLF-1 sequence by a commercial assay (EBV R-gene™, BioMerieux). Linearity and limits of detection of the in-house methods were assessed. The impact of repeated versus single target sequences on EBV DNA quantification precision was tested on B95.8 and Raji cell lines, possessing 11 and 7 copies of the BamHI-W sequence, respectively, and on clinical samples. BamHI-W qPCR demonstrated a lower limit of detection compared to LMP2 qPCR (2.33 log10 versus 3.08 log10 IU/mL; P = 0.0002). BamHI-W qPCR underestimated the EBV DNA load on the Raji strain, which contains fewer BamHI-W copies than the WHO standard derived from the B95.8 EBV strain (mean bias: -0.21 log10; 95% CI, -0.54 to 0.12). Comparison of BamHI-W qPCR versus LMP2 and BXLF-1 qPCR showed acceptable variability between EBV DNA levels in clinical samples, with the mean bias being within 0.5 log10 IU/mL EBV DNA, whereas better quantitative concordance was observed between the LMP2 and BXLF-1 assays. Targeting BamHI-W resulted in higher sensitivity compared to LMP2, but the variable reiterations of the BamHI-W segment are associated with higher quantification variability. BamHI-W can be considered for clinical and therapeutic monitoring to detect early EBV DNA and dynamic changes in viral load.
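
    The reported Raji bias is consistent with simple copy-number arithmetic; a minimal sketch (our interpretation of the mechanism, not the authors' code):

```python
import math

# A BamHI-W assay calibrated on the B95.8-derived WHO standard sees 11 target
# repeats per genome; Raji carries only 7, so its load reads low.
copies_standard = 11    # BamHI-W repeats per B95.8 genome (WHO standard)
copies_raji = 7         # BamHI-W repeats per Raji genome

expected_bias = math.log10(copies_raji / copies_standard)
print(f"expected bias: {expected_bias:+.2f} log10")   # -0.20 vs. -0.21 reported
```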

  17. Quantification of Impact of Orbital Drift on Inter-Annual Trends in AVHRR NDVI Data

    Directory of Open Access Journals (Sweden)

    Jyoteshwar R. Nagol

    2014-07-01

    The Normalized Difference Vegetation Index (NDVI) time-series data derived from the Advanced Very High Resolution Radiometer (AVHRR) have been extensively used for studying inter-annual dynamics of global and regional vegetation. However, there can be significant uncertainties in the data due to incomplete atmospheric correction and orbital drift of the satellites through their active life. Access to location-specific quantification of uncertainty is crucial for appropriate evaluation of trends and anomalies. This paper provides per-pixel quantification of orbital-drift-related spurious trends in the Long Term Data Record (LTDR) AVHRR NDVI data product. The magnitude and direction of the spurious trends were estimated by direct comparison with data from the MODerate resolution Imaging Spectroradiometer (MODIS) Aqua instrument, which has stable inter-annual sun-sensor geometry. The maps show the presence of both positive and negative spurious trends in the data. After application of the BRDF correction, an overall decrease in positive trends and an increase in the number of pixels with negative spurious trends were observed. The mean global spurious inter-annual NDVI trend before and after BRDF correction was 0.0016 and −0.0017, respectively. The research presented in this paper gives valuable insight into the magnitude of orbital-drift-related trends in the AVHRR NDVI data, as well as the degree to which they are rectified by the MODIS BRDF correction algorithm used by the LTDR processing stream.
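
    The per-pixel comparison described above amounts to differencing linear trends of co-located time series; a minimal sketch (the array layout and least-squares slope formula are our assumptions):

```python
import numpy as np

def spurious_trend(avhrr, modis, years):
    """avhrr, modis: (n_years, ny, nx) annual NDVI composites on a common grid;
    returns the (ny, nx) AVHRR-minus-MODIS trend difference, treating the
    MODIS Aqua series (stable sun-sensor geometry) as drift-free."""
    t = np.asarray(years, dtype=float)
    t = t - t.mean()                        # centred time axis

    def slope(cube):
        # per-pixel least-squares slope: sum_i t_i * y_i / sum_i t_i**2
        return np.tensordot(t, cube, axes=1) / (t**2).sum()

    return slope(avhrr) - slope(modis)      # NDVI units per year
```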

  18. Consistent quantification of climate impacts due to biogenic carbon storage across a range of bio-product systems

    International Nuclear Information System (INIS)

    Guest, Geoffrey; Bright, Ryan M.; Cherubini, Francesco; Strømman, Anders H.

    2013-01-01

    Temporary and permanent carbon storage from biogenic sources is seen as a way to mitigate climate change. The aim of this work is to illustrate the need to harmonize the quantification of such mitigation across all possible storage pools in the bio- and anthroposphere. We investigate nine alternative storage cases and a wide array of bio-resource pools: from annual crops, short rotation woody crops, medium rotation temperate forests, and long rotation boreal forests. For each feedstock type and biogenic carbon storage pool, we quantify the carbon cycle climate impact due to the skewed time distribution between emission and sequestration fluxes in the bio- and anthroposphere. Additional consideration of the climate impact from albedo changes in forests is also illustrated for the boreal forest case. When characterizing climate impact with global warming potentials (GWP), we find a large variance in results which is attributed to different combinations of biomass storage and feedstock systems. The storage of biogenic carbon in any storage pool does not always confer climate benefits: even when biogenic carbon is stored long-term in durable product pools, the climate outcome may still be undesirable when the carbon is sourced from slow-growing biomass feedstock. For example, when biogenic carbon from Norway Spruce from Norway is stored in furniture with a mean lifetime of 43 years, a climate change impact of 0.08 kg CO2 eq per kg CO2 stored (100-year time horizon (TH)) would result. It was also found that when biogenic carbon is stored in a pool with negligible leakage to the atmosphere, the resulting GWP factor is not necessarily −1 kg CO2 eq per kg CO2 stored. As an example, when biogenic CO2 from Norway Spruce biomass is stored in geological reservoirs with no leakage, we estimate a GWP of −0.56 kg CO2 eq per kg CO2 stored (100-year TH) when albedo effects are also included. The large variance in GWPs across the range of resource and carbon storage
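
    The sign and rough magnitude of such storage factors can be illustrated with a toy calculation in which one unit of CO2 is removed from the atmosphere, held in a product pool for its lifetime, and then re-emitted, with the airborne fraction following a standard impulse-response fit. This is a simplified sketch only: the coefficients are generic Joos et al.-style literature values, and it ignores the biomass growth dynamics and albedo terms that drive the paper's reported numbers.

      import numpy as np

      # CO2 impulse-response fit (Joos et al. 2013-style coefficients; generic
      # literature values, not parameters from this paper).
      a = [0.2173, 0.2240, 0.2824, 0.2763]
      tau = [394.4, 36.54, 4.304]

      def airborne_fraction(t):
          return a[0] + sum(ai * np.exp(-t / ti) for ai, ti in zip(a[1:], tau))

      def agwp(years):
          # Time-integrated airborne fraction of a 1 kg CO2 pulse (relative units).
          t = np.linspace(0.0, years, 10_000)
          return np.trapz(airborne_fraction(t), t)

      TH, L = 100.0, 43.0   # time horizon and product (furniture) lifetime, years
      # Remove 1 kg CO2 at t = 0, re-emit at t = L, normalize by a pulse emission:
      gwp_storage = (agwp(TH - L) - agwp(TH)) / agwp(TH)
      print(f"{gwp_storage:+.2f} kg CO2-eq per kg CO2 stored")   # negative = benefit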

  19. Consistent quantification of climate impacts due to biogenic carbon storage across a range of bio-product systems

    Energy Technology Data Exchange (ETDEWEB)

    Guest, Geoffrey, E-mail: geoffrey.guest@ntnu.no; Bright, Ryan M., E-mail: ryan.m.bright@ntnu.no; Cherubini, Francesco, E-mail: francesco.cherubini@ntnu.no; Strømman, Anders H., E-mail: anders.hammer.stromman@ntnu.no

    2013-11-15

    Temporary and permanent carbon storage from biogenic sources is seen as a way to mitigate climate change. The aim of this work is to illustrate the need to harmonize the quantification of such mitigation across all possible storage pools in the bio- and anthroposphere. We investigate nine alternative storage cases and a wide array of bio-resource pools: from annual crops, short rotation woody crops, medium rotation temperate forests, and long rotation boreal forests. For each feedstock type and biogenic carbon storage pool, we quantify the carbon cycle climate impact due to the skewed time distribution between emission and sequestration fluxes in the bio- and anthroposphere. Additional consideration of the climate impact from albedo changes in forests is also illustrated for the boreal forest case. When characterizing climate impact with global warming potentials (GWP), we find a large variance in results which is attributed to different combinations of biomass storage and feedstock systems. The storage of biogenic carbon in any storage pool does not always confer climate benefits: even when biogenic carbon is stored long-term in durable product pools, the climate outcome may still be undesirable when the carbon is sourced from slow-growing biomass feedstock. For example, when biogenic carbon from Norway Spruce from Norway is stored in furniture with a mean lifetime of 43 years, a climate change impact of 0.08 kg CO2 eq per kg CO2 stored (100-year time horizon (TH)) would result. It was also found that when biogenic carbon is stored in a pool with negligible leakage to the atmosphere, the resulting GWP factor is not necessarily −1 kg CO2 eq per kg CO2 stored. As an example, when biogenic CO2 from Norway Spruce biomass is stored in geological reservoirs with no leakage, we estimate a GWP of −0.56 kg CO2 eq per kg CO2 stored (100-year TH) when albedo effects are also included. The large variance in GWPs across the range of

  20. Quantification of the impact of endometriosis symptoms on health-related quality of life and work productivity.

    Science.gov (United States)

    Fourquet, Jessica; Báez, Lorna; Figueroa, Michelle; Iriarte, R Iván; Flores, Idhaliz

    2011-07-01

    To quantify the impact of endometriosis-related symptoms on physical and mental health status, health-related quality of life, and work-related aspects (absenteeism, presenteeism, work productivity, and activity impairment). Cross-sectional quantitative study. Academic and research institution. Women (n = 193) with self-reported surgically diagnosed endometriosis from the Endometriosis Patient Registry at Ponce School of Medicine and Health Sciences (PSMHS). Anonymous questionnaire divided into three sections consisting of questions from the Patient Health Survey (SF-12), the Endometriosis Health Profile (EHP-5), and the Work Productivity and Activity Impairment Survey (WPAI). Quantification of the impact of endometriosis symptoms on physical and mental health status, health-related quality of life, absenteeism, presenteeism, work productivity, and activity impairment. Patients had SF-12 scores denoting statistically significant disability in the physical and mental health components. They also reported an average of 7.41 hours (approximately one working day) of work time lost during the week when symptoms were worst. In addition, the WPAI scores showed a high impact on work-related domains: 13% average loss in work time (absenteeism), 65% of work impaired (presenteeism), 64% loss in efficiency levels (work productivity loss), and 60% of daily activities perturbed (activity impairment). Endometriosis symptoms such as chronic, incapacitating pelvic pain and infertility negatively and substantially impact the physical and mental health status, health-related quality of life, and productivity at work of women. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
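
    The four WPAI outcomes are simple ratios of self-reported hours and 0-10 "degree affected" items. A sketch using the published generic WPAI formulas (the inputs below are hypothetical, not patient data from this study):

      def wpai_scores(hours_missed, hours_worked, work_degree, activity_degree):
          """Generic WPAI scoring; the 'degree' items are 0-10 scales.
          These are the published instrument formulas, not restated in the abstract."""
          absenteeism = hours_missed / (hours_missed + hours_worked)
          presenteeism = work_degree / 10.0
          overall = absenteeism + (1.0 - absenteeism) * presenteeism
          activity = activity_degree / 10.0
          return {"absenteeism %": 100 * absenteeism,
                  "presenteeism %": 100 * presenteeism,
                  "work productivity loss %": 100 * overall,
                  "activity impairment %": 100 * activity}

      # Hypothetical respondent: 7.41 h missed from a 40 h week, work affected 6/10:
      print(wpai_scores(7.41, 40.0 - 7.41, 6, 6))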

  1. Quantification of Sediment Transport During Glacier Surges and its Impact on Landform Architecture

    DEFF Research Database (Denmark)

    Kjær, Kurt H.; Schomacker, Anders; Korsgaard, Niels Jákup

    ) for 1945, prior to the last surge in 1964, and for 2003 in order to assess the effect of the surge on the sediment architecture in the forefield. The pre- and post-surge DEMs allow direct quantification of the sediment volumes that were re-distributed in the forefield by the surging ice mass in 1964...... or glaciofluvial outwash fans. Mapping of the sediment thickness in the glacier forefield shows higher accumulation along ice marginal positions related to wedge formation during extremely rapid ice flow. Fast flow was sustained by overpressurized water causing sediment-bedrock decoupling beneath a thick sediment...... architecture occurs distal to the 1810 ice margin, where the 1890 surge advanced over hitherto undeformed sediments. Proximal to the 1810 ice margin, the landscape has been transgressed by either one or two glaciers (in 1890 and 1964). The most complex landscape architecture is found proximal to the 1964 ice...

  2. MRI measurements of water diffusion: impact of region of interest selection on ischemic quantification

    International Nuclear Information System (INIS)

    Ozsunar, Yelda; Koseoglu, Kutsi; Huisman, Thierry A.G.M.; Koroshetz, Walter; Sorensen, A. Gregory

    2004-01-01

    Objective: To investigate the effect of ADC heterogeneity on region of interest (ROI) measurement of isotropic and anisotropic water diffusion in acute (<12 h) cerebral infarctions. Methods and materials: Full diffusion tensor images were retrospectively analyzed in 32 patients with acute cerebral infarction. Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) values were measured in ischemic lesions and in the corresponding contralateral, normal-appearing brain by using four ROIs for each patient. The 2×2-pixel square ROIs were placed in the center, the lateral rim and the medial rim of the infarction. In addition, the whole volume of the infarction was measured using a freehand method. Each ROI value obtained from the ischemic lesion was normalized using the contralateral normal ROI values. Results: The localization of the ROIs in relation to the ischemic lesion significantly affected ADC measurement (P<0.01, Friedman test), but not FA measurement (P=0.25). Significant differences were found between ADC values of the center of the infarction versus the whole volume (P<0.01), and the medial rim versus the whole volume of the infarction (P<0.001), with variation of relative ADC values up to 11%. The differences in absolute ADC for these groups were 22 and 23%, respectively. The lowest ADC was found in the center, followed by the medial rim, lateral rim and whole volume of the infarction. Conclusion: ADC quantification may provide variable results depending on the ROI method. The ADC and FA values obtained from the center of the infarction tend to be lower compared to the periphery. Researchers who try to compare studies or work on ischemic quantification should be aware of these differences and effects.
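
    The normalization step described here is a ratio of mean values between a lesion ROI and its contralateral counterpart. A minimal sketch (synthetic ADC map; the mask positions are arbitrary):

      import numpy as np

      def relative_adc(adc_map, lesion_mask, contralateral_mask):
          # Per-patient normalization: lesion ROI mean over contralateral ROI mean.
          return float(adc_map[lesion_mask].mean() / adc_map[contralateral_mask].mean())

      rng = np.random.default_rng(1)
      adc = rng.normal(800e-6, 50e-6, (128, 128))      # synthetic ADC map, mm^2/s
      lesion = np.zeros((128, 128), bool)
      lesion[60:62, 60:62] = True                      # 2x2 "center" ROI
      normal = np.zeros((128, 128), bool)
      normal[60:62, 90:92] = True                      # contralateral ROI
      print(relative_adc(adc, lesion, normal))         # ~1.0 for this synthetic map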

  3. On impact damage detection and quantification for CFRP laminates using structural response data only

    Science.gov (United States)

    Sultan, M. T. H.; Worden, K.; Pierce, S. G.; Hickey, D.; Staszewski, W. J.; Dulieu-Barton, J. M.; Hodzic, A.

    2011-11-01

    The overall purpose of the research is to detect and attempt to quantify impact damage in structures made from composite materials. A study that uses simplified coupon specimens made from a Carbon Fibre-Reinforced Polymer (CFRP) prepreg with 11, 12 and 13 plies is presented. PZT sensors were placed at three separate locations in each test specimen to record the responses from impact events. To perform damaging impact tests, an instrumented drop-test machine was used and the impact energy was set to cover a range of 0.37-41.72 J. The response signals captured from each sensor were recorded by a data acquisition system for subsequent evaluation. The impacted specimens were examined with an X-ray technique to determine the extent of the damaged areas and it was found that the apparent damaged area grew monotonically with impact energy. A number of simple univariate and multivariate features were extracted from the sensor signals recorded during impact by computing their spectra and calculating frequency centroids. The concept of discordancy from the statistical discipline of outlier analysis is employed in order to separate the responses from non-damaging and damaging impacts. The results show that the potential damage indices introduced here provide a means of identifying damaging impacts from the response data alone.
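
    Discordancy testing of this kind is commonly implemented as a distance of each impact's feature vector from the distribution of baseline (non-damaging) responses. A generic sketch using the squared Mahalanobis distance; the features and values below are illustrative assumptions, not the paper's exact procedure:

      import numpy as np

      def discordancy(features, baseline):
          """Squared Mahalanobis distance of each impact's feature vector from the
          baseline (non-damaging) distribution - one common discordancy measure
          in outlier analysis."""
          mu = baseline.mean(axis=0)
          inv_cov = np.linalg.inv(np.cov(baseline, rowvar=False))
          d = features - mu
          return np.einsum("ij,jk,ik->i", d, inv_cov, d)

      # Hypothetical features: [frequency centroid (kHz), peak amplitude (V)]
      rng = np.random.default_rng(0)
      baseline = rng.normal([45.0, 0.8], [2.0, 0.1], size=(50, 2))   # non-damaging
      tests = np.array([[44.0, 0.85], [30.0, 1.9]])                  # 2nd looks damaging
      print(discordancy(tests, baseline))   # a large value flags a discordant impact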

  4. Impact of the recorded variable on recurrence quantification analysis of flows

    International Nuclear Information System (INIS)

    Portes, Leonardo L.; Benda, Rodolfo N.; Ugrinowitsch, Herbert; Aguirre, Luis A.

    2014-01-01

    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected in different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real world experiments, observability analysis should be performed prior to RQA
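
    For a scalar series s(t), the basic RQA quantities are ratios over a thresholded distance (recurrence) matrix. A definition-level sketch of the recurrence rate and determinism, without embedding (a full RQA would embed the series and typically exclude the line of identity):

      import numpy as np

      def rqa_measures(s, eps, lmin=2):
          """Recurrence rate (RR) and determinism (DET) from a scalar series s."""
          R = (np.abs(s[:, None] - s[None, :]) < eps).astype(int)
          rr = R.mean()
          n = len(s)
          on_lines = 0
          for k in range(-(n - 1), n):          # scan every diagonal of R
              run = 0
              for v in list(np.diagonal(R, offset=k)) + [0]:
                  if v:
                      run += 1
                  else:
                      if run >= lmin:           # count points on lines >= lmin
                          on_lines += run
                      run = 0
          det = on_lines / R.sum()
          return rr, det

      t = np.linspace(0, 20 * np.pi, 400)
      print(rqa_measures(np.sin(t), eps=0.1))   # highly deterministic signal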

  5. Impact of the recorded variable on recurrence quantification analysis of flows

    Energy Technology Data Exchange (ETDEWEB)

    Portes, Leonardo L., E-mail: ll.portes@gmail.com [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federeal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Benda, Rodolfo N.; Ugrinowitsch, Herbert [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federeal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Aguirre, Luis A. [Departamento de Engenharia Eletrônica, Universidade Federeal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil)

    2014-06-27

    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected in different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real world experiments, observability analysis should be performed prior to RQA.

  6. The environmental impact of the Porto Primavera hydroelectric plant (Brazil)

    Directory of Open Access Journals (Sweden)

    Jailton Dias

    2002-12-01

    Full Text Available The construction of the Porto Primavera hydroelectric plant on the upper Paraná river, in South-Central Brazil, has brought about major transformations of the environment and of the organization of space. The scale and speed of these changes lend themselves to monitoring by remote sensing. Landsat™ images show that the construction of the dam gave new impetus to regional economic development.

  7. Quantification of character-impacting compounds in Ocimum basilicum and 'Pesto alla Genovese' with selected ion flow tube mass spectrometry.

    Science.gov (United States)

    Amadei, Gianluca; Ross, Brian M

    2012-02-15

    Basil (Ocimum basilicum) is an important flavourant plant and the major ingredient of the pasta sauce 'Pesto alla Genovese'. The characteristic smell of basil stems mainly from a handful of terpenoids (methyl cinnamate, eucalyptol, linalool and estragole), the concentrations of which vary according to basil cultivar. Simple and rapid analysis of the terpenoid constituents of basil would be useful as a means to optimise harvesting times and as a quality control process for basil-containing foodstuffs. Classical analytical techniques such as gas chromatography/mass spectrometry (GC/MS) are, however, slow and technically demanding, and therefore less suitable for routine analysis. A chemical ionisation technique which allows real-time quantification of trace gases, Selected Ion Flow Tube Mass Spectrometry (SIFT-MS), was therefore evaluated for the assay of terpenoid concentrations in basil and pesto sauce headspace. Trace gas analysis was performed using the NO(+) precursor ion, which minimised interference from other compounds. Character-impacting compound concentrations were measured in basil headspace with good reproducibility, and statistically significant differences were observed between cultivars. Quantification of linalool in pesto sauce headspace proved more difficult due to the presence of interfering compounds. This was resolved by careful selection of reaction product ions, which allowed us to detect differences between various commercial brands of pesto. We conclude that SIFT-MS may be a valid tool for the fast and reproducible analysis of flavourant terpenoids in basil and basil-derived foodstuffs. Copyright © 2011 John Wiley & Sons, Ltd.
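
    SIFT-MS quantification rests on pseudo-first-order ion-molecule kinetics: the analyte number density follows from the product-to-precursor ion signal ratio, the rate coefficient and the reaction time. A sketch of the textbook relation (the numerical values are hypothetical, not measurements from this paper):

      def sift_ms_number_density(product_counts, precursor_counts, k, t_reaction):
          """Textbook SIFT-MS relation: analyte number density (cm^-3) from the
          product/precursor ion signal ratio, the ion-molecule rate coefficient
          k (cm^3 s^-1) and the flow-tube reaction time (s)."""
          return product_counts / (precursor_counts * k * t_reaction)

      # Hypothetical example: a terpenoid via NO+ with k ~ 2e-9 cm^3 s^-1, 5 ms reaction:
      n = sift_ms_number_density(1.2e3, 1.0e6, 2.0e-9, 5.0e-3)
      print(f"{n:.1e} molecules cm^-3")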

  8. Impact of benzodiazepines on brain FDG-PET quantification after single-dose and chronic administration in rats

    International Nuclear Information System (INIS)

    Silva-Rodríguez, Jesús; García-Varela, Lara; López-Arias, Esteban; Domínguez-Prado, Inés; Cortés, Julia; Pardo-Montero, Juan; Fernández-Ferreiro, Anxo

    2016-01-01

    Introduction: Current guidelines for brain PET imaging advise against the injection of diazepam prior to a brain FDG-PET examination, in order to avoid possible interactions of benzodiazepines with radiotracer uptake. Nevertheless, many patients undergoing PET studies are likely to be under chronic treatment with benzodiazepines, for example due to the use of different medications such as sleeping pills. Animal studies may provide an extensive and accurate estimation of the effect of benzodiazepines on brain metabolism in a well-defined and controlled framework. Aim: This study aims at evaluating the impact of benzodiazepines on brain FDG uptake after single-dose administration and chronic treatment in rats. Methods: Twelve healthy Sprague–Dawley rats were randomly divided into two groups, one treated with diazepam and the other used as a control group. Both groups underwent PET/CT examinations after single-dose and chronic administration of diazepam (treated) or saline (controls) over twenty-eight days. Different atlas-based quantification methods were used to explore differences in the total uptake and uptake patterns of FDG between the two groups. Results: Our analysis revealed a significant reduction of global FDG uptake after acute (−16.2%) and chronic (−23.2%) administration of diazepam. Moreover, a strong trend pointing to differences between acute and chronic administration (p < 0.08) was also observed. Uptake levels returned to normal after interrupting the administration of diazepam. On the other hand, patterns of FDG uptake were not affected by the administration of diazepam. Conclusions: The administration of diazepam causes a progressive decrease of global FDG uptake in the rat brain, but it does not change local patterns within the brain. Under these conditions, visual assessment and quantification methods based on regional differences such as asymmetry indexes or SPM statistical analysis would still be valid when administering this

  9. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; phenomena that bias quantification; 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods, results (attenuation, scattering, partial volume effect, movement, non-stationary spatial resolution in SPECT, random coincidences in PET, standardisation in PET); 3 - Synthesis: achievable accuracy, know-how, precautions, beyond the activity measurement.

  10. An approach to microstructure quantification in terms of impact properties of HSLA pipeline steels

    Energy Technology Data Exchange (ETDEWEB)

    Gervasyev, Alexey [Department of Materials Science and Engineering, Ghent University (Belgium); R&D Center TMK, Ltd., Chelyabinsk (Russian Federation); Carretero Olalla, Victor [SKF Belgium NV/SA, Brussels (Belgium); Sidor, Jurij [Department of Mechanical Engineering, University of West Hungary, Szombathely (Hungary); Sanchez Mouriño, Nuria [ArcelorMittal Global R&D/OCAS NV, Gent (Belgium); Kestens, Leo A.I.; Petrov, Roumen H. [Department of Materials Science and Engineering, Ghent University (Belgium); Department of Materials Science and Engineering, Delft University of Technology (Netherlands)

    2016-11-20

    Several thermo-mechanical controlled processing (TMCP) schedules of a modern pipeline steel were executed on a laboratory mill to investigate both the influence of TMCP parameters on ductile properties and the evolution of microstructure and texture during TMCP. Impact fracture toughness was evaluated by means of the instrumented Charpy impact test, and results were correlated with the metallurgical characterization of the steel via the electron backscatter diffraction (EBSD) technique. It is shown that the ductile crack growth observed in the impact test experiments can be reasonably correlated with the Morphology Clustering (MC) and the Cleavage Morphology Clustering (CMC) parameters, which incorporate size, shape, and crystallographic texture features of microstructure elements. The mechanism of unfavorable texture formation during TMCP is explained by texture changes occurring between the end of finish rolling and the start of accelerated cooling.

  11. Experimental quantification of contact forces with impact, friction and uncertainty analysis

    DEFF Research Database (Denmark)

    Lahriri, Said; Santos, Ilmar

    2013-01-01

    and whirl motions in rotor-stator contact investigations. Dry friction coefficient is therefore estimated using two different experimental setups: (a) standard pin-on-disk tests and (b) rotor impact test rig fully instrumented. The findings in both setups indicate that the dry friction coefficient for brass......During rotor-stator contact dry friction plays a significant role in terms of reversing the rotor precession. The frictional force causes an increase in the rotor's tangential velocity in the direction opposite to that of the angular velocity. This effect is crucial for defining ranges of dry whip......-aluminum configuration significantly varies in a range of 0.16-0.83. The rotor enters a full annular contact mode shortly after two impacts with a contact duration of approximately 0.004 s at each location. It is experimentally demonstrated that the friction force is not present when the rotor enters a full annular...

  12. Quantification of the impacts of coalmine water irrigation on the underlying aquifers

    Energy Technology Data Exchange (ETDEWEB)

    Vermeulen, D.; Usher, B.; van Tonder, G. [University of Free State, Bloemfontein (South Africa). Institute of Groundwater Studies

    2009-07-15

    It is predicted that vast volumes of affected mine water will be produced by mining activities in the Mpumalanga coalfields of South Africa. The potential environmental impact of this excess water is of great concern in a water-scarce country like South Africa. Research over a period of more than 10 years has shown that this water can be used successfully for the irrigation of a range of crops. There is, however, continuing concern from the local regulators regarding the long-term impact that large-scale mine water irrigation may have on groundwater quality and quantity. Detailed research has been undertaken over the last three years to supplement the groundwater monitoring programme at five different pilot sites, on both virgin soils (greenfields) and in coalmining spoils. These sites range from sandy soils to very clayey soils. The research has included soil moisture measurements, collection of in situ soil moisture over time, long-term laboratory studies of the leaching and attenuation properties of different soils and of the impact of irrigation on acid rock drainage processes, and in-depth determination of the hydraulic properties of the subsurface at each of these sites, including falling head tests, pumping tests and point dilution tests. This has been supported by geochemical modelling of these processes to quantify the impacts. The results indicate that many of the soils have considerable attenuation capacities and that, over the period of irrigation, a large proportion of the salts has been contained in the upper portions of the unsaturated zones below each irrigation pivot. The volumes and quality of water leaching through to the aquifers have been quantified at each site. From this, mixing ratios were calculated in order to determine the effect of the irrigation water on the underlying aquifers.
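
    A mixing ratio of the kind mentioned here can be obtained from a conservative tracer and two end-member concentrations. A generic two-end-member sketch (the concentrations are hypothetical, not site data):

      def mixing_fraction(c_sample, c_irrigation, c_ambient):
          """Fraction of irrigation-derived leachate in groundwater from a
          conservative tracer (e.g. chloride or sulfate), assuming simple
          two-end-member mixing - a generic calculation, not the paper's model."""
          return (c_sample - c_ambient) / (c_irrigation - c_ambient)

      # Hypothetical sulfate concentrations (mg/L):
      print(mixing_fraction(c_sample=450.0, c_irrigation=1800.0, c_ambient=120.0))  # ~0.20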

  13. Quantification of the Impact of Endometriosis Symptoms on Health Related Quality of Life and Work Productivity

    Science.gov (United States)

    Fourquet, Jessica; Báez, Lorna; Figueroa, Michelle; Iriarte, R. Iván; Flores, Idhaliz

    2011-01-01

    OBJECTIVE To quantify the impact of endometriosis-related symptoms on physical and mental health status, health-related quality of life (HRQoL), and work-related aspects (absenteeism, presenteeism, work productivity, activity impairment). DESIGN Cross-sectional quantitative study. SETTING Academic and research institution. PATIENT(S) Women (n=193) with self-reported surgically diagnosed endometriosis from the Endometriosis Patient Registry at Ponce School of Medicine and Health Sciences (PSMHS). INTERVENTION(S) Patients completed an anonymous questionnaire divided into three sections consisting of questions from the Patient Health Survey (SF-12®), the Endometriosis Health Profile (EHP-5), and the Work Productivity and Activity Impairment Survey (WPAI). MAIN OUTCOME MEASURE(S) The impact of endometriosis symptoms on physical and mental health status, health-related quality of life (HRQoL), absenteeism, presenteeism, work productivity and activity impairment was quantified. RESULTS Patients had SF-12 scores denoting significant disability in the physical and mental health components. They also reported an average of 7.41 hrs (approximately one working day) of work time lost during the week when symptoms were worst. In addition, WPAI scores showed a high impact on work-related domains: 13% average loss in work time (absenteeism), 65% of work impaired (presenteeism), 64% loss in efficiency levels (work productivity loss), and 60% of daily activities perturbed (activity impairment). CONCLUSION Endometriosis symptoms such as chronic, incapacitating pelvic pain and infertility negatively and substantially impact the physical and mental health status, HRQoL, and productivity at work of patients with endometriosis. PMID:21621771

  14. Application of microtomography and image analysis to the quantification of fragmentation in ceramics after impact loading

    Science.gov (United States)

    Forquin, Pascal; Ando, Edward

    2017-01-01

    Silicon carbide ceramics are widely used in personal body armour and protective solutions. However, during impact, an intense fragmentation develops in the ceramic tile due to high-strain-rate tensile loadings. In this work, microtomography equipment was used to analyse the fragmentation patterns of two silicon carbide grades subjected to edge-on impact (EOI) tests. The EOI experiments were conducted in two configurations. The so-called open configuration relies on the use of an ultra-high-speed camera to visualize the fragmentation process with an interframe time set to 1 µs. The so-called sarcophagus configuration consists of confining the target in a metallic casing to avoid any dispersion of fragments. The target is infiltrated after impact so the final damage pattern is entirely scanned using X-ray tomography and a microfocus source. Thereafter, a three-dimensional (3D) segmentation algorithm was tested and applied in order to separate fragments in 3D, allowing a particle size distribution to be obtained. Significant differences between the two specimens of different SiC grades were noted. To explain such experimental results, numerical simulations were conducted considering the Denoual-Forquin-Hild anisotropic damage model. According to the calculations, the difference of crack pattern in EOI tests is related to the population of defects within the two ceramics. This article is part of the themed issue 'Experimental testing and modelling of brittle materials at high strain rates'.
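
    The fragment-separation step can be sketched with a plain connected-component labelling of the binarized tomographic volume. This is a generic illustration, not the paper's 3D segmentation algorithm, which must also split touching fragments (e.g. by watershed):

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(2)
      volume = rng.random((64, 64, 64)) > 0.7        # toy binarized CT volume
      labels, n = ndimage.label(volume)              # connected-component labelling
      sizes = ndimage.sum(volume, labels, index=range(1, n + 1))   # voxels per fragment
      print(n, np.percentile(sizes, [50, 90]))       # fragment count, size quantiles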

  15. The GEO-3 Scenarios 2002-2032. Quantification and Analysis of Environmental Impacts

    International Nuclear Information System (INIS)

    Bakkes, J.; Potting, J.; Kemp-Benedict, E.; Raskin, P.; Masui, T.; Rana, A.; Nellemann, C.; Rothman, D.

    2004-01-01

    The four contrasting visions of the world's next three decades as presented in the third Global Environment Outlook (GEO-3) have many implications for policy - from hunger to climate change and from freshwater issues to biodiversity. The four scenarios analysed are Markets First, Policy First, Security First, Sustainability First. Presenting a deeper analysis than the original GEO-3 report, this Technical Report quantifies the impacts of the scenarios for all 19 GEO 'sub-regions', such as Eastern Africa and Central Europe. Regional impacts are discussed in the context of sustainable development. The report summary compares the impacts of the four scenarios across regions - and for the world as a whole - in the light of internationally agreed targets including those in the Millennium Declaration where applicable. It provides an account of the analytical methods, key assumptions, models and other tools, along with the approaches used in the analyses. Based on the methods and results, the report looks back on the process of producing the forward-looking analysis for GEO-3. Were all analytical centres on the same track? Did the approach adopted for GEO-3 contribute to the overall GEO objective of strengthening global-regional involvement and linkages?

  16. The GEO-3 Scenarios 2002-2032. Quantification and Analysis of Environmental Impacts

    Energy Technology Data Exchange (ETDEWEB)

    Bakkes, J.; Potting, J. (eds.) [National Institute for Public Health and the Environment RIVM, Bilthoven (Netherlands); Henrichs, T. [Center for Environmental Systems Research CESR, University of Kassel, Kassel (Germany); Kemp-Benedict, E.; Raskin, P. [Stockholm Environment Institute SEI, Boston, MA (United States); Masui, T.; Rana, A. [National Institute for Environmental Studies NIES, Ibaraki (Japan); Nellemann, C. [United Nations Environment Programme UNEP, GRID Global and Regional Integrated Data centres Arendal, Lillehammer (Norway); Rothman, D. [International Centre for Integrative Studies ICIS, Maastricht University, Maastricht (Netherlands)

    2004-07-01

    The four contrasting visions of the world's next three decades as presented in the third Global Environment Outlook (GEO-3) have many implications for policy - from hunger to climate change and from freshwater issues to biodiversity. The four scenarios analysed are Markets First, Policy First, Security First, Sustainability First. Presenting a deeper analysis than the original GEO-3 report, this Technical Report quantifies the impacts of the scenarios for all 19 GEO 'sub-regions', such as Eastern Africa and Central Europe. Regional impacts are discussed in the context of sustainable development. The report summary compares the impacts of the four scenarios across regions - and for the world as a whole - in the light of internationally agreed targets including those in the Millennium Declaration where applicable. It provides an account of the analytical methods, key assumptions, models and other tools, along with the approaches used in the analyses. Based on the methods and results, the report looks back on the process of producing the forward-looking analysis for GEO-3. Were all analytical centres on the same track? Did the approach adopted for GEO-3 contribute to the overall GEO objective of strengthening global-regional involvement and linkages?

  17. Life Cycle Assessment to support the quantification of the environmental impacts of an event

    Energy Technology Data Exchange (ETDEWEB)

    Toniolo, Sara; Mazzi, Anna; Fedele, Andrea; Aguiari, Filippo; Scipioni, Antonio, E-mail: scipioni@unipd.it

    2017-03-15

    In recent years, several tools have been used to define and quantify the environmental impacts associated with an event; however, a lack of uniform approaches for conducting environmental evaluations has been revealed. The aim of this paper is to evaluate whether the Life Cycle Assessment methodology, which is rarely applied to events, can be an appropriate tool for calculating the environmental impacts associated with the assembly, disassembly and use phases of an event, analysing in particular the components and displays used to set up the exhibits. A further aim is to include the issues reported by ISO 20121:2012 involving the interested parties that can be monitored and also affected by the event owner, namely the event organiser, the workforce and the supply chain. A small event held in Northern Italy was selected as the subject of the research. The results obtained show that the main contributors are energy consumption for lighting and heating and the use of materials such as aluminium bars for supporting the spotlights, carpet and the electronic equipment. A sensitivity analysis estimating the effects of the impact assessment method chosen has also been conducted, and an uncertainty analysis has been performed using the Monte Carlo technique. This study highlighted the importance of the energy consumed by heating and lighting for the environmental implications, and indicated that preparation and assembly should always be considered when quantifying the environmental profile of an event. - Highlights: • LCA methodology, developed for products and services, is applied to an event. • A small event held in Northern Italy is analysed. • The main contributors are energy consumption and the use of aluminium and carpet. • Exhibition site preparation can have important environmental implications. • This study demonstrates the importance of the assembly, disassembly and use phase.

  18. Quantification of intensive hybrid coastal reclamation for revealing its impacts on macrozoobenthos

    International Nuclear Information System (INIS)

    Yan, Jiaguo; Cui, Baoshan; Zheng, Jingjing; Xie, Tian; Wang, Qing; Li, Shanze

    2015-01-01

    Managing and identifying the sources of anthropogenic stress in coastal wetlands requires an in-depth understanding of the relationships between species diversity and human activities. Empirical and experimental studies provide clear evidence that coastal reclamation can have profound impacts on marine organisms, but the focus of such studies is generally on comparative or laboratory research. We developed a compound intensity index (reclamation intensity index, RI) for hybrid coastal reclamation, to quantify the impacts of reclamation on coastal ecosystems. We also made use of mean annual absolute changes in a number of biotic variables (biodiversity, species richness, biomass of total macrozoobenthos, and species richness and biomass of Polychaeta, Mollusca, Crustacea, and Echinodermata) to determine the Hedges' d index, which is a measure of the potential effects of coastal reclamation. Our results showed that there were significant differences in coastal reclamation intensity between the Yellow Sea, East China Sea and South China Sea, and that the biological changes in effect sizes of the three regions differed greatly over time. Our modelling analyses showed that hybrid coastal reclamation generally had significant negative impacts on the species diversity and biomass of macrozoobenthos. These relationships varied among taxonomic groups and included both linear and nonlinear relationships. The results indicated that high-intensity coastal reclamation contributed to a pronounced decline in species diversity and biomass, while lower-intensity reclamation, or reclamation within certain thresholds, resulted in a small increase in species diversity and biomass. These results have important implications for biodiversity conservation and the ecological restoration of coastal wetlands in the face of intensive reclamation activities. (letter)
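
    Hedges' d is the bias-corrected standardized mean difference. A minimal sketch of the standard formula (the before/after richness figures are hypothetical):

      import math

      def hedges_d(m1, m2, s1, s2, n1, n2):
          """Hedges' d: standardized mean difference with the usual
          small-sample bias correction."""
          sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
          j = 1 - 3 / (4 * (n1 + n2) - 9)     # small-sample correction factor
          return j * (m1 - m2) / sp

      # Hypothetical macrozoobenthos species richness before/after reclamation:
      print(hedges_d(m1=18.2, m2=11.5, s1=4.1, s2=3.8, n1=24, n2=24))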

  19. Life Cycle Assessment to support the quantification of the environmental impacts of an event

    International Nuclear Information System (INIS)

    Toniolo, Sara; Mazzi, Anna; Fedele, Andrea; Aguiari, Filippo; Scipioni, Antonio

    2017-01-01

    In recent years, several tools have been used to define and quantify the environmental impacts associated with an event; however, a lack of uniform approaches for conducting environmental evaluations has been revealed. The aim of this paper is to evaluate whether the Life Cycle Assessment methodology, which is rarely applied to events, can be an appropriate tool for calculating the environmental impacts associated with the assembly, disassembly and use phases of an event, analysing in particular the components and displays used to set up the exhibits. A further aim is to include the issues reported by ISO 20121:2012 involving the interested parties that can be monitored and also affected by the event owner, namely the event organiser, the workforce and the supply chain. A small event held in Northern Italy was selected as the subject of the research. The results obtained show that the main contributors are energy consumption for lighting and heating and the use of materials such as aluminium bars for supporting the spotlights, carpet and the electronic equipment. A sensitivity analysis estimating the effects of the impact assessment method chosen has also been conducted, and an uncertainty analysis has been performed using the Monte Carlo technique. This study highlighted the importance of the energy consumed by heating and lighting for the environmental implications, and indicated that preparation and assembly should always be considered when quantifying the environmental profile of an event. - Highlights: • LCA methodology, developed for products and services, is applied to an event. • A small event held in Northern Italy is analysed. • The main contributors are energy consumption and the use of aluminium and carpet. • Exhibition site preparation can have important environmental implications. • This study demonstrates the importance of the assembly, disassembly and use phase.

  20. Quantification of regional radiative impacts and climate effects of tropical fire aerosols

    Science.gov (United States)

    Tosca, M. G.; Zender, C. S.; Randerson, J. T.

    2011-12-01

    Regionally expansive smoke clouds originating from deforestation fires in Indonesia can modify local precipitation patterns via direct aerosol scattering and absorption of solar radiation (Tosca et al., 2010). Here we quantify the regional climate impacts of fire aerosols for three tropical burning regions that together account for about 70% of global annual fire emissions. We use the Community Atmosphere Model, version 5 (CAM5) coupled to a slab ocean model (SOM) embedded within the Community Earth System Model (CESM). In addition to direct aerosol radiative effects, CAM5 also quantifies indirect, semi-direct and cloud microphysical aerosol effects. Climate impacts are determined using regionally adjusted emissions data that produce realistic aerosol optical depths in CAM5. We first analyzed a single 12-year transient simulation (1996-2007) forced with unadjusted emissions estimates from the Global Fire Emissions Database, version 3 (GFEDv3) and compared the resulting aerosol optical depths (AODs) for 4 different burning regions (equatorial Asia, southern Africa, South America and boreal North America) to observed MISR and MODIS AODs for the same period. Based on this analysis we adjusted emissions for each burning region between 150 and 300% and forced a second simulation with the regionally adjusted emissions. Improved AODs from this simulation are compared to AERONET observations available at 15 stations throughout the tropics. We present here two transient simulations--one with the adjusted fire emissions and one without fires--to quantify the cumulative fire aerosol climate impact for three major tropical burning regions (equatorial Asia, southern Africa and South America). Specifically, we quantify smoke effects on radiation, precipitation, and temperature. References Tosca, M.G., J.T. Randerson, C.S. Zender, M.G. Flanner and P.J. Rasch (2010), Do biomass burning aerosols intensify drought in equatorial Asia during El Nino?, Atmos. Chem. Phys., 10, 3515

  1. Quantification of the Impact of the HIV-1-Glycan Shield on Antibody Elicitation

    Directory of Open Access Journals (Sweden)

    Tongqing Zhou

    2017-04-01

    Full Text Available While the HIV-1-glycan shield is known to shelter Env from the humoral immune response, its quantitative impact on antibody elicitation has been unclear. Here, we use targeted deglycosylation to measure the impact of the glycan shield on elicitation of antibodies against the CD4 supersite. We engineered diverse Env trimers with select glycans removed proximal to the CD4 supersite, characterized their structures and glycosylation, and immunized guinea pigs and rhesus macaques. Immunizations yielded little neutralization against wild-type viruses but potent CD4-supersite neutralization (titers 1:>1,000,000 against four-glycan-deleted autologous viruses with over 90% breadth against four-glycan-deleted heterologous strains exhibiting tier 2 neutralization character). To a first approximation, the immunogenicity of the glycan-shielded protein surface was negligible, with Env-elicited neutralization (ID50) proportional to the exponential of the protein-surface area accessible to antibody. Based on these high titers and exponential relationship, we propose site-selective deglycosylated trimers as priming immunogens to increase the frequency of site-targeting antibodies.
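
    The stated relationship means ln(ID50) is linear in the antibody-accessible protein-surface area, so it can be checked by a log-linear fit. A sketch with placeholder values (the areas and titers below are hypothetical, not the paper's data):

      import numpy as np

      # Empirical relation from the abstract: ID50 ~ exp(b * accessible_area),
      # i.e. ln(ID50) is linear in the accessible protein-surface area.
      area = np.array([1200.0, 1500.0, 1800.0, 2100.0])   # accessible area, A^2 (assumed)
      id50 = np.array([3e2, 4e3, 6e4, 9e5])               # neutralization titers (assumed)

      b, ln_a = np.polyfit(area, np.log(id50), 1)
      print(f"ID50 ~ {np.exp(ln_a):.3g} * exp({b:.3g} * area)")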

  2. Quantification of the Impact of the HIV-1-Glycan Shield on Antibody Elicitation

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Tongqing; Doria-Rose, Nicole A.; Cheng, Cheng; Stewart-Jones, Guillaume B. E.; Chuang, Gwo-Yu; Chambers, Michael; Druz, Aliaksandr; Geng, Hui; McKee, Krisha; Kwon, Young Do; O’Dell, Sijy; Sastry, Mallika; Schmidt, Stephen D.; Xu, Kai; Chen, Lei; Chen, Rita E.; Louder, Mark K.; Pancera, Marie; Wanninger, Timothy G.; Zhang, Baoshan; Zheng, Anqi; Farney, S. Katie; Foulds, Kathryn E.; Georgiev, Ivelin S.; Joyce, M. Gordon; Lemmin, Thomas; Narpala, Sandeep; Rawi, Reda; Soto, Cinque; Todd, John-Paul; Shen, Chen-Hsiang; Tsybovsky, Yaroslav; Yang, Yongping; Zhao, Peng; Haynes, Barton F.; Stamatatos, Leonidas; Tiemeyer, Michael; Wells, Lance; Scorpio, Diana G.; Shapiro, Lawrence; McDermott, Adrian B.; Mascola, John R.; Kwong, Peter D.

    2017-04-01

    While the HIV-1-glycan shield is known to shelter Env from the humoral immune response, its quantitative impact on antibody elicitation has been unclear. Here, we use targeted deglycosylation to measure the impact of the glycan shield on elicitation of antibodies against the CD4 supersite. We engineered diverse Env trimers with select glycans removed proximal to the CD4 supersite, characterized their structures and glycosylation, and immunized guinea pigs and rhesus macaques. Immunizations yielded little neutralization against wild-type viruses but potent CD4-supersite neutralization (titers 1: >1,000,000 against four-glycan-deleted autologous viruses with over 90% breadth against four-glycan-deleted heterologous strains exhibiting tier 2 neutralization character). To a first approximation, the immunogenicity of the glycan-shielded protein surface was negligible, with Env-elicited neutralization (ID50) proportional to the exponential of the protein-surface area accessible to antibody. Based on these high titers and exponential relationship, we propose site-selective deglycosylated trimers as priming immunogens to increase the frequency of site-targeting antibodies.

  3. Quantification of the impact of PSI:Biology according to the annotations of the determined structures.

    Science.gov (United States)

    DePietro, Paul J; Julfayev, Elchin S; McLaughlin, William A

    2013-10-21

    Protein Structure Initiative:Biology (PSI:Biology) is the third phase of PSI where protein structures are determined in high-throughput to characterize their biological functions. The transition to the third phase entailed the formation of PSI:Biology Partnerships which are composed of structural genomics centers and biomedical science laboratories. We present a method to examine the impact of protein structures determined under the auspices of PSI:Biology by measuring their rates of annotations. The mean numbers of annotations per structure and per residue are examined. These are designed to provide measures of the amount of structure to function connections that can be leveraged from each structure. One result is that PSI:Biology structures are found to have a higher rate of annotations than structures determined during the first two phases of PSI. A second result is that the subset of PSI:Biology structures determined through PSI:Biology Partnerships have a higher rate of annotations than those determined exclusive of those partnerships. Both results hold when the annotation rates are examined either at the level of the entire protein or for annotations that are known to fall at specific residues within the portion of the protein that has a determined structure. We conclude that PSI:Biology determines structures that are estimated to have a higher degree of biomedical interest than those determined during the first two phases of PSI based on a broad array of biomedical annotations. For the PSI:Biology Partnerships, we see that there is an associated added value that represents part of the progress toward the goals of PSI:Biology. We interpret the added value to mean that team-based structural biology projects that utilize the expertise and technologies of structural genomics centers together with biological laboratories in the community are conducted in a synergistic manner. We show that the annotation rates can be used in conjunction with established metrics, i

  4. Human impacts quantification on the coastal landforms of Gran Canaria Island (Canary Islands)

    Science.gov (United States)

    Ferrer-Valero, Nicolás; Hernández-Calvento, Luis; Hernández-Cordero, Antonio I.

    2017-06-01

    The coastal areas of the Canary Islands are particularly sensitive to changes, both from a natural perspective and for their potential socio-economic implications. In this paper, the state of conservation of an insular coast is approached from a geomorphological point of view, considering recent changes induced by urban and tourism development. The analysis is applied to the coast of Gran Canaria, a small Atlantic island of volcanic origin, subject to a high degree of human pressure on its coastal areas, especially in recent decades. Currently, much of the economic activity of Gran Canaria is linked to mass tourism, associated with the climatic and geomorphological features of the coast. This work is addressed through detailed mapping of coastal landforms across the island (256 km perimeter), corresponding to the period before urban and tourism development (late 19th century for the island's capital, mid-20th century for the rest of the island) and today. The comparison between the coastal geomorphology before and after urban and tourism development was established through four categories of human impact, related to conservation state: unaltered, altered, semi-destroyed and extinct. The results indicate that 43% of coastal landforms have been affected by human impacts, while 57% remain unaltered. The most affected are sedimentary landforms, namely coastal dunes, palaeo-dunes, beaches and wetlands. Geodiversity loss was also evaluated by applying two diversity indices. The coastal geodiversity loss by total or partial destruction of landforms is estimated at −15.2% according to the Shannon index (H′), while it rises to −32.1% according to an index proposed in this paper. We conclude that the transformations of the coast of Gran Canaria induced by urban and tourism development have heavily affected the most singular coastal landforms (dunes, palaeo-dunes and wetlands), significantly reducing its geodiversity.
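
    The Shannon index used here is H′ = −Σ p_i ln p_i over landform types. A minimal sketch applying it to before/after landform inventories (the counts are hypothetical):

      import math
      from collections import Counter

      def shannon_index(landform_counts):
          """Shannon diversity H' = -sum(p_i ln p_i) over landform types."""
          total = sum(landform_counts.values())
          return -sum((c / total) * math.log(c / total)
                      for c in landform_counts.values())

      before = Counter(beach=40, dune=25, wetland=10, cliff=55, palaeo_dune=8)
      after = Counter(beach=31, dune=9, wetland=3, cliff=54, palaeo_dune=2)
      h0, h1 = shannon_index(before), shannon_index(after)
      print(f"geodiversity change: {100 * (h1 - h0) / h0:+.1f}%")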

  5. Impact of Personal Characteristics and Technical Factors on Quantification of Sodium 18F-Fluoride Uptake in Human Arteries

    DEFF Research Database (Denmark)

    Blomberg, Björn Alexander; Thomassen, Anders; de Jong, Pim A

    2015-01-01

    Sodium (18)F-fluoride ((18)F-NaF) PET/CT imaging is a promising imaging technique for assessment of atherosclerosis, but is hampered by a lack of validated quantification protocols. Both personal characteristics and technical factors can affect quantification of arterial (18)F-NaF uptake....... This study investigated if blood activity, renal function, injected dose, circulating time, and PET/CT system affect quantification of arterial (18)F-NaF uptake. METHODS: Eighty-nine healthy subjects were prospectively examined by (18)F-NaF PET/CT imaging. Arterial (18)F-NaF uptake was quantified...... assessed the effect of personal characteristics and technical factors on quantification of arterial (18)F-NaF uptake. RESULTS: NaFmax and TBRmax/mean were dependent on blood activity (β = .34 to .44, P
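
    The quantities under test are ratios of arterial uptake to blood-pool activity, which is why blood activity is a natural confounder: it enters the target-to-background ratio directly as the denominator. A minimal sketch (the values are hypothetical):

      def target_to_background(suv_artery_max, suv_blood_mean):
          # TBRmax: arterial 18F-NaF uptake normalized to blood-pool activity.
          return suv_artery_max / suv_blood_mean

      # Same arterial SUVmax read against two different blood-pool levels:
      print(target_to_background(2.1, 0.9), target_to_background(2.1, 1.3))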

  6. Climate change impacts on tree ranges: model intercomparison facilitates understanding and quantification of uncertainty.

    Science.gov (United States)

    Cheaib, Alissar; Badeau, Vincent; Boe, Julien; Chuine, Isabelle; Delire, Christine; Dufrêne, Eric; François, Christophe; Gritti, Emmanuel S; Legay, Myriam; Pagé, Christian; Thuiller, Wilfried; Viovy, Nicolas; Leadley, Paul

    2012-06-01

    Model-based projections of shifts in tree species range due to climate change are becoming an important decision support tool for forest management. However, poorly evaluated sources of uncertainty require more scrutiny before relying heavily on models for decision-making. We evaluated uncertainty arising from differences in model formulations of tree response to climate change based on a rigorous intercomparison of projections of tree distributions in France. We compared eight models ranging from niche-based to process-based models. On average, models project large range contractions of temperate tree species in lowlands due to climate change. There was substantial disagreement between models for temperate broadleaf deciduous tree species, but differences in the capacity of models to account for rising CO2 impacts explained much of the disagreement. There was good quantitative agreement among models concerning the range contractions for Scots pine. For the dominant Mediterranean tree species, Holm oak, all models foresee substantial range expansion. © 2012 Blackwell Publishing Ltd/CNRS.

  7. Quantification of the Impact of Photon Distinguishability on Measurement-Device- Independent Quantum Key Distribution

    Directory of Open Access Journals (Sweden)

    Garrett K. Simon

    2018-04-01

    Full Text Available Measurement-Device-Independent Quantum Key Distribution (MDI-QKD) is a two-photon protocol devised to eliminate eavesdropping attacks that interrogate or control the detector in realized quantum key distribution systems. In MDI-QKD, the measurements are carried out by an untrusted third party, and the measurement results are announced openly. Knowledge or control of the measurement results gives the third party no information about the secret key. Error-free implementation of the MDI-QKD protocol requires the crypto-communicating parties, Alice and Bob, to independently prepare and transmit single photons that are physically indistinguishable, with the possible exception of their polarization states. In this paper, we apply the formalism of quantum optics and Monte Carlo simulations to quantify the impact of small errors in wavelength, bandwidth, polarization and timing between Alice’s photons and Bob’s photons on the MDI-QKD quantum bit error rate (QBER). Using published single-photon source characteristics from two-photon interference experiments as a test case, our simulations predict that the finite tolerances of these sources contribute (4.04 ± 20/N_sifted)% to the QBER in an MDI-QKD implementation generating an N_sifted-bit sifted key.

  8. Wettability impact on supercritical CO2 capillary trapping: Pore-scale visualization and quantification

    Science.gov (United States)

    Hu, Ran; Wan, Jiamin; Kim, Yongman; Tokunaga, Tetsu K.

    2017-08-01

    How the wettability of pore surfaces affects supercritical (sc) CO2 capillary trapping in geologic carbon sequestration (GCS) is not well understood, and available evidence appears inconsistent. Using a high-pressure micromodel-microscopy system with image analysis, we studied the impact of wettability on scCO2 capillary trapping during short-term brine flooding (80 s, 8-667 pore volumes). Experiments on brine displacing scCO2 were conducted at 8.5 MPa and 45°C in water-wet (static contact angle θ = 20° ± 8°) and intermediate-wet (θ = 94° ± 13°) homogeneous micromodels under four different flow rates (capillary number Ca ranging from 9 × 10⁻⁶ to 8 × 10⁻⁴) with a total of eight conditions (four replicates for each). Brine invasion processes were recorded and statistical analysis was performed for over 2000 images of scCO2 saturations and scCO2 cluster characteristics. The trapped scCO2 saturation under intermediate-wet conditions is 15% higher than under water-wet conditions under the slowest flow rate (Ca ≈ 9 × 10⁻⁶). Based on the visualization and scCO2 cluster analysis, we show that the scCO2 trapping process in our micromodels is governed by bypass trapping that is enhanced by the larger contact angle. Smaller contact angles enhance cooperative pore filling and widen brine fingers (or channels), leading to smaller volumes of scCO2 being bypassed. Increased flow rates suppress this wettability effect.
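
    The capillary number quoted here is Ca = μv/σ, built from the displacing-brine viscosity, the Darcy velocity and the brine/scCO2 interfacial tension. A worked sketch with representative property values (assumed, not taken from the paper):

      # Capillary number Ca = mu * v / sigma for the brine floods.
      mu = 6.0e-4       # brine viscosity at 45 °C, Pa·s (assumed)
      sigma = 30e-3     # brine/scCO2 interfacial tension, N/m (assumed)
      for v in (4.5e-4, 4.0e-2):                # Darcy velocities, m/s (assumed)
          print(f"Ca = {mu * v / sigma:.0e}")   # ~9e-06 and ~8e-04, the study's range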

  9. Impact of attenuation correction strategies on the quantification of High Resolution Research Tomograph PET studies

    International Nuclear Information System (INIS)

    Velden, Floris H P van; Kloet, Reina W; Berckel, Bart N M van; Molthoff, Carla F M; Jong, Hugo W A M de; Lammertsma, Adriaan A; Boellaard, Ronald

    2008-01-01

    In this study, the quantitative accuracy of different attenuation correction strategies presently available for the High Resolution Research Tomograph (HRRT) was investigated. These attenuation correction methods differ in the reconstruction and processing (segmentation) algorithms used for generating a μ-image from measured 2D transmission scans, an intermediate step in the generation of 3D attenuation correction factors. Available methods are maximum-a-posteriori reconstruction (MAP-TR), unweighted OSEM (UW-OSEM) and NEC-TR, which transforms sinogram values back to their noise equivalent counts (NEC) to restore the Poisson distribution. All methods can be applied with or without μ-image segmentation. However, MAP-TR uses a μ-histogram as a prior during reconstruction. All possible strategies were evaluated using phantoms of various sizes, simulating preclinical and clinical situations. Furthermore, the effects of emission contamination of the transmission scan on the accuracy of the various attenuation correction strategies were studied. Finally, the accuracy of the various attenuation correction strategies and their relative impact on the reconstructed activity concentration (AC) were evaluated using small animal and human brain studies. For small structures, MAP-TR with human brain priors showed smaller differences in μ-values for transmission scans with and without emission contamination (<8%) than the other methods (<26%). In addition, it showed the best agreement with true AC (deviation <4.5%). A specific prior designed to take into account the presence of small-animal fixation devices only very slightly improved AC precision to 4.3%. All methods scaled μ-values of a large homogeneous phantom to within 4% of the water peak, but MAP-TR provided the most accurate AC after reconstruction. However, for clinical data MAP-TR using the default prior settings overestimated the thickness of the skull, resulting in overestimations of μ-values in regions near the skull and thus in incorrect

  10. Quantification of the impact of water as an impurity on standard physico-chemical properties of ionic liquids

    International Nuclear Information System (INIS)

    Andanson, J.-M.; Meng, X.; Traïkia, M.; Husson, P.

    2016-01-01

    Highlights: • Residual water has a negligible impact on the density of hydrophobic ionic liquids. • The density of a dry sample can be calculated from the density of a wet ionic liquid. • The viscosity of a dry sample can be calculated from that of a wet ionic liquid. • Water can be quantified by NMR spectroscopy even in dried hydrophobic ionic liquids. - Abstract: The objective of this work was to quantify the effect of the presence of water as an impurity in ionic liquids. First, the density and viscosity of five ionic liquids as well as of their aqueous solutions were measured. For hydrophobic dried ionic liquids, traces of water (50 ppm) have no measurable impact on either the density or the viscosity values. In the concentration range studied (up to 5000 ppm), a linear evolution of the molar volume of the mixture with the mole fraction composition is observed. In practice, this makes it possible to estimate the density of a neat ionic liquid provided that (i) the water content and (ii) the density of the undried sample are known. This is particularly useful for hydrophilic ionic liquids, which are difficult to dry. In the studied concentration range, a linear evolution of the relative viscosity as a function of the mass fraction composition was also observed. It is thus possible to evaluate the viscosity of the pure ionic liquid knowing the water content and the viscosity of the undried sample. The comparison of the results obtained using two viscometers confirms that a Stabinger viscometer is appropriate for precise measurement of ionic liquid viscosities. Second, NMR and IR spectroscopies were used to characterize the pure ionic liquids and their solutions with water. The sensitivity of IR spectroscopy allows neither the quantification nor the detection of water below 1 mol%. With NMR spectroscopy, water can be quantified using either the intensity or the chemical shift of the water proton peak for mole fractions as low as 200 ppm. It is even possible to detect water in
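    The practical recipe follows from the linear molar-volume mixing the abstract reports. A minimal Python sketch, under the stated linearity and with the partial molar volume of water approximated by pure water's molar volume (an assumption), and with invented numbers rather than the paper's data:

      # Dry-IL density from a wet sample: V_m(mix) = x_w*V_w + x_il*V_il.
      # The water partial molar volume is taken as pure water's (assumption).
      M_W = 18.015e-3            # kg/mol, water

      def dry_density(rho_wet, x_w, M_il):
          x_il = 1.0 - x_w
          v_mix = (x_w * M_W + x_il * M_il) / rho_wet  # mixture molar volume, m^3/mol
          v_w = M_W / 997.0                            # molar volume of water, m^3/mol
          v_il = (v_mix - x_w * v_w) / x_il            # molar volume of the dry IL
          return M_il / v_il                           # kg/m^3

      # Illustrative numbers only (not the paper's measurements):
      print(dry_density(rho_wet=1435.0, x_w=0.02, M_il=0.2802))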

  11. Impact of sequential proton density fat fraction for quantification of hepatic steatosis in nonalcoholic fatty liver disease.

    Science.gov (United States)

    Idilman, Ilkay S; Keskin, Onur; Elhan, Atilla Halil; Idilman, Ramazan; Karcaaltincaba, Musturay

    2014-05-01

    To determine the utility of sequential MRI-estimated proton density fat fraction (MRI-PDFF) for quantification of the longitudinal changes in liver fat content in individuals with nonalcoholic fatty liver disease (NAFLD). A total of 18 consecutive individuals (M/F: 10/8, mean age: 47.7±9.8 years) diagnosed with NAFLD, who underwent sequential PDFF calculations for the quantification of hepatic steatosis at two different time points, were included in the study. All patients underwent T1-independent volumetric multi-echo gradient-echo imaging with T2* correction and spectral fat modeling. A close correlation for quantification of hepatic steatosis between the initial MRI-PDFF and liver biopsy was observed (rs = 0.758, p < […]) […] hepatic steatosis. The changes in serum ALT levels significantly reflected changes in MRI-PDFF in patients with NAFLD.

  12. Impact of muscular uptake and statistical noise on tumor quantification based on simulated FDG-PET studies

    International Nuclear Information System (INIS)

    Silva-Rodríguez, Jesús; Domínguez-Prado, Inés; Pardo-Montero, Juan; Ruibal, Álvaro

    2017-01-01

    Purpose: The aim of this work is to study the effect of physiological muscular uptake variations and statistical noise on tumor quantification in FDG-PET studies. Methods: We designed a realistic framework based on simulated FDG-PET acquisitions from an anthropomorphic phantom that included different muscular uptake levels and three spherical lung lesions with diameters of 31, 21 and 9 mm. A distribution of muscular uptake levels was obtained from 136 patients referred to our center for whole-body FDG-PET. Simulated FDG-PET acquisitions were generated with the Simulation System for Emission Tomography (SimSET) Monte Carlo package. Simulated data were reconstructed using an iterative Ordered Subset Expectation Maximization (OSEM) algorithm implemented in the Software for Tomographic Image Reconstruction (STIR) library. Tumor quantification was carried out using estimations of SUVmax, SUV50 and SUVmean from different noise realizations, lung lesions and multiple muscular uptakes. Results: Our analysis provided quantification variability values of 17–22% (SUVmax), 11–19% (SUV50) and 8–10% (SUVmean) when muscular uptake variations and statistical noise were included. Meanwhile, quantification variability due only to statistical noise was 7–8% (SUVmax), 3–7% (SUV50) and 1–2% (SUVmean) for large tumors (>20 mm) and 13% (SUVmax), 16% (SUV50) and 8% (SUVmean) for small tumors (<10 mm), thus showing that the variability in tumor quantification is mainly affected by muscular uptake variations when large enough tumors are considered. In addition, our results showed that quantification variability is strongly dominated by statistical noise when the injected dose decreases below 222 MBq. Conclusions: Our study revealed that muscular uptake variations between patients who are totally relaxed should be considered as an uncertainty source of tumor quantification values. - Highlights: • Distribution of muscular uptake from 136 PET
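    The three SUV summaries compared above are straightforward to compute from an ROI. A minimal Python sketch on fake data; SUV50 is taken here as the mean over voxels above 50% of SUVmax (one common definition, assumed rather than taken from the paper):

      import numpy as np

      # SUV = activity concentration (kBq/mL) normalized by injected dose
      # per body weight. Inputs below are invented, illustrative values.
      def suv_metrics(roi_kbq_ml, injected_dose_mbq, weight_kg):
          suv = roi_kbq_ml * weight_kg / (injected_dose_mbq * 1e3)  # dose in kBq
          suv_max = suv.max()
          suv_50 = suv[suv >= 0.5 * suv_max].mean()  # mean above 50% of max (assumed def.)
          suv_mean = suv.mean()
          return suv_max, suv_50, suv_mean

      rng = np.random.default_rng(0)
      roi = rng.gamma(4.0, 2.0, 500)   # fake ROI uptake values, kBq/mL
      print(suv_metrics(roi, injected_dose_mbq=222.0, weight_kg=75.0))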

  13. Impact of follicular G-CSF quantification on subsequent embryo transfer decisions: a proof of concept study.

    Science.gov (United States)

    Lédée, N; Gridelet, V; Ravet, S; Jouan, C; Gaspard, O; Wenders, F; Thonon, F; Hincourt, N; Dubois, M; Foidart, J M; Munaut, C; Perrier d'Hauterive, S

    2013-02-01

    Previous experiments have shown that granulocyte colony-stimulating factor (G-CSF), quantified in the follicular fluid (FF) of individual oocytes, correlates with the potential for an ongoing pregnancy of the corresponding fertilized oocytes among selected transferred embryos. Here we present a proof of concept study aimed at evaluating the impact of including FF G-CSF quantification in the embryo transfer decisions. FF G-CSF was quantified with the Luminex XMap technology in 523 individual FF samples corresponding to 116 fresh transferred embryos, 275 frozen embryos and 131 destroyed embryos from 78 patients undergoing ICSI. Follicular G-CSF was highly predictive of subsequent implantation. Receiver operating characteristic curve methodology showed a higher discriminatory power to predict ongoing pregnancy in multivariate logistic regression analysis for FF G-CSF compared with embryo morphology [0.77 (0.69–0.83), P < […]]. Embryos were classified by their FF G-CSF concentration: Class I over 30 pg/ml (the highest positive predictive value for implantation), Class II from 30 to 18.4 pg/ml and Class III below 18.4 pg/ml. Embryos derived from Class I follicles had a significantly higher implantation rate (IR) than those from Class II and III follicles (36 versus 16.6 and 6%, P < […]). Embryos derived from Class I follicles with an optimal morphology reached an IR of 54%. Frozen-thawed embryo transfers derived from Class I follicles had an IR of 37%, significantly higher than those from Class II and III follicles of, respectively, 8 and 5% (P < […]). Not only transferred embryos but also 10% of the destroyed embryos were derived from G-CSF Class I follicles. Non-optimal embryos appear to have been transferred in 28% (22/78) of the women, and their pregnancy rate was significantly lower than that of women who received at least one optimal embryo (18 versus 36%, P = 0.04). Monitoring FF G-CSF for the selection of embryos with a better potential for pregnancy might improve the effectiveness of IVF by reducing the time and cost

  14. SU-D-303-03: Impact of Uncertainty in T1 Measurements On Quantification of Dynamic Contrast Enhanced MRI

    Energy Technology Data Exchange (ETDEWEB)

    Aryal, M; Cao, Y [The University of Michigan, Ann Arbor, MI (United States)

    2015-06-15

    Purpose: Quantification of dynamic contrast enhanced (DCE) MRI requires measurement of the native longitudinal relaxation time (T1). This study aimed to assess the uncertainty in T1 measurements using two different methods. Methods and Materials: Brain MRI scans were performed on a 3T scanner in 9 patients who had low grade/benign tumors and partial brain radiotherapy without chemotherapy at pre-RT, week-3 during RT (wk-3), end-RT, and 1, 6 and 18 months after RT. T1-weighted images were acquired using gradient echo sequences with 1) 2 different flip angles (5° and 15°), and 2) 5 variable TRs (100–2000 ms). After creating quantitative T1 maps, the average T1 was calculated in regions of interest (ROI) that were distant from tumors and had received accumulated radiation doses of < 5 Gy by wk-3. ROIs included left and right normal putamen and thalamus (gray matter: GM), and frontal and parietal white matter (WM). Since there were no significant T1 changes, nor even a trend of change, from pre-RT to wk-3 in these ROIs, a relative repeatability coefficient (RC) of T1 was estimated as a measure of uncertainty in each ROI using the data at pre-RT and wk-3. Individual T1 changes at later time points were evaluated against the estimated RCs. Results: The 2-flip-angle method produced small RCs in GM (9.7–11.7%) but large RCs in WM (12.2–13.6%) compared to the saturation-recovery (SR) method (11.0–17.7% for GM and 7.5–11.2% for WM). More than 81% of individual T1 changes were within the T1 uncertainty ranges defined by the RCs. Conclusion: Our study suggests that the impact of T1 uncertainty on physiological parameters derived from DCE MRI is not negligible. A short scan with 2 flip angles is able to achieve repeatability of T1 estimates similar to a long scan with 5 different TRs, and is desirable to be integrated in the DCE protocol. The present study was supported by the National Institutes of Health (NIH) under grant numbers UO1 CA183848 and RO1 NS064973.
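    The 2-flip-angle approach estimates T1 from two spoiled gradient-echo acquisitions via the standard linearization of the SPGR signal equation (DESPOT1). A minimal Python sketch with a synthetic self-check; the flip angles and TR are illustrative, not necessarily the study's exact protocol:

      import numpy as np

      # SPGR signal: S = M0*sin(a)*(1-E)/(1-E*cos(a)), E = exp(-TR/T1).
      # Plot S/sin(a) vs S/tan(a): the slope is E, so T1 = -TR/ln(E).
      def t1_two_flip_angles(s1, s2, a1_deg, a2_deg, tr_ms):
          a1, a2 = np.deg2rad(a1_deg), np.deg2rad(a2_deg)
          y1, y2 = s1 / np.sin(a1), s2 / np.sin(a2)
          x1, x2 = s1 / np.tan(a1), s2 / np.tan(a2)
          e1 = (y2 - y1) / (x2 - x1)   # slope = exp(-TR/T1)
          return -tr_ms / np.log(e1)

      # Synthetic check: generate signals for a known T1 and recover it.
      tr, t1_true, m0 = 20.0, 1200.0, 1.0
      e = np.exp(-tr / t1_true)
      sig = lambda a: m0 * np.sin(np.deg2rad(a)) * (1 - e) / (1 - e * np.cos(np.deg2rad(a)))
      print(t1_two_flip_angles(sig(5), sig(15), 5, 15, tr))   # ~1200 ms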

  15. Detection and quantification of microparticles from different cellular lineages using flow cytometry. Evaluation of the impact of secreted phospholipase A2 on microparticle assessment.

    Science.gov (United States)

    Rousseau, Matthieu; Belleannee, Clemence; Duchez, Anne-Claire; Cloutier, Nathalie; Levesque, Tania; Jacques, Frederic; Perron, Jean; Nigrovic, Peter A; Dieude, Melanie; Hebert, Marie-Josee; Gelb, Michael H; Boilard, Eric

    2015-01-01

    Microparticles, also called microvesicles, are submicron extracellular vesicles produced by plasma membrane budding and shedding recognized as key actors in numerous physio(patho)logical processes. Since they can be released by virtually any cell lineages and are retrieved in biological fluids, microparticles appear as potent biomarkers. However, the small dimensions of microparticles and soluble factors present in body fluids can considerably impede their quantification. Here, flow cytometry with improved methodology for microparticle resolution was used to detect microparticles of human and mouse species generated from platelets, red blood cells, endothelial cells, apoptotic thymocytes and cells from the male reproductive tract. A family of soluble proteins, the secreted phospholipases A2 (sPLA2), comprises enzymes concomitantly expressed with microparticles in biological fluids and that catalyze the hydrolysis of membrane phospholipids. As sPLA2 can hydrolyze phosphatidylserine, a phospholipid frequently used to assess microparticles, and might even clear microparticles, we further considered the impact of relevant sPLA2 enzymes, sPLA2 group IIA, V and X, on microparticle quantification. We observed that if enriched in fluids, certain sPLA2 enzymes impair the quantification of microparticles depending on the species studied, the source of microparticles and the means of detection employed (surface phosphatidylserine or protein antigen detection). This study provides analytical considerations for appropriate interpretation of microparticle cytofluorometric measurements in biological samples containing sPLA2 enzymes.

  16. Impact of improved attenuation correction featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR.

    Science.gov (United States)

    Oehmigen, Mark; Lindemann, Maike E; Gratz, Marcel; Kirchner, Julian; Ruhlmann, Verena; Umutlu, Lale; Blumhagen, Jan Ole; Fenchel, Matthias; Quick, Harald H

    2018-04-01

    Recent studies have shown an excellent correlation between PET/MR and PET/CT hybrid imaging in detecting lesions. However, a systematic underestimation of PET quantification in PET/MR has been observed. This is attributable to two methodological challenges of MR-based attenuation correction (AC): (1) lack of bone information, and (2) truncation of the MR-based AC maps (μmaps) along the patient arms. The aim of this study was to evaluate the impact of improved AC featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR. The MR-based Dixon method provides four-compartment μmaps (background air, lungs, fat, soft tissue) which served as a reference for PET/MR AC in this study. A model-based bone atlas provided bone tissue as a fifth compartment, while the HUGE method provided truncation correction. The study population comprised 51 patients with oncological diseases, all of whom underwent a whole-body PET/MR examination. Each whole-body PET dataset was reconstructed four times using standard four-compartment μmaps, five-compartment μmaps, four-compartment μmaps + HUGE, and five-compartment μmaps + HUGE. The SUVmax for each lesion was measured to assess the impact of each μmap on PET quantification. All four μmaps in each patient provided robust results for reconstruction of the AC PET data. Overall, SUVmax was quantified in 99 tumours and lesions. Compared to the reference four-compartment μmap, the mean SUVmax of all 99 lesions increased by 1.4 ± 2.5% when bone was added, by 2.1 ± 3.5% when HUGE was added, and by 4.4 ± 5.7% when bone + HUGE was added. Larger quantification bias of up to 35% was found for single lesions when bone and truncation correction were added to the μmaps, depending on their individual location in the body. The novel AC method, featuring a bone model and truncation correction, improved PET quantification in whole-body PET/MR imaging. Short reconstruction times, straightforward

  17. Impact of improved attenuation correction featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR

    Energy Technology Data Exchange (ETDEWEB)

    Oehmigen, Mark; Lindemann, Maike E. [University Hospital Essen, High Field and Hybrid MR Imaging, Essen (Germany); Gratz, Marcel; Quick, Harald H. [University Hospital Essen, High Field and Hybrid MR Imaging, Essen (Germany); University Duisburg-Essen, Erwin L. Hahn Institute for MR Imaging, Essen (Germany); Kirchner, Julian [University Dusseldorf, Department of Diagnostic and Interventional Radiology, Medical Faculty, Dusseldorf (Germany); Ruhlmann, Verena [University Hospital Essen, Department of Nuclear Medicine, Essen (Germany); Umutlu, Lale [University Hospital Essen, Department of Diagnostic and Interventional Radiology and Neuroradiology, Essen (Germany); Blumhagen, Jan Ole; Fenchel, Matthias [Siemens Healthcare GmbH, Erlangen (Germany)

    2018-04-15

    Recent studies have shown an excellent correlation between PET/MR and PET/CT hybrid imaging in detecting lesions. However, a systematic underestimation of PET quantification in PET/MR has been observed. This is attributable to two methodological challenges of MR-based attenuation correction (AC): (1) lack of bone information, and (2) truncation of the MR-based AC maps (μmaps) along the patient arms. The aim of this study was to evaluate the impact of improved AC featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR. The MR-based Dixon method provides four-compartment μmaps (background air, lungs, fat, soft tissue) which served as a reference for PET/MR AC in this study. A model-based bone atlas provided bone tissue as a fifth compartment, while the HUGE method provided truncation correction. The study population comprised 51 patients with oncological diseases, all of whom underwent a whole-body PET/MR examination. Each whole-body PET dataset was reconstructed four times using standard four-compartment μmaps, five-compartment μmaps, four-compartment μmaps + HUGE, and five-compartment μmaps + HUGE. The SUVmax for each lesion was measured to assess the impact of each μmap on PET quantification. All four μmaps in each patient provided robust results for reconstruction of the AC PET data. Overall, SUVmax was quantified in 99 tumours and lesions. Compared to the reference four-compartment μmap, the mean SUVmax of all 99 lesions increased by 1.4 ± 2.5% when bone was added, by 2.1 ± 3.5% when HUGE was added, and by 4.4 ± 5.7% when bone + HUGE was added. Larger quantification bias of up to 35% was found for single lesions when bone and truncation correction were added to the μmaps, depending on their individual location in the body. The novel AC method, featuring a bone model and truncation correction, improved PET quantification in whole-body PET/MR imaging. Short reconstruction times, straightforward

  18. Impact of bileaflet mitral valve prolapse on quantification of mitral regurgitation with cardiac magnetic resonance: a single-center study.

    Science.gov (United States)

    Vincenti, Gabriella; Masci, Pier Giorgio; Rutz, Tobias; De Blois, Jonathan; Prša, Milan; Jeanrenaud, Xavier; Schwitter, Juerg; Monney, Pierre

    2017-07-27

    To quantify mitral regurgitation (MR) with CMR, the regurgitant volume can be calculated as the difference between the left ventricular (LV) stroke volume (SV) measured with the Simpson method and a reference SV, i.e. the right ventricular SV (RVSV) in patients without tricuspid regurgitation. However, in patients with prominent mitral valve prolapse (MVP), the Simpson method may underestimate the LV end-systolic volume (LVESV), as it only considers the volume located between the apex and the mitral annulus and neglects the ventricular volume that is displaced into the left atrium but contained within the prolapsed mitral leaflets at end systole. This may lead to an underestimation of LVESV, and hence to an overestimation of LVSV and of mitral regurgitation. The aim of the present study was to assess the impact of prominent MVP on MR quantification by CMR. In patients with MVP (and no more than trace tricuspid regurgitation), MR was quantified by calculating the regurgitant volume as the difference between LVSV and RVSV. LVSVuncorr was calculated conventionally as LV end-diastolic volume (LVEDV) minus LVESV. A corrected LVESVcorr was calculated as the LVESV plus the prolapsed volume, i.e. the volume between the mitral annulus and the prolapsing mitral leaflets. The two methods were compared with respect to MR grading. MR grades were defined as absent or trace, mild (5–29% regurgitant fraction (RF)), moderate (30–49% RF), or severe (≥50% RF). In 35 patients (44.0 ± 23.0 y, 14 males, 20 patients with MR) the prolapsed volume was 16.5 ± 8.7 ml. The two methods were concordant in only 12 (34%) patients, as the uncorrected method indicated a 1-grade higher MR severity in 23 (66%) patients. For the uncorrected/corrected method, the distribution of the MR grades as absent-trace (0 vs 11, respectively), mild (20 vs 18), moderate (11 vs 5), and severe (4 vs 1) was significantly different (p < […])
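    The two grading pipelines described above reduce to simple volume arithmetic. A minimal Python sketch with illustrative volumes (mL), not patient data; the only correction is adding the prolapsed volume to LVESV:

      # Regurgitant volume = LVSV - RVSV, with LVSV computed either without
      # or with the prolapsed volume added to LVESV. Assumes no tricuspid
      # regurgitation, as in the study's inclusion criteria.
      def mr_fraction(lvedv, lvesv, rvsv, prolapsed_volume=0.0):
          lvesv_eff = lvesv + prolapsed_volume    # corrected end-systolic volume
          lvsv = lvedv - lvesv_eff
          reg_vol = lvsv - rvsv
          return reg_vol, 100.0 * reg_vol / lvsv  # regurgitant volume, fraction (%)

      print(mr_fraction(160, 60, 70))                         # uncorrected
      print(mr_fraction(160, 60, 70, prolapsed_volume=16.5))  # corrected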

  19. Evaluation of the impact of matrix effect on quantification of pesticides in foods by gas chromatography-mass spectrometry using isotope-labeled internal standards.

    Science.gov (United States)

    Yarita, Takashi; Aoyagi, Yoshie; Otake, Takamitsu

    2015-05-29

    The impact of the matrix effect in GC-MS quantification of pesticides in food using the corresponding isotope-labeled internal standards was evaluated. A spike-and-recovery study of nine target pesticides was first conducted using paste samples of corn, green soybean, carrot, and pumpkin. The analytical values observed using isotope-labeled internal standards were more accurate for most target pesticides than those obtained using the external calibration method, but were still biased from the spiked concentrations when a matrix-free calibration solution was used for calibration. The respective calibration curves for each target pesticide were also prepared using matrix-free calibration solutions and matrix-matched calibration solutions with blank soybean extract. The intensity ratio of the peaks of most target pesticides to that of the corresponding isotope-labeled internal standards was influenced by the presence of the matrix in the calibration solution; therefore, the observed slope varied. The ratio was also influenced by the type of injection method (splitless or on-column). These results indicate that matrix-matching of the calibration solution is required for very accurate quantification, even if isotope-labeled internal standards are used for calibration.
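    The slope comparison at the heart of this finding is easy to reproduce. A minimal Python sketch with invented peak-area ratios (not the paper's data): fit the analyte/internal-standard response ratio against concentration for solvent-based and matrix-matched standards, and compare the slopes; a slope difference signals a residual matrix effect despite the isotope-labeled internal standard.

      import numpy as np

      conc = np.array([10, 25, 50, 100, 200.0])                  # ng/mL (invented)
      ratio_solvent = np.array([0.21, 0.52, 1.05, 2.08, 4.15])   # analyte/IS area ratio
      ratio_matrix = np.array([0.18, 0.45, 0.91, 1.83, 3.66])    # in blank soybean extract

      slope_solvent = np.polyfit(conc, ratio_solvent, 1)[0]
      slope_matrix = np.polyfit(conc, ratio_matrix, 1)[0]
      print(f"matrix/solvent slope ratio: {slope_matrix / slope_solvent:.2f}")

      # Quantify an unknown against the matrix-matched curve (intercept ignored):
      unknown_ratio = 1.20
      print(unknown_ratio / slope_matrix, "ng/mL (approx.)")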

  20. Total Corporate social responsibility report 2004. Sharing our energy; TOTAL rapport societal and environnemental 2004. Notre energie en partage

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-05-15

    This document presents the social and environmental activities of the Total group for the year 2004. It provides information on the ethical aspects of governance, industrial safety, environmental policy, public health and occupational safety, social responsibility, the economic and social impact of the group's activities on local development, and the group's contribution to the fight against climate change and to the development of other energy sources. (A.L.B.)

  1. Porosity of spacer-filled channels in spiral-wound membrane systems: Quantification methods and impact on hydraulic characterization

    KAUST Repository

    Siddiqui, Amber; Lehmann, S.; Haaksman, V.; Ogier, J.; Schellenberg, C.; van Loosdrecht, M.C.M.; Kruithof, J.C.; Vrouwenvelder, Johannes S.

    2017-01-01

    The porosity of spacer-filled feed channels influences the hydrodynamics of spiral-wound membrane systems and impacts the overall performance of the system. Therefore, an exact measurement and a detailed understanding of the impact of the feed

  2. Quantification of the impact of a confounding variable on functional connectivity confirms anti-correlated networks in the resting-state.

    Science.gov (United States)

    Carbonell, F; Bellec, P; Shmuel, A

    2014-02-01

    The effect of regressing out the global average signal (GAS) in resting state fMRI data has become a concern for interpreting functional connectivity analyses. It is not clear whether the reported anti-correlations between the Default Mode and the Dorsal Attention Networks are intrinsic to the brain, or are artificially created by regressing out the GAS. Here we introduce a concept, Impact of the Global Average on Functional Connectivity (IGAFC), for quantifying the sensitivity of seed-based correlation analyses to the regression of the GAS. This voxel-wise IGAFC index is defined as the product of two correlation coefficients: the correlation between the GAS and the fMRI time course of a voxel, times the correlation between the GAS and the seed time course. This definition enables the calculation of a threshold at which the impact of regressing-out the GAS would be large enough to introduce spurious negative correlations. It also yields a post-hoc impact correction procedure via thresholding, which eliminates spurious correlations introduced by regressing out the GAS. In addition, we introduce an Artificial Negative Correlation Index (ANCI), defined as the absolute difference between the IGAFC index and the impact threshold. The ANCI allows a graded confidence scale for ranking voxels according to their likelihood of showing artificial correlations. By applying this method, we observed regions in the Default Mode and Dorsal Attention Networks that were anti-correlated. These findings confirm that the previously reported negative correlations between the Dorsal Attention and Default Mode Networks are intrinsic to the brain and not the result of statistical manipulations. Our proposed quantification of the impact that a confound may have on functional connectivity can be generalized to global effect estimators other than the GAS. It can be readily applied to other confounds, such as systemic physiological or head movement interferences, in order to quantify their
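    The IGAFC definition above lends itself to a compact implementation. A minimal Python sketch on synthetic time courses; the data and dimensions are illustrative, not the paper's pipeline:

      import numpy as np

      # Voxel-wise IGAFC = corr(GAS, voxel) * corr(GAS, seed).
      def igafc(gas, seed, voxels):
          """gas, seed: (T,) arrays; voxels: (V, T) array -> (V,) IGAFC values."""
          r_seed = np.corrcoef(gas, seed)[0, 1]
          gas_z = (gas - gas.mean()) / gas.std()
          vox_z = (voxels - voxels.mean(1, keepdims=True)) / voxels.std(1, keepdims=True)
          r_vox = vox_z @ gas_z / len(gas)   # correlation of each voxel with the GAS
          return r_vox * r_seed

      rng = np.random.default_rng(0)
      gas = rng.standard_normal(200)
      seed = 0.6 * gas + 0.8 * rng.standard_normal(200)
      voxels = 0.4 * gas + rng.standard_normal((1000, 200))
      print(igafc(gas, seed, voxels)[:5])  # values above the impact threshold flag
                                           # voxels at risk of spurious anti-correlation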

  3. Environmental status of plant-based industries. Biomass and bio-materials; Bilan environnemental des filieres vegetales. Biomasse et biomateriaux

    Energy Technology Data Exchange (ETDEWEB)

    Vindimian, E; Boeglin, N; Houillon, G; Osset, Ph; Vial, E; Leguern, Y; Gosse, G; Gabrielle, B; Dohy, M; Bewa, H; Rigal, L; Guilbert, St; Cesar, G; Pandard, P; Oster, D; Normand, N; Piccardi, M; Garoux, V; Arnaud, L; Barbier, J; Mougin, G; Krausz, P; Pluquet, V; Massacrier, L; Dussaud, J

    2005-07-01

    The French agency for environment and energy management (Ademe) and the agriculture-for-chemistry-and-energy agency (Agrice) jointly organized these technical days on the potential of plant-based products in the face of the major environmental stakes of diversifying energy sources, developing new outlets for agriculture and opening new fields of industrial innovation. This document gathers the articles and slides of the presentations given during these 2 days of conference: 1 - Biomass and life cycle analysis (LCA) - impacts and benefits: introduction to LCA (E. Vindimian), keys to understanding this environmental evaluation tool (N. Boeglin); environmental status of plant-based industries for chemistry, materials and energy: status of LCA knowledge, plant versus fossil (G. Houillon), detailed analysis of 2 industries: agro-materials and bio-polymers (J. Payet); examples of environmental and LCA studies: energy and greenhouse gas balances of the biofuel production processes (P. Osset, E. Vial), LCA of collective and industrial wood-fueled space heating (Y. Leguern), contribution and limitations of LCA for plant-based industries (G. Gosse, B. Gabrielle), conclusion of the first day (M. Dohy). 2 - Biomass and materials: a reality: biomaterials in the Agrice program (H. Bewa), plant-derived materials: resources, status and perspectives (L. Rigal); biopolymers: overview of the industrial use of biopolymers: materials and markets, applications (S. Guilbert), degradation mechanisms of biopolymers used in agriculture: biodegradability, eco-toxicity and accumulation in soils (G. Cesar, P. Pandard), present and future regulatory framework: specifications and methods for evaluating the biodegradability of materials for agriculture and horticulture (D. Oster), standardization: necessity and possibilities (N. Normand); vegetable fibers and composite materials: market for new vegetable fiber uses (M. Piccardi, V. Garoux), vegetable particulates and

  4. Superposition Quantification

    Science.gov (United States)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002; the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing; and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182

  5. The ratio of right ventricular volume to left ventricular volume reflects the impact of pulmonary regurgitation independently of the method of pulmonary regurgitation quantification

    International Nuclear Information System (INIS)

    Śpiewak, Mateusz; Małek, Łukasz A.; Petryka, Joanna; Mazurkiewicz, Łukasz; Miłosz, Barbara; Biernacka, Elżbieta K.; Kowalski, Mirosław; Hoffman, Piotr; Demkow, Marcin; Miśko, Jolanta; Rużyłło, Witold

    2012-01-01

    Background: Previous studies have advocated quantifying pulmonary regurgitation (PR) by using PR volume (PRV) instead of the commonly used PR fraction (PRF). However, physicians are not familiar with the use of PRV in clinical practice. The ratio of right ventricular (RV) volume to left ventricular volume (RV/LV) may better reflect the impact of PR on the heart than RV end-diastolic volume (RVEDV) alone. We aimed to compare the impact of PRV and PRF on RV size expressed as either the RV/LV ratio or RVEDV (mL/m²). Methods: Consecutive patients with repaired tetralogy of Fallot were included (n = 53). PRV, PRF and ventricular volumes were measured with the use of cardiac magnetic resonance. Results: RVEDV was more closely correlated with PRV than with PRF (r = 0.686, p < […]). PRV and PRF performed similarly in predicting RV dilatation defined by the RV/LV-based criterion (RV/LV > 2.0) [area under the curve (AUC) for PRV = 0.770 vs AUC for PRF = 0.777, p = 0.86]. Conversely, with the use of the RVEDV-based criterion (>170 mL/m²), PRV proved to be superior to PRF [AUC for PRV = 0.770 vs AUC for PRF = 0.656, p = 0.0028]. Conclusions: PRV and PRF have similar significance as measures of PR when the RV/LV ratio is used instead of RVEDV. The RV/LV ratio is a universal marker of RV dilatation independent of the method of PR quantification applied (PRF vs PRV)

  6. Impact of hydrogeological and geomechanical properties on surface uplift at a CO2 injection site: Parameter estimation and uncertainty quantification

    Science.gov (United States)

    Newell, P.; Yoon, H.; Martinez, M. J.; Bishop, J. E.; Arnold, B. W.; Bryant, S.

    2013-12-01

    It is essential to couple multiphase flow and the geomechanical response in order to predict the consequences of geological storage of CO2. In this study, we estimate the key hydrogeologic features that govern the geomechanical response (i.e., surface uplift) at a large-scale CO2 injection project at In Salah, Algeria, using the Sierra Toolkit - a multi-physics simulation code developed at Sandia National Laboratories. Importantly, a jointed rock model is used to study the effect of postulated fractures in the injection zone on the surface uplift. The In Salah Gas Project includes an industrial-scale demonstration of CO2 storage in an active gas field, where CO2 from natural gas production is being re-injected into a brine-filled portion of the structure downdip of the gas accumulation. The observed data include millimeter-scale surface deformations (e.g., uplift) reported in the literature and injection well locations and rate histories provided by the operators. Our preliminary results show that the intrinsic permeability and the Biot coefficient of the injection zone are important. Moreover, pre-existing fractures within the injection zone affect the uplift significantly. Estimation of additional parameters (e.g., the anisotropy ratio) and coupled parameters will help us to develop models that account for the complex relationship between mechanical integrity and CO2 injection-induced pressure changes. Uncertainty quantification of model predictions will also be performed using various algorithms, including null-space Monte Carlo and polynomial-chaos expansion methods. This work highlights that coupled reservoir and geomechanical simulations combined with parameter estimation can provide a practical solution for designing operating conditions and understanding subsurface processes associated with CO2 injection. This work is supported as part of the Center for Frontiers of Subsurface Energy Security, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office

  7. Ratio of involved/uninvolved immunoglobulin quantification by Hevylite™ assay: clinical and prognostic impact in multiple myeloma

    Directory of Open Access Journals (Sweden)

    Koulieris Efstathios

    2012-04-01

    Background HevyLite™ is a new, recently developed method that facilitates separate quantification of the kappa-bound and lambda-bound amounts of a given immunoglobulin (Ig). Using this method, we measured intact immunoglobulins (heavy/light chain; HLC: IgG-kappa, IgG-lambda, IgA-kappa, IgA-lambda) individually, as well as the ratios derived from them (HLCR), in a series of IgG or IgA multiple myeloma (MM) patients, to investigate and assess the contribution of these tests to disease evaluation. Patients and methods HevyLite™ assays were used in sera from 130 healthy individuals (HI) and 103 MM patients at the time of diagnosis. In the patients, the paraprotein was IgG in 78 (52 IgG-kappa, 26 IgG-lambda) and IgA in 25 (13 IgA-kappa, 12 IgA-lambda). Durie-Salmon and International Staging System stages were evenly distributed. Symptomatic patients (n = 77) received treatment while asymptomatic ones (n = 26) were followed. The patients' median follow-up was 32.6 months. HLCR was calculated with the involved Ig (either G or A) as numerator. Results In HI, the median IgG-kappa was 6.85, IgG-lambda 3.81, IgA-kappa 1.19 and IgA-lambda 0.98 g/L. The corresponding median involved-HLC values in MM patients were 25.8, 23.45, 28.9 and 36.4 g/L. HLC-IgG was related to anemia, a high serum free light chain ratio and extensive bone marrow infiltration, while a high HLCR correlated with the same plus increased β2-microglobulin. In addition, increased HLCR and the presence of immunoparesis correlated with time to treatment. Patients with a high HLCR had a significantly shorter survival (p = 0.022); HLCR retained its prognostic value in multivariate analysis. Conclusions HLC and HLCR quantify the precise amount of the involved immunoglobulin more accurately than other methods; moreover, they carry prognostic information regarding survival in MM patients.

  8. Quantification of Multiple Climate Change and Human Activity Impact Factors on Flood Regimes in the Pearl River Delta of China

    Directory of Open Access Journals (Sweden)

    Yihan Tang

    2016-01-01

    Coastal flood regimes have been irreversibly altered by both climate change and human activities. This paper aims to quantify the impacts of multiple factors on delta floods. The Pearl River Delta (PRD), with its dense river network and population, is one of the most developed coastal areas in China. The recorded extreme water level (m.s.l.) in the flood season has been heavily affected by varied incoming flood flow, sea-level rise, and dredged riverbeds. A methodology, composed of a numerical model and the index R, has been developed to quantify the impacts of these driving factors in the PRD. Results show that the flood level varied 4.29%–53.49% from the change of fluvial discharge, 3.35%–38.73% from riverbed dredging, and 0.12%–16.81% from sea-level rise. The variation of flood flow clearly has the largest effect and sea-level rise the smallest. In particular, the dense river network intensifies the impact of incoming flood change and sea-level rise. Findings from this study help in understanding the causes of the PRD flood regimes and provide theoretical support for flood protection in the delta region.

  9. SU-F-R-28: Correction of FCh-PET Bladder Uptake Using Virtual Sinograms and Investigation of Its Impact On the Quantification of Prostate Textural Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Laberge, S; Beauregard, J; Archambault, L [CHUQ Pavillon Hotel-Dieu de Quebec, Quebec, QC (Canada)

    2016-06-15

    Purpose: Textural biomarkers as a tool for quantifying intratumoral heterogeneity hold great promise for diagnosis and early assessment of treatment response in prostate cancer. However, spill-in counts from the bladder uptake are suspected to have an impact on the textural measurements of the prostate volume. This work proposes a correction method for the FCh-PET bladder uptake and investigates its impact on intraprostatic textural properties. Methods: Two patients with PC received pre-treatment dynamic FCh-PET scans reconstructed at four time points (interval: 2 min), for which prostate and bladder contours were obtained. Projection bins affected by bladder uptake were determined by forward-projection. For each time point and axial position, virtual sinograms were obtained and affected bins replaced by a weighted combination of original values and values interpolated using cubic spline from non-affected bins of the current and adjacent projection angles. The process was optimized using a genetic algorithm in terms of minimization of the root-mean-square error (RMSE) within the bladder between the corrected dynamic time point volume and a reference initial uptake volume. Finally, the impact of the bladder uptake correction on the prostate region was investigated using two standard SUV metrics (1) and three texture metrics (2): 1) SUVmax, SUVmean; 2) Contrast, Homogeneity, Coarseness. Results: Without bladder uptake correction, SUVmax and SUVmean were on average overestimated in the prostate by 0%, 0%, 33.2%, 51.2%, and 3.6%, 6.0%, 2.9%, 3.2%, for each time point respectively. Contrast varied by −9.1%, −6.7%, +40.4%, +107.7%, and Homogeneity and Coarseness by +4.5%, +1.8%, −8.8%, −14.8% and +1.0%, +0.5%, −9.5%, +0.9%. Conclusion: We proposed a method for FCh-PET bladder uptake correction and showed an impact on the quantification of the prostate signal. This method achieved a large reduction of intra-prostatic SUVmax while minimizing the impact on SUVmean

  10. Long Term Quantification of Climate and Land Cover Change Impacts on Streamflow in an Alpine River Catchment, Northwestern China

    Directory of Open Access Journals (Sweden)

    Zhenliang Yin

    2017-07-01

    Quantifying the long term impacts of climate and land cover change on streamflow is of great importance for sustainable water resources management in inland river basins. The Soil and Water Assessment Tool (SWAT) model was employed to simulate the streamflow in the upper reaches of the Heihe River Basin, northwestern China, over the last half century. The Sequential Uncertainty Fitting algorithm (SUFI-2) was selected to calibrate and validate the SWAT model. The results showed that both the Nash-Sutcliffe efficiency (NSE) and the determination coefficient (R2) were over 0.93 for the calibration and validation periods; the percent bias (PBIAS) of the two periods was −3.47% and 1.81%, respectively. The precipitation and the average, maximum, and minimum air temperature all showed increasing trends, of 14.87 mm/10 years, 0.30 °C/10 years, 0.27 °C/10 years, and 0.37 °C/10 years, respectively. The runoff coefficient increased from 0.36 (averaged over 1964 to 1988) to 0.39 (averaged over 1989 to 2013). Based on the SWAT simulation, we quantified the contributions of climate and land cover change to streamflow change: land cover change had a positive impact on river discharge, increasing the streamflow by 7.12% during 1964 to 1988, and climate change contributed 14.08% to the streamflow increase over the last 50 years. Meanwhile, the climate change impact intensified after the 2000s. The cold season (November to the following March) contributed 64.1% of the total streamflow increase and the warm season (April to October) 35.9%. The results provide references for dealing with climate and land cover change in an inland river basin for water resource management and planning.
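    The two calibration scores quoted above are standard and compact. A minimal Python sketch with invented monthly flows (not the Heihe data); NSE = 1 − Σ(obs − sim)² / Σ(obs − mean(obs))², PBIAS = 100 · Σ(obs − sim) / Σobs:

      import numpy as np

      def nse(obs, sim):
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def pbias(obs, sim):
          # Positive values indicate model underestimation (SWAT convention).
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 100.0 * np.sum(obs - sim) / np.sum(obs)

      obs = [12.1, 30.4, 55.0, 80.2, 42.3]   # observed monthly flows (illustrative)
      sim = [11.8, 29.0, 57.5, 78.9, 44.0]   # simulated flows (illustrative)
      print(f"NSE = {nse(obs, sim):.3f}, PBIAS = {pbias(obs, sim):.2f}%")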

  11. Porosity of spacer-filled channels in spiral-wound membrane systems: Quantification methods and impact on hydraulic characterization

    KAUST Repository

    Siddiqui, Amber

    2017-04-13

    The porosity of spacer-filled feed channels influences the hydrodynamics of spiral-wound membrane systems and impacts the overall performance of the system. Therefore, an exact measurement and a detailed understanding of the impact of the feed channel porosity is required to understand and improve the hydrodynamics of spiral-wound membrane systems applied for desalination and wastewater reuse. The objectives of this study were to assess the accuracy of porosity measurement techniques for feed spacers differing in geometry and thickness and the consequences of using an inaccurate method on hydrodynamic predictions, which may affect permeate production. Six techniques were applied to measure the porosity, namely three volumetric calculations based on spacer strand count together with cuboidal (SC), cylindrical (VCC) and ellipsoidal volume calculation (VCE), and three independent techniques based on volume displacement (VD), weight and density (WD) and computed tomography scanning (CT). The CT method was introduced as an alternative for the five other methods already existing and applied in practice. The six feed spacers used for the porosity measurement differed in filament thickness, angle between the filaments and mesh size. The results of the studies showed differences between the porosities measured by the six methods. The results of the microscopic techniques SC, VCC and VCE deviated significantly from measurements by VD, WD and CT, which showed similar porosity values for all spacer types. Depending on the maximum deviation of the porosity measurement techniques from −6% to +6%, (i) the linear velocity deviations were −5.6% and +6.4%, respectively, and (ii) the pressure drop deviations were −31% and +43%, respectively, illustrating the importance of an accurate porosity measurement. Because of the accuracy and standard deviation, the VD and WD methods should be applied for the porosity determination of spacer-filled channels, while the CT method is recommended for
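    Two of the compared definitions reduce to simple volume bookkeeping. A minimal Python sketch with invented numbers (not the study's spacers): channel porosity = 1 − (spacer volume / channel volume), with the spacer volume obtained either by volume displacement (VD) or from weight and material density (WD):

      def porosity_from_spacer_volume(v_spacer_ml, channel_ml):
          return 1.0 - v_spacer_ml / channel_ml

      def spacer_volume_wd(mass_g, density_g_ml):
          # WD method: spacer volume from its weight and polymer density.
          return mass_g / density_g_ml

      channel = 10.0   # total channel volume, mL (assumed)
      print(porosity_from_spacer_volume(1.5, channel))                          # VD
      print(porosity_from_spacer_volume(spacer_volume_wd(1.4, 0.95), channel))  # WD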

  12. Impact of deep learning on the normalization of reconstruction kernel effects in imaging biomarker quantification: a pilot study in CT emphysema

    Science.gov (United States)

    Jin, Hyeongmin; Heo, Changyong; Kim, Jong Hyo

    2018-02-01

    Differing reconstruction kernels are known to strongly affect the variability of imaging biomarkers and thus remain a barrier to translating computer-aided quantification techniques into clinical practice. This study presents a deep learning application to CT kernel conversion, which converts a CT image reconstructed with a sharp kernel into one reconstructed with a standard kernel, and evaluates its impact on the variability reduction of a pulmonary imaging biomarker, the emphysema index (EI). Forty cases of low-dose chest CT exams obtained with 120 kVp, 40 mAs, 1 mm thickness and two reconstruction kernels (B30f, B50f) were selected from the low-dose lung cancer screening database of our institution. A fully convolutional network was implemented with the Keras deep learning library. The model consisted of symmetric layers to capture the context and fine structure characteristics of CT images from the standard and sharp reconstruction kernels. Pairs from the full-resolution CT data set were fed to the input and output nodes to train the convolutional network to learn the appropriate filter kernels for converting CT images from the sharp kernel to the standard kernel, with the mean squared error between the converted and target images as the training criterion. EIs (RA950 and Perc15) were measured with a software package (ImagePrism Pulmo, Seoul, South Korea) and compared for the data sets of B50f, B30f, and the converted B50f. The effect of kernel conversion was evaluated with the mean and standard deviation of pair-wise differences in EI. The population mean of RA950 was 27.65 ± 7.28% for the B50f data set, 10.82 ± 6.71% for the B30f data set, and 8.87 ± 6.20% for the converted B50f data set. The mean of pair-wise absolute differences in RA950 between B30f and B50f was reduced from 16.83% to 1.95% using kernel conversion. Our study demonstrates the feasibility of applying the deep learning technique for CT kernel conversion and reducing the kernel-induced variability of EI quantification. The deep learning model has a
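    The two emphysema indices compared above are both simple functions of the lung HU histogram. A minimal Python sketch on synthetic HU values (not CT data): RA950 is the percentage of lung voxels below −950 HU, and Perc15 is the 15th percentile of the histogram:

      import numpy as np

      def ra950(lung_hu):
          lung_hu = np.asarray(lung_hu)
          return 100.0 * np.mean(lung_hu < -950)   # % of voxels below -950 HU

      def perc15(lung_hu):
          return np.percentile(lung_hu, 15)        # 15th percentile, HU

      rng = np.random.default_rng(1)
      lung = rng.normal(-860, 60, size=100_000)    # synthetic lung HU values
      print(f"RA950 = {ra950(lung):.2f}%, Perc15 = {perc15(lung):.1f} HU")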

  13. Quantification of the impact of macrophytes on oxygen dynamics and nitrogen retention in a vegetated lowland river

    Science.gov (United States)

    Desmet, N. J. S.; Van Belleghem, S.; Seuntjens, P.; Bouma, T. J.; Buis, K.; Meire, P.

    When macrophytes are growing in the river, the vegetation induces substantial changes in water quality. Some effects are the result of direct interactions, such as photosynthetic activity or nutrient uptake, whereas others may be attributed to indirect effects of the water plants on hydrodynamics and river processes. This research focused on the direct effect of macrophytes on oxygen dynamics and nutrient cycling. Discharge, macrophyte biomass density, basic water quality, dissolved oxygen and nutrient concentrations were monitored in situ throughout the year in a lowland river (Nete catchment, Belgium). In addition, various processes were investigated in more detail in multiple ex situ experiments. The field and aquaria measurement results clearly demonstrated that aquatic plants can exert considerable impact on dissolved oxygen dynamics in a lowland river. When the river was dominated by macrophytes, dissolved oxygen concentrations varied from 5 to 10 mg l⁻¹. Considering nutrient retention, it was shown that the investigated in-stream macrophytes could take up dissolved inorganic nitrogen (DIN) from the water column at rates of 33–50 mg N kg(dry matter)⁻¹ h⁻¹, and DIN fluxes towards the vegetation were found to vary from 0.03 to 0.19 g N ha⁻¹ h⁻¹ in spring and summer. Compared to the measured changes in DIN load over the river stretch, this means that about 3–13% of the DIN retention could be attributed to direct nitrogen uptake from the water by macrophytes. Yet the role of macrophytes in rivers should not be underrated, as aquatic vegetation also exerts considerable indirect effects that may have a greater impact than the direct fixation of nutrients into the plant biomass.

  14. Quantification of the impact of precipitation spatial distribution uncertainty on predictive uncertainty of a snowmelt runoff model

    Science.gov (United States)

    Jacquin, A. P.

    2012-04-01

    This study is intended to quantify the impact of uncertainty about the precipitation spatial distribution on the predictive uncertainty of a snowmelt runoff model. This problem is especially relevant in mountain catchments with a sparse precipitation observation network and relatively short precipitation records. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment's glaciers. Precipitation amounts at each elevation zone i are estimated as the product of the observed precipitation at a station and a precipitation factor FPi. If other precipitation data are not available, these precipitation factors must be adjusted during the calibration process and are thus seen as parameters of the model. In the case of the fifth zone, glaciers are seen as an inexhaustible source of water that melts when the snow cover is depleted. The catchment case study is the Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. The model's predictive uncertainty is measured in terms of the output variance of the mean squared error of the Box-Cox transformed discharge, the relative volumetric error, and the weighted average of snow water equivalent in the elevation zones at the end of the simulation period. Sobol's variance decomposition (SVD) method is used for assessing the impact of the precipitation spatial distribution, represented by the precipitation factors FPi, on the model's predictive uncertainty. In the SVD method, the first-order effect of a parameter (or group of parameters) indicates the fraction of predictive uncertainty that could be reduced if the true value of this parameter (or group) were known. Similarly, the total effect of a parameter (or group) measures the fraction of predictive uncertainty that would remain if the true value of this parameter (or group) were unknown, but all the remaining model parameters could be fixed
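    The first-order and total Sobol effects described above can be estimated by Monte Carlo with the Saltelli sampling scheme. A minimal Python sketch in which a toy analytic function stands in for the runoff model; the function and the parameter ranges are invented for illustration only:

      import numpy as np

      def toy_model(x):            # x: (n, 3) array of precipitation factors
          return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

      rng = np.random.default_rng(42)
      n, d = 100_000, 3
      A = rng.uniform(0.5, 2.0, (n, d))   # two independent sample matrices
      B = rng.uniform(0.5, 2.0, (n, d))
      yA, yB = toy_model(A), toy_model(B)
      var_y = np.var(np.concatenate([yA, yB]))

      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]             # replace column i of A with B's column
          yABi = toy_model(ABi)
          s1 = np.mean(yB * (yABi - yA)) / var_y        # first-order effect
          st = 0.5 * np.mean((yA - yABi) ** 2) / var_y  # total effect
          print(f"FP{i + 1}: S1 = {s1:.3f}, ST = {st:.3f}")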

  15. Quantification of source impact to PM using three-dimensional weighted factor model analysis on multi-site data

    Science.gov (United States)

    Shi, Guoliang; Peng, Xing; Huangfu, Yanqi; Wang, Wei; Xu, Jiao; Tian, Yingze; Feng, Yinchang; Ivey, Cesunica E.; Russell, Armistead G.

    2017-07-01

    Source apportionment technologies are used to understand the impacts of important sources on particulate matter (PM) air quality, and are widely used for both scientific studies and air quality management. Generally, receptor models apportion speciated PM data from a single sampling site. With the development of large-scale monitoring networks, PM speciation is observed at multiple sites in an urban area. For these situations, the models should account for three factors, or dimensions, of the PM: the chemical species concentrations, the sampling periods and the sampling site information, suggesting the potential power of a three-dimensional source apportionment approach. However, the principle of the three-dimensional Parallel Factor Analysis (ordinary PARAFAC) model does not always work well in real environmental situations for multi-site receptor datasets. In this work, a new three-way receptor model, called the "multi-site three way factor analysis" model, is proposed to deal with multi-site receptor datasets. Synthetic datasets were developed and introduced into the new model to test its performance. The average absolute error (AAE, between estimated and true contributions) was less than 50% for all extracted sources. Additionally, three-dimensional ambient datasets from a Chinese mega-city, Chengdu, were analyzed using the new model to assess its application. Four factors are extracted by the multi-site WFA3 model: secondary sources have the highest contributions (64.73 and 56.24 μg/m³), followed by vehicular exhaust (30.13 and 33.60 μg/m³), crustal dust (26.12 and 29.99 μg/m³) and coal combustion (10.73 and 14.83 μg/m³). The model was also compared to PMF, with general agreement, though PMF suggested a lower crustal contribution.

  16. Quantification of cellular NEMO content and its impact on NF-κB activation by genotoxic stress.

    Directory of Open Access Journals (Sweden)

    Byounghoon Hwang

    NF-κB essential modulator, NEMO, plays a key role in canonical NF-κB signaling induced by a variety of stimuli, including cytokines and genotoxic agents. To dissect the different biochemical and functional roles of NEMO in NF-κB signaling, various mutant forms of NEMO have been previously analyzed. However, transient or stable overexpression of wild-type NEMO can significantly inhibit NF-κB activation, thereby confounding the analysis of NEMO mutant phenotypes. What levels of NEMO overexpression lead to such an artifact, and what levels are tolerated with no significant impact on NEMO function in NF-κB activation, are currently unknown. Here we purified full-length recombinant human NEMO protein and used it as a standard to quantify the average number of NEMO molecules per cell in a 1.3E2 NEMO-deficient murine pre-B cell clone stably reconstituted with full-length human NEMO (C5). We determined that the C5 cell clone has an average of 4 × 10⁵ molecules of NEMO per cell. Stable reconstitution of 1.3E2 cells with different numbers of NEMO molecules per cell demonstrated that a 10-fold range of NEMO expression (0.6–6 × 10⁵ molecules per cell) yields statistically equivalent NF-κB activation in response to the DNA damaging agent etoposide. Using the C5 cell line, we also quantified the number of NEMO molecules per cell in several commonly employed human cell lines. These results establish baseline numbers of endogenous NEMO per cell and highlight the surprisingly normal functionality of NEMO in the DNA damage pathway over a wide range of expression levels, which can provide a guideline for future NEMO reconstitution studies.
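    The conversion from a recombinant-standard measurement to molecules per cell is simple unit arithmetic. A minimal Python sketch assuming a densitometry-style standard curve (the abstract does not specify the assay; the curve, signals and molar mass below are illustrative assumptions): molecules = (mass / molar mass) × Avogadro's number, divided by the number of cells loaded.

      import numpy as np

      N_A = 6.022e23
      MW_NEMO = 48_000.0                         # approx. molar mass, g/mol (assumed)

      std_ng = np.array([1, 2, 5, 10, 20.0])     # recombinant NEMO loaded, ng (invented)
      std_signal = np.array([0.9, 2.1, 5.2, 9.8, 20.5])  # band intensities (invented)
      slope = np.polyfit(std_ng, std_signal, 1)[0]       # signal per ng

      lysate_signal, cells_loaded = 6.3, 1.0e5   # lysate from a known cell count
      ng_in_lane = lysate_signal / slope
      molecules_per_cell = (ng_in_lane * 1e-9 / MW_NEMO) * N_A / cells_loaded
      print(f"~{molecules_per_cell:.2e} NEMO molecules per cell")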

  17. Radiation in fog: quantification of the impact on fog liquid water based on ground-based remote sensing

    Science.gov (United States)

    Wærsted, Eivind G.; Haeffelin, Martial; Dupont, Jean-Charles; Delanoë, Julien; Dubuisson, Philippe

    2017-09-01

    Radiative cooling and heating impact the liquid water balance of fog and therefore play an important role in determining their persistence or dissipation. We demonstrate that a quantitative analysis of the radiation-driven condensation and evaporation is possible in real time using ground-based remote sensing observations (cloud radar, ceilometer, microwave radiometer). Seven continental fog events in midlatitude winter are studied, and the radiative processes are further explored through sensitivity studies. The longwave (LW) radiative cooling of the fog is able to produce 40–70 g m⁻² h⁻¹ of liquid water by condensation when the fog liquid water path exceeds 30 g m⁻² and there are no clouds above the fog, which corresponds to renewing the fog water in 0.5–2 h. The variability is related to fog temperature and atmospheric humidity, with warmer fog below a drier atmosphere producing more liquid water. The appearance of a cloud layer above the fog strongly reduces the LW cooling relative to a situation with no cloud above; the effect is strongest for a low cloud, when the reduction can reach 100%. Consequently, the appearance of clouds above will perturb the liquid water balance in the fog and may therefore induce fog dissipation. Shortwave (SW) radiative heating by absorption by fog droplets is smaller than the LW cooling, but it can contribute significantly, inducing 10–15 g m⁻² h⁻¹ of evaporation in thick fog at (winter) midday. The absorption of SW radiation by unactivated aerosols inside the fog is likely less than 30% of the SW absorption by the water droplets, in most cases. However, the aerosols may contribute more significantly if the air mass contains a high concentration of absorbing aerosols. The absorbed radiation at the surface can reach 40–120 W m⁻² during the daytime depending on the fog thickness. As in situ measurements indicate that 20–40% of this energy is transferred to the fog as sensible heat, this surface absorption can contribute
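    The condensation rates quoted above follow from a simple energy balance: a net longwave flux divergence F (in W m⁻²) across the fog layer produces condensation at a rate F / L_v. A minimal Python sketch of that conversion; the flux value is an illustrative assumption:

      L_V = 2.5e6   # latent heat of vaporization, J/kg

      def condensation_rate_g_m2_h(net_cooling_w_m2):
          # (W m^-2) / (J kg^-1) = kg m^-2 s^-1; convert to g m^-2 h^-1
          return net_cooling_w_m2 / L_V * 1000.0 * 3600.0

      print(condensation_rate_g_m2_h(40.0))  # ~58 g m^-2 h^-1, within the 40-70 range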

  18. Quantification of emissions from domestic heating in residential areas of İzmir, Turkey and assessment of the impact on local/regional air-quality

    Energy Technology Data Exchange (ETDEWEB)

    Sari, Deniz, E-mail: deniz.sari@tubitak.gov.tr [TUBITAK Marmara Research Center, Environment and Cleaner Production Institute, 41470 Kocaeli (Turkey); Bayram, Abdurrahman [Department of Environmental Engineering, Faculty of Engineering, Dokuz Eylul University, Kaynaklar Campus, 35160 Buca, Izmir (Turkey)

    2014-08-01

    Air pollution in cities is a major environmental problem, principally in developing countries. The quantification of emissions is a basic requirement for assessing the human influence on the atmosphere. In big cities, air quality generally deteriorates in the winter season, with residential emissions and meteorology as the major contributors. Meteorological conditions that are poor for the efficient mixing of air pollutants, especially inversion events, occurred during the winter months in İzmir. With this work we quantify the domestic heating emissions of particulate matter (PM10), sulfur dioxide (SO₂), nitrogen dioxide (NO₂), volatile organic compounds (VOC) and carbon monoxide (CO), together with the greenhouse gases carbon dioxide (CO₂), nitrous oxide (N₂O) and methane (CH₄), in İzmir for the 2008–2009 winter season. The results showed that the residential areas most affected by domestic heating emissions were the central districts in the city center, for meteorological and demographic reasons. Air quality modeling is a valuable tool for showing policy makers how to decrease emissions and improve air quality. In the second part of the study, the calculated emissions were modeled using the CALMET/CALPUFF dispersion modeling system and plotted as air pollution maps using a geographical information system, to determine the locations and estimate the effects of the new residential areas that will be established in İzmir in the future. - Highlights: • Air pollution in cities rises markedly with the onset of the winter season. • Air pollution has become a problem due to rapid urbanization in İzmir, Turkey. • Air quality deteriorates with residential emissions in İzmir's winter. • With this work we quantify the amount of domestic heating emissions for pollutants. • The impact of emissions on local air quality is determined using a dispersion model.
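    Bottom-up heating inventories of this kind are built from activity data times emission factors, E = A × EF. A minimal Python sketch with placeholder fuels and factors (invented for illustration; not the study's inventory data):

      # PM10 emission factors in g per kg of fuel burned (invented placeholders).
      EF_PM10_G_PER_KG = {"coal": 7.0, "wood": 9.0, "natural_gas": 0.01}

      def pm10_emissions_t(fuel_use_t):
          """Fuel use in tonnes per season -> PM10 emissions in tonnes."""
          return sum(fuel_use_t[f] * EF_PM10_G_PER_KG[f] * 1e-3 for f in fuel_use_t)

      district_fuel = {"coal": 12_000.0, "wood": 3_000.0, "natural_gas": 20_000.0}
      print(f"PM10: {pm10_emissions_t(district_fuel):.1f} t/season")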

  19. Effects of Respiration-Averaged Computed Tomography on Positron Emission Tomography/Computed Tomography Quantification and its Potential Impact on Gross Tumor Volume Delineation

    International Nuclear Information System (INIS)

    Chi, Pai-Chun Melinda; Mawlawi, Osama; Luo Dershan; Liao Zhongxing; Macapinlac, Homer A.; Pan Tinsu

    2008-01-01

    Purpose: Patient respiratory motion can cause image artifacts in positron emission tomography (PET) from PET/computed tomography (CT) and change the quantification of PET for thoracic patients. In this study, respiration-averaged CT (ACT) was used to remove the artifacts, and the changes in standardized uptake value (SUV) and gross tumor volume (GTV) were quantified. Methods and Materials: We incorporated the ACT acquisition in a PET/CT session for 216 lung patients, generating two PET/CT data sets for each patient. The first data set contained the clinical PET/CT in which the PET was attenuation-corrected with a helical CT (HCT); the second contained the PET/CT in which the PET was corrected with the ACT. We quantified the differences between the two data sets in image alignment, maximum SUV (SUVmax), and GTV contours. Results: Of the patients, 68% demonstrated respiratory artifacts in the HCT-corrected PET, and for all patients the artifact was removed or reduced in the corresponding ACT-corrected PET. The impact of the respiration artifact was worst for lesions smaller than 50 cm3 and located below the dome of the diaphragm. For lesions in this group, the mean SUVmax difference, GTV volume change, shift in GTV centroid location, and concordance index were 21%, 154%, 2.4 mm, and 0.61, respectively. Conclusion: This study benchmarked the differences between PET data with and without artifacts. It is important to pay attention to the potential existence of these artifacts during GTV contouring, as such artifacts may increase the uncertainties in the lesion volume and the centroid location
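
    For readers unfamiliar with the metrics compared here, the sketch below shows the conventional body-weight SUV normalization and an intersection-over-union style concordance index for two GTV masks. The exact concordance definition used by the authors is not given in the abstract, so the Jaccard-style form is an assumption.

```python
import numpy as np

def suv(activity_bq_ml, injected_dose_bq, body_weight_g):
    """Body-weight SUV: tissue activity concentration divided by
    (injected dose / body weight)."""
    return activity_bq_ml / (injected_dose_bq / body_weight_g)

def concordance_index(gtv_a, gtv_b):
    """Overlap of two binary GTV masks as intersection / union
    (assumed Jaccard-style definition)."""
    inter = np.logical_and(gtv_a, gtv_b).sum()
    union = np.logical_or(gtv_a, gtv_b).sum()
    return inter / union if union else 1.0

def suv_max_difference_pct(pet_hct, pet_act):
    """Relative SUVmax change (%) between the two attenuation corrections."""
    return 100.0 * (pet_act.max() - pet_hct.max()) / pet_hct.max()

# Tiny illustration on made-up 1-D "images" and masks.
hct = np.array([2.0, 6.5, 3.0]); act = np.array([2.0, 8.0, 3.0])
a = np.array([0, 1, 1], bool); b = np.array([1, 1, 0], bool)
print(suv_max_difference_pct(hct, act), concordance_index(a, b))
```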

  1. Impact and correction of the bladder uptake on 18F-FCH PET quantification: a simulation study using the XCAT2 phantom

    Science.gov (United States)

    Silva-Rodríguez, Jesús; Tsoumpas, Charalampos; Domínguez-Prado, Inés; Pardo-Montero, Juan; Ruibal, Álvaro; Aguiar, Pablo

    2016-01-01

    The spill-in counts from neighbouring regions can significantly bias the quantification over small regions close to high-activity extended sources. This effect can be a drawback for positron emission tomography (PET) with 18F-based radiotracers when quantitatively evaluating the bladder area for diseases such as prostate cancer. In this work, we use Monte Carlo simulations to investigate the impact of the spill-in counts from the bladder on the quantitative evaluation of prostate cancer when using 18F-fluorocholine (FCH) PET, and we propose a novel reconstruction-based correction method. Monte Carlo simulations of a modified version of the XCAT2 anthropomorphic phantom with an 18F-FCH biological distribution, variable bladder uptake and inserted prostatic tumours were used in order to obtain realistic simulated 18F-FCH data. We evaluated possible variations of the measured tumour Standardized Uptake Value (SUV) for different values of bladder uptake and propose a novel correction by appropriately adapting the image reconstruction methodology. The correction is based on the introduction of physiological background terms into the reconstruction, removing the contribution of the bladder to the final image. The bladder is segmented from the reconstructed image and then forward-projected to the sinogram space. The resulting sinograms are used as background terms for the reconstruction. SUVmax and SUVmean could be overestimated by 41% and 22%, respectively, due to the accumulation of radiotracer in the bladder, with a strong dependence on the bladder-to-lesion ratio. While the SUVs measured under these conditions are not reliable, images corrected using the proposed methodology provide better repeatability of SUVs, with biases below 6%. Results also showed remarkable improvements in visual detectability. The spill-in counts from the bladder can affect prostatic SUV measurements of 18F-FCH images, which can be corrected to less than 6% using the proposed methodology, providing reliable SUV
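
    The proposed correction amounts to treating the forward-projected bladder as a known additive term in the statistical reconstruction, analogous to the way randoms and scatter are handled in ordinary Poisson ML-EM. The toy sketch below illustrates that principle with a random stand-in for the system matrix; it is a schematic of the idea, not the authors' implementation.

```python
import numpy as np

def mlem_with_background(A, y, b, n_iter=200):
    """Poisson ML-EM for y ~ Poisson(A @ x + b), where b is a known additive
    background (here: the forward-projected, segmented bladder sinogram)."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                          # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ x + b                          # expected counts incl. bladder
        x *= (A.T @ (y / np.maximum(proj, 1e-12))) / np.maximum(sens, 1e-12)
    return x

# Toy example: 4-voxel "image", 6 detector bins, bladder in the last voxel.
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(6, 4))            # stand-in system matrix
x_true = np.array([0.2, 3.0, 0.3, 0.0])           # lesion in voxel 1, no bladder
bladder = np.array([0.0, 0.0, 0.0, 8.0])          # segmented bladder activity
y = rng.poisson(100 * (A @ (x_true + bladder)))   # noisy measured sinogram
b = 100 * (A @ bladder)                           # forward-projected bladder term
x_hat = mlem_with_background(A, y.astype(float), b)
print(np.round(x_hat / 100, 2))                   # bladder contribution removed
```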

  2. The ratio of right ventricular volume to left ventricular volume reflects the impact of pulmonary regurgitation independently of the method of pulmonary regurgitation quantification

    Energy Technology Data Exchange (ETDEWEB)

    Śpiewak, Mateusz, E-mail: mspiewak@ikard.pl [Department of Coronary Artery Disease and Structural Heart Diseases, Institute of Cardiology, Warsaw (Poland); Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Małek, Łukasz A., E-mail: lmalek@ikard.pl [Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Department of Interventional Cardiology and Angiology, Institute of Cardiology, Warsaw (Poland); Petryka, Joanna, E-mail: joannapetryka@hotmail.com [Department of Coronary Artery Disease and Structural Heart Diseases, Institute of Cardiology, Warsaw (Poland); Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Mazurkiewicz, Łukasz, E-mail: lmazurkiewicz@ikard.pl [Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Department of Cardiomyopathy, Institute of Cardiology, Warsaw (Poland); Miłosz, Barbara, E-mail: barbara-milosz@o2.pl [Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Department of Radiology, Institute of Cardiology, Warsaw (Poland); Biernacka, Elżbieta K., E-mail: kbiernacka@ikard.pl [Department of Congenital Heart Diseases, Institute of Cardiology, Warsaw (Poland); Kowalski, Mirosław, E-mail: mkowalski@ikard.pl [Department of Congenital Heart Diseases, Institute of Cardiology, Warsaw (Poland); Hoffman, Piotr, E-mail: phoffman@ikard.pl [Department of Congenital Heart Diseases, Institute of Cardiology, Warsaw (Poland); Demkow, Marcin, E-mail: mdemkow@ikard.pl [Department of Coronary Artery Disease and Structural Heart Diseases, Institute of Cardiology, Warsaw (Poland); Miśko, Jolanta, E-mail: jmisko@wp.pl [Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Department of Radiology, Institute of Cardiology, Warsaw (Poland); Rużyłło, Witold, E-mail: wruzyllo@ikard.pl [Institute of Cardiology, Warsaw (Poland)

    2012-10-15

    Background: Previous studies have advocated quantifying pulmonary regurgitation (PR) by using PR volume (PRV) instead of the commonly used PR fraction (PRF). However, physicians are not familiar with the use of PRV in clinical practice. The ratio of right ventricle (RV) volume to left ventricle volume (RV/LV) may better reflect the impact of PR on the heart than RV end-diastolic volume (RVEDV) alone. We aimed to compare the impact of PRV and PRF on RV size expressed as either the RV/LV ratio or RVEDV (mL/m{sup 2}). Methods: Consecutive patients with repaired tetralogy of Fallot were included (n = 53). PRV, PRF and ventricular volumes were measured with the use of cardiac magnetic resonance. Results: RVEDV was more closely correlated with PRV than with PRF (r = 0.686, p < 0.0001, and r = 0.430, p = 0.0014, respectively). On the other hand, both PRV and PRF showed a good correlation with the RV/LV ratio (r = 0.691, p < 0.0001, and r = 0.685, p < 0.0001, respectively). Receiver operating characteristic analysis showed that both measures of PR had similar ability to predict severe RV dilatation when the RV/LV ratio-based criterion was used, namely the RV/LV ratio > 2.0 [area under the curve (AUC){sub PRV} = 0.770 vs AUC{sub PRF} = 0.777, p = 0.86]. Conversely, with the use of the RVEDV-based criterion (>170 mL/m{sup 2}), PRV proved to be superior to PRF (AUC{sub PRV} = 0.770 vs AUC{sub PRF} = 0.656, p = 0.0028). Conclusions: PRV and PRF have similar significance as measures of PR when the RV/LV ratio is used instead of RVEDV. The RV/LV ratio is a universal marker of RV dilatation, independent of the method of PR quantification applied (PRF vs PRV).
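
    The two competing measures are related through the forward flow across the pulmonary valve: PRF expresses the regurgitant volume as a fraction of it, while PRV is the absolute volume. A minimal sketch of the quantities involved, assuming the standard definitions (the abstract itself does not spell out the formulas):

```python
def pr_fraction(prv_ml, forward_volume_ml):
    """PR fraction (%): regurgitant volume over total forward flow across
    the pulmonary valve (assumed standard definition)."""
    return 100.0 * prv_ml / forward_volume_ml

def rv_lv_ratio(rvedv_ml, lvedv_ml):
    """RV/LV end-diastolic volume ratio; the study used > 2.0 as the
    criterion for severe RV dilatation."""
    return rvedv_ml / lvedv_ml

# Hypothetical patient: 40 mL regurgitation out of a 100 mL forward volume.
print(pr_fraction(40.0, 100.0))   # 40.0 (%)
print(rv_lv_ratio(250.0, 110.0))  # ~2.27 -> severe by the RV/LV criterion
```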

  3. An integrated multi-parameter monitoring approach for the quantification and mitigation of the climate change impact on the coasts of Eastern Crete, S. Aegean Sea (Project AKTAIA)

    Science.gov (United States)

    Ghionis, George; Alexandrakis, George; Karditsa, Aikaterini; Sifnioti, Dafni; Vousdoukas, Michalis; Andreadis, Olympos; Petrakis, Stelios; Poulos, Serafim; Velegrakis, Adonis; Kampanis, Nikolaos; Lipakis, Michalis

    2014-05-01

    The AKTAIA project aims at producing new knowledge regarding the forms in which climate change manifests itself and its influence on the stability and evolution of coastal landforms along the shoreline of eastern Crete (approximate length: 757 km), taking into account the various aspects of human intervention. Aerial photographs, satellite images and orthophotomaps have been used to produce a detailed coastline map and to study the morphological characteristics of the coastal zone of Eastern Crete. More than 100 beach zones have been visited during three field campaigns, which included geomorphological and human intervention mapping, topographic, meteorological and oceanographic measurements, and sedimentological sampling and observations. In addition, two pilot sites (one in the north and one in the south part of Crete) are being monitored via coastal video monitoring systems, shore-based meteorological stations and wave-tide recorders installed in the nearshore zone. Detailed seafloor mapping, using side-scan sonar and scuba diving, and bathymetric surveys were conducted at the two pilot sites. Meteorological and oceanographic data from all existing land-based meteorological stations, oceanographic buoys and the ERA-Interim dataset are used to determine the wind and wave climate of each beach. The collected climatic, sedimentological and coastal environmental data are being integrated into a GIS database that will be used to forecast the climatic trends in the area of Crete for the coming decades and to model the impact of climate change on the future evolution of the coastal zone. New methodologies for the continuous monitoring of land-sea interaction and for the quantification of the loss of sensitive coastal zones due to sea-level rise, and a modified Coastal Vulnerability Index for a comparative evaluation of the vulnerability of the coasts, are being developed. Numerical modelling of the nearshore hydrodynamics and the

  4. Quantification of traffic generation and its environmental impacts through decisions, frameworks and measures indirectly influencing transportation; Quantifizierung der Verkehrsentstehung und deren Umweltauswirkungen durch Entscheidungen, Regelwerke und Massnahmen mit indirektem Verkehrsbezug

    Energy Technology Data Exchange (ETDEWEB)

    Holz-Rau, C.; Hesse, M.; Geier, S.; Holzhey, A.; Rau, P.; Schreiner, J.; Schenk, E. [Buero fuer Integrierte Planung, Herdecke (Germany); Arndt, W.H.; Flaemig, H.; Rogge, L.; Steinfeld, M. [Institut fuer Oekologische Wirtschaftsforschung (IOEW) gGmbH, Berlin (Germany)

    2000-09-01

    Legislation (e.g. at the Federal or State level) outside the transportation policy sphere may have indirect impacts on traffic structure and generation. Four case studies have been carried out in order to analyse transport-related impact chains, to quantify possible transportation impacts and to identify potential strategies for minimising impacts generated by non-transport-specific legislation. One result is a questionnaire designed as a further element of transportation impact assessment. The goal is to quickly evaluate potential impacts and to modify such proposed legislation in order to minimise negative impacts. Interestingly, the quantification of certain transportation impacts, a difficult task, proved not to be essential. (orig.)

  5. Le développement durable est-il bienvenu dans les organisations ? Cas de l’implantation d’un Système de Management Environnemental en Tunisie

    Directory of Open Access Journals (Sweden)

    Moez Ben Yedder

    2009-02-01

    Full Text Available Socially responsible firms are those that operate according to a logic of sustainable development. Their commitment materialises notably in the deployment of environmentally friendly management systems such as environmental management systems (EMS). Beyond their ecological scope, these systems also have a social purpose, since they improve the working conditions of operators. However, their introduction can be hindered by those same operators if they do not see their interests served when such a system is put in place. The aim of this paper is to support this idea through the case study of a Tunisian firm in which the introduction of an EMS failed in part because of staff resistance. We defend the idea that sustainable development and corporate social responsibility practices cannot be grafted abruptly onto a firm, but require certain organisational prerequisites in order to be put in place.

  6. Quantification of Liver Proton-Density Fat Fraction in 7.1 Tesla Preclinical MR Systems: Impact of the Fitting Technique

    Science.gov (United States)

    Mahlke, C; Hernando, D; Jahn, C; Cigliano, A; Ittermann, T; Mössler, A; Kromrey, ML; Domaska, G; Reeder, SB; Kühn, JP

    2016-01-01

    Purpose To investigate the feasibility of estimating the proton-density fat fraction (PDFF) using a 7.1 Tesla magnetic resonance imaging (MRI) system and to compare the accuracy of liver fat quantification using different fitting approaches. Materials and Methods Fourteen leptin-deficient ob/ob mice and eight intact controls were examined in a 7.1 Tesla animal scanner using a 3-dimensional six-echo chemical shift-encoded pulse sequence. Confounder-corrected PDFF was calculated using magnitude fitting (magnitude data alone) and combined fitting (complex and magnitude data). Differences between the fitting techniques were compared using Bland-Altman analysis. In addition, the PDFFs derived with both reconstructions were correlated with histopathological fat content and triglyceride mass fraction using linear regression analysis. Results The PDFFs determined with the two reconstructions correlated very strongly (r=0.91). However, a small mean bias between the reconstructions revealed divergent results (3.9%; CI 2.7%-5.1%). For both reconstructions, there was a linear correlation with histopathology (combined fitting: r=0.61; magnitude fitting: r=0.64) and triglyceride content (combined fitting: r=0.79; magnitude fitting: r=0.70). Conclusion Liver fat quantification using the PDFF derived from MRI performed at 7.1 Tesla is feasible. PDFF has strong correlations with histopathologically determined fat and with triglyceride content. However, small differences between PDFF reconstruction techniques may impair the robustness and reliability of the biomarker at 7.1 Tesla. PMID:27197806
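
    Magnitude fitting of the kind compared here estimates water, fat and R2* from the multi-echo magnitude signal. The sketch below is a minimal illustration with a single-peak fat spectrum and invented echo times, not the multi-peak model used in practice; note that magnitude data are ambiguous between fat fractions FF and 1-FF, an ambiguity the water-dominant initialization is assumed to resolve.

```python
import numpy as np
from scipy.optimize import least_squares

TE = np.linspace(1.0e-3, 6.0e-3, 6)  # six echo times (s), illustrative values
F_FAT = -3.4e-6 * 42.58e6 * 7.1      # main fat peak offset (Hz) at 7.1 T;
                                     # single-peak approximation (assumed)

def model(params, te):
    w, f, r2s = params               # water, fat, R2* (magnitude signal model)
    return np.abs((w + f * np.exp(2j * np.pi * F_FAT * te)) * np.exp(-r2s * te))

def fit_pdff(signal):
    """Magnitude-based PDFF fit: returns fat / (water + fat) in %."""
    res = least_squares(lambda p: model(p, TE) - signal,
                        x0=[signal[0], 0.1 * signal[0], 30.0],  # water-dominant
                        bounds=([0, 0, 0], [np.inf, np.inf, 500.0]))
    w, f, _ = res.x
    return 100.0 * f / (w + f)

truth = (0.7, 0.3, 40.0)             # 30% fat fraction, R2* = 40 s^-1
print(fit_pdff(model(truth, TE)))    # should recover ~30.0
```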

  7. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium...... is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding...

  8. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  10. Design and testing of a roller kiln for ceramic tile manufacturing with lower environmental impact and higher performance; Conception et essai d'un four a rouleaux pour carreaux a impact environnemental faible et performances elevees

    Energy Technology Data Exchange (ETDEWEB)

    Reimer, A. [DPD-TNO, Delft (Netherlands); Bresciani, A.; Pifferi, G. [SACMI, Savoyarde de Construction de Materiel Industriel, 73 - Montmelian (France)

    1999-09-01

    Motivated by the need to improve the firing process of ceramic tiles, with regard to both the homogeneity of the heat distribution over the kiln section and the harmful emissions (particularly fluoride), TNO and the Dutch tile industry, in cooperation with SACMI, have developed a new kiln concept. The study has led to the design and manufacture of a prototype roller kiln, which will be started up and tested at the MOSA facilities in Maastricht. Other partners include SPHINX GUSTAVSBERG, GASUNIE, GOUDA VUURVAST and the Dutch government as sponsor. Great attention has been devoted to controlling the temperature distribution inside the kiln as well as to managing the fast firing cycles currently used. Burner power, positions, flows and emission levels have been calculated using the TNO kiln simulation models. New but commercially available technologies have been integrated into the new kiln engineering, so as to improve the firing process, reduce emissions and minimize energy consumption. The main technological solutions applied are: (1) radiant tube burners in the firing zone; (2) new convective burners in the heating zone; (3) convection enhancement in the pre-heating zone, by adopting adequate systems for the recirculation of fumes.

  11. The new energy processes and the new approaches of the combustion. The environmental impact decrease; Nouveaux procedes energetiques et nouvelles approches de la combustion. Reduction de l'impact environnemental

    Energy Technology Data Exchange (ETDEWEB)

    Cabot, G. [CORIA, 76 - Mont Saint Aignan (France); Caillat, S. [Ecole des Mines de Douai, Dept. Energetique, 59 (France); Guillet, R. [Gaz de France, GDF DR, 93 - La Plaine Saint-Denis (France)] [and others

    2001-07-01

    During this day organized by the French society of thermal science (SFT), seven papers were presented. They deal with new combustion processes leading to better air quality for the environment. The first paper concerns wet combustion, an energy-efficient and environmentally friendly technique, its properties, and the DHC (hygrometric diagram of combustion) analysis. Flame mechanisms and the swirl process are presented in a second part, with an analysis of radiant heat transfer and nitrogen oxide emissions. (A.L.B.)

  12. Quantification of physical and economic impacts of climate change on public infrastructure in Alaska and benefits of global greenhouse gas mitigation

    Science.gov (United States)

    Melvin, A. M.; Larsen, P.; Boehlert, B.; Martinich, J.; Neumann, J.; Chinowsky, P.; Schweikert, A.; Strzepek, K.

    2015-12-01

    Climate change poses many risks and challenges for the Arctic and sub-Arctic, including threats to infrastructure. The safety and stability of infrastructure in this region can be impacted by many factors including increased thawing of permafrost soils, reduced coastline protection due to declining arctic sea ice, and changes in inland flooding. The U.S. Environmental Protection Agency (EPA) is coordinating an effort to quantify physical and economic impacts of climate change on public infrastructure across the state of Alaska and estimate how global greenhouse gas (GHG) mitigation may avoid or reduce these impacts. This research builds on the Climate Change Impacts and Risk Analysis (CIRA) project developed for the contiguous U.S., which is described in an EPA report released in June 2015. We are using a multi-model analysis focused primarily on the impacts of changing permafrost, coastal erosion, and inland flooding on a range of infrastructure types, including transportation (e.g. roads, airports), buildings and harbors, energy sources and transmission, sewer and water systems, and others. This analysis considers multiple global GHG emission scenarios ranging from a business as usual future to significant global action. These scenarios drive climate projections through 2100 spanning a range of outcomes to capture variability amongst climate models. Projections are being combined with a recently developed public infrastructure database and integrated into a version of the Infrastructure Planning Support System (IPSS) we are modifying for use in the Arctic and sub-Arctic region. The IPSS tool allows for consideration of both adaptation and reactive responses to climate change. Results of this work will address a gap in our understanding of climate change impacts in Alaska, provide estimates of the physical and economic damages we may expect with and without global GHG mitigation, and produce important insights about infrastructure vulnerabilities in response to

  13. Towards improved quantification of post-fire conifer mortality and recovery: Impacts of fire radiative flux on seedling and mature tree mortality, physiology, and growth

    Science.gov (United States)

    Sparks, A. M.; Kolden, C.; Smith, A. M.

    2016-12-01

    Fire activity, in terms of intensity, frequency, and total area burned, is expected to increase with changing climate. A challenge for landscape-level assessment of fire effects, termed burn severity, is that current assessments provide very little information regarding vegetation physiological performance and recovery, limiting our understanding of fire effects on ecosystem services such as carbon storage/cycling. To address these limitations, we evaluated an alternative dose-response methodology for quantifying fire effects that attempts to bridge fire combustion dynamics and ecophysiology. Specifically, we conducted a highly controlled laboratory assessment of seedling response to increasing doses of fire radiative energy applied through surface fires, for two western U.S. conifer species. Seedling physiology and spectral reflectance were acquired pre-fire and up to 1 year post-fire. Post-fire mortality, physiological performance, and spectral reflectance were strongly related to the fire radiative energy density (FRED: J m-2) dose. To examine how these relationships change with tree size and age, we conducted small prescribed fires at the tree scale (35 m2) in a mature conifer stand. Radial growth and resin duct defenses were assessed on the mature conifer trees following the prescribed fires. Differences in dose-response relationships between seedlings and mature trees indicate the importance of fire behavior (e.g., flaming-dominated versus smoldering-dominated combustion) in characterizing these relationships. Ultimately, these results suggest that post-fire impacts on the growth of surviving seedlings and mature trees require modes of heat transfer that impact tree canopies.
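
    A dose-response relationship of this kind is commonly summarized by fitting mortality probability as a sigmoidal function of FRED. The sketch below does this on synthetic outcomes with a plain least-squares fit, standing in for a proper logistic regression; the LD50, slope and dose range are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_mortality(fred, ld50, k):
    """Mortality probability as a logistic function of fire radiative
    energy density (FRED, J m-2); ld50 is the dose giving 50% mortality."""
    return 1.0 / (1.0 + np.exp(-k * (fred - ld50)))

# Synthetic seedling outcomes (0 = survived, 1 = died), illustrative only.
rng = np.random.default_rng(1)
fred = rng.uniform(0.0, 1.4e6, 200)              # applied doses, J m-2
died = rng.random(200) < logistic_mortality(fred, 0.7e6, 8e-6)

params, _ = curve_fit(logistic_mortality, fred, died.astype(float),
                      p0=[0.5e6, 1e-5])
print(f"fitted LD50 = {params[0]:.3g} J m-2, slope = {params[1]:.3g}")
```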

  14. Quantification of years of life lost attributable to chronic air pollution exposure in a health impact assessment: the case of Nantes

    International Nuclear Information System (INIS)

    Guillois-Becel, Y.; Eilstein, D.; Glorennec, Ph.; Lefranc, A.

    2007-01-01

    Background: When French regional planning for air quality first began, exposure-response functions from time-series studies were used to assess the short-term health impact of urban air pollution. The World Health Organisation also suggests that exposure-response functions from cohort studies be taken into account to evaluate the effects of chronic exposure and to quantify the prematurity of deaths related to chronic exposure to air pollution. This work characterizes the long-term effects of air pollution in Nantes by considering years of life lost as well as the number of attributable deaths. Methods: The study population is classified into birth cohorts. For each cohort, two survival curves are built based on current mortality conditions: the first for current exposure to air pollution and the second for exposure to a lower reference level of air pollution. The area between the two curves represents the years of life lost attributable to urban air pollution. Results: The estimated number of premature deaths due to air pollution is approximately 56, or about 2% of the deaths of those older than 30 years. The health impact on the Nantes population is estimated at 27.2 years of life lost attributable to urban air pollution in 1999 and 2388.1 years of life lost for the 1999-2008 period. This amounts to a decrease of roughly 4 months in the life expectancy of those aged 30 years. Conclusion: This study, which also identifies and discusses relevant errors and uncertainty, confirmed that air pollution in Nantes has significant health effects and that chronic exposure plays an essential role in this impact. The number of years of life lost and the reduction in life expectancy provide new reasons to reject the assumption that health effects are limited to the premature deaths of terminally-ill people. The expected health gains in Nantes associated with reduced although still moderate air pollution levels are on the same scale as, and possibly better than, those found in 9
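
    The years-of-life-lost computation described in the methods reduces to integrating the gap between two survival curves, one built under current pollution and one under the reference level, with the exposed curve obtained by scaling the mortality hazards by a relative risk. A schematic sketch under assumed hazards and an assumed relative risk:

```python
import numpy as np

def survival_curve(annual_hazard):
    """Survival probability at each age from annual mortality hazards."""
    return np.cumprod(1.0 - annual_hazard)

ages = np.arange(30, 101)                        # follow a cohort from age 30
base_hazard = 1e-3 * np.exp(0.08 * (ages - 30))  # assumed Gompertz-like hazards
rr = 1.02                                        # assumed relative risk, current
                                                 # vs reference pollution level
s_exposed = survival_curve(np.clip(base_hazard * rr, 0.0, 1.0))
s_reference = survival_curve(base_hazard)

# Years of life lost per person = area between the two survival curves
# (unit age steps, so a plain sum approximates the integral).
yll_per_person = float(np.sum(s_reference - s_exposed))
print(f"~{yll_per_person * 12:.1f} months of life expectancy lost at age 30")
```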

  15. Numerical investigation and Uncertainty Quantification of the Impact of the geological and geomechanical properties on the seismo-acoustic responses of underground chemical explosions

    Science.gov (United States)

    Ezzedine, S. M.; Pitarka, A.; Vorobiev, O.; Glenn, L.; Antoun, T.

    2017-12-01

    We have performed three-dimensional high-resolution simulations of underground chemical explosions conducted recently in a jointed rock outcrop as part of the Source Physics Experiments (SPE) being conducted at the Nevada National Security Site (NNSS). The main goal of the current study is to investigate the effects of the structural and geomechanical properties on the spall phenomena due to underground chemical explosions and their subsequent effect on the seismo-acoustic signature at far distances. Two parametric studies have been undertaken to assess the impact of 1) different conceptual geological models, including single-layer and two-layer models, with and without joints and with and without varying geomechanical properties, and 2) different depths of burst and explosion yields. Through these investigations we have explored not only the near-field response of the chemical explosions but also the far-field seismic and acoustic signatures. The near-field simulations were conducted using the Eulerian and Lagrangian codes GEODYN and GEODYN-L, respectively, while the far-field seismic simulations were conducted using the elastic wave propagation code WPP, and the acoustic response using the Kirchhoff-Helmholtz-Rayleigh time-dependent approximation code KHR. Through a series of simulations we have recorded the velocity field histories 1) at the ground surface on an acoustic-source patch for the acoustic simulations, and 2) on a seismic-source box for the seismic simulations. We first analyzed the SPE3 experimental data and simulated results, then simulated SPE4-prime, SPE5, and SPE6 to anticipate their seismo-acoustic responses under conditions of uncertainty. The SPE experiments were conducted in a granitic formation; we have extended the parametric study to include other geological settings such as dolomite and alluvial formations. These parametric studies enabled us to 1) investigate the geotechnical and geophysical key parameters

  16. Quantification of Functional Marker Genes for Denitrifying Microbial Populations in the Chandeleur Islands Impacted by the 2010 Gulf of Mexico Oil Spill

    Science.gov (United States)

    Crawford, P.; Flournoy, N.; Taylor, C.; Tatariw, C.; Mortazavi, B.; Sobecky, P.

    2017-12-01

    Barrier island ecosystems provide protection by reducing storm surges, dissipating wave energy, and economically through services such as fisheries, water catchment, and water quality. As these ecosystems are deteriorating and threatened in this century, the services they provide to humans are being valued monetarily to communicate their importance. Events such as the 2010 Gulf of Mexico oil spill act as catalysts that accelerate deterioration and further loss of these vital ecosystem services. The oil spill impacted the Chandeleur Islands, barrier islands in Louisiana waters located forty miles south of Gulfport, MS. The island chain's vegetation, i.e., Avicennia germinans and native Spartina alterniflora, was heavily damaged as a result of the oil spill. As oil was deposited differentially, it was important to investigate the microbiology of oil-impacted areas, since marsh vegetation is directly linked to microbe-driven ecosystem services such as denitrification, a nitrogen (N) cycle pathway. The objectives of this study were: i) to characterize the biodiversity of microorganisms; ii) to quantify denitrifying microbial populations using functional marker genes; and iii) to measure rates of denitrification over a one-year period. The eco-functional marker genes narG, nirS, norB, nosZ, and nrfA were selected to represent denitrification. Three different marsh sites were selected for study based upon estimated amounts of prior oiling. The highest rates of denitrification were observed in September, while the lowest rates were observed in February. The highest nirS abundance was detected at two of the three sites (Sites 1 and 2) in September, while Site 3 exhibited the highest abundance in November. Similarly, the highest abundances observed for norB and nosZ varied by site and by month. Weathered oil was also detected in some of the marsh sediment cores and chemically typed to Macondo oil. Studies such as this one are designed to characterize the barrier island microbial biodiversity and N cycle processes to

  17. Quantification and classification of the main environmental impacts on a Halodule wrightii seagrass meadow on a tropical island in northeastern Brazil

    Directory of Open Access Journals (Sweden)

    Maria Elisa Pitanga

    2012-03-01

    Full Text Available Multiple stress mechanisms have caused a worldwide decline in seagrasses, which are vulnerable to environmental and/or anthropogenic pressure. The loss of seagrass meadows of Halodule wrightii is reported for the littoral of Itamaracá Island (Northeastern Brazil). The present study identified the main anthropogenic factors that negatively influenced the abundance and distribution of seagrass meadows between July and September 2007 at the Jaguaribe and Pilar beaches, on the eastern littoral of Itamaracá. Anthropogenic impact included the discharge of untreated sewage through fluvial channels, urban and commercial development along the coast, the anchoring of motorized and non-motorized boats, diverse fishing techniques and the dumping of solid waste. The data indicate that Pilar is an environment with a higher impact index (71.43%) than Jaguaribe (57.14%), with the number of boats with a central motor, the total number of boats, the presence of shellfish gatherers and coastal urban development standing out. The present study reinforces the need to define management and conservation measures for this ecosystem, which has high ecological and economic value.

  18. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    International Nuclear Information System (INIS)

    Olander, Lydia P; Wollenberg, Eva; Tubiello, Francesco N; Herold, Martin

    2014-01-01

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term. (paper)

  19. Quantification of the impact of multifaceted initiatives intended to improve operational efficiency and the safety culture: a case study from an academic medical center radiation oncology department.

    Science.gov (United States)

    Chera, Bhishamjit S; Mazur, Lukasz; Jackson, Marianne; Taylor, Kinely; Mosaly, Prithima; Chang, Sha; Deschesne, Kathy; LaChapelle, Dana; Hoyle, Lesley; Saponaro, Patricia; Rockwell, John; Adams, Robert; Marks, Lawrence B

    2014-01-01

    We have systematically been incorporating several operational efficiency and safety initiatives into our academic radiation oncology clinic. We herein quantify the impact of these initiatives on prospectively collected, clinically meaningful metrics. The data from 5 quality improvement initiatives, each focused on a specific safety/process concern in our clinic, are presented. Data were collected prospectively: operational metrics recorded before and after implementation of each initiative were compared using statistical analysis. Results from the Agency for Healthcare Research and Quality (AHRQ) patient safety culture surveys administered during and after many of these initiatives were similarly compared. (1) Workload levels for nurses assisting with brachytherapy were high (National Aeronautics and Space Administration Task Load Index (NASA-TLX) scores >55-60, suggesting "overwork"). Changes in work flow and procedure room layout reduced workload to more acceptable levels (NASA-TLX 50% to <10%; P < .01). To assess the overall changes in "patient safety culture," we conducted a pre- and postanalysis using the AHRQ survey. Improvements in all measured dimensions were noted. Quality improvement initiatives can be successfully implemented in an academic radiation oncology department to yield measurable improvements in operations, resulting in improvement in the patient safety culture. Copyright © 2014 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
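
    The NASA-TLX scores cited here are, in the standard instrument, a weighted mean of six subscale ratings (0-100), with weights derived from 15 pairwise comparisons between the dimensions. A minimal sketch of that computation, with invented example ratings:

```python
# NASA-TLX overall workload: weighted mean of six subscale ratings (0-100),
# weights from 15 pairwise comparisons (each dimension chosen 0-5 times).
SUBSCALES = ("mental", "physical", "temporal", "performance",
             "effort", "frustration")

def nasa_tlx(ratings, tally):
    """ratings: subscale -> 0..100; tally: subscale -> times chosen (sums to 15)."""
    assert sum(tally.values()) == 15, "pairwise tally must total 15"
    return sum(ratings[s] * tally[s] for s in SUBSCALES) / 15.0

ratings = {"mental": 75, "physical": 30, "temporal": 80,
           "performance": 40, "effort": 70, "frustration": 60}   # hypothetical
tally = {"mental": 4, "physical": 1, "temporal": 4,
         "performance": 2, "effort": 3, "frustration": 1}
print(nasa_tlx(ratings, tally))  # 66.7 -> above the ~55-60 "overwork" range
```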

  20. Potential overestimation of HPV vaccine impact due to unmasking of non-vaccine types: quantification using a multi-type mathematical model.

    Science.gov (United States)

    Choi, Yoon Hong; Chapman, Ruth; Gay, Nigel; Jit, Mark

    2012-05-14

    Estimates of human papillomavirus (HPV) vaccine impact in clinical trials and modelling studies rely on DNA tests of cytology or biopsy specimens to determine the HPV type responsible for a cervical lesion. DNA of several oncogenic HPV types may be detectable in a specimen. However, only one type may be responsible for a particular cervical lesion. Misattribution of the causal HPV type for a particular abnormality may give rise to an apparent increase in disease due to non-vaccine HPV types following vaccination ("unmasking"). To investigate the existence and magnitude of unmasking, we analysed data from residual cytology and biopsy specimens in English women aged 20-64 years old using a stochastic type-specific individual-based model of HPV infection, progression and disease. The model parameters were calibrated to data on the prevalence of HPV DNA and cytological lesion of different grades, and used to assign causal HPV types to cervical lesions. The difference between the prevalence of all disease due to non-vaccine HPV types, and disease due to non-vaccine HPV types in the absence of vaccine HPV types, was then estimated. There could be an apparent maximum increase of 3-10% in long-term cervical cancer incidence due to non-vaccine HPV types following vaccination. Unmasking may be an important phenomenon in HPV post-vaccination epidemiology, in the same way that has been observed following pneumococcal conjugate vaccination. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Remote Sensing-Based Quantification of the Impact of Flash Flooding on the Rice Production: A Case Study over Northeastern Bangladesh.

    Science.gov (United States)

    Ahmed, M Razu; Rahaman, Khan Rubayet; Kok, Aaron; Hassan, Quazi K

    2017-10-14

    The northeastern region of Bangladesh often experiences flash flooding during the pre-harvest period of the boro rice crop, the major cereal crop in the country. In this study, our objective was to delineate the impact of the 2017 flash flood (which began on 27 March 2017) on boro rice using multi-temporal Landsat-8 OLI and MODIS data. Initially, we opted to use Landsat-8 OLI data for mapping the damage; however, during and after the flooding event the acquisition of cloud-free images was challenging. We therefore used these data to map the cultivated boro rice acreage over the planting to mature stages of the crop. In order to map the extent of the damaged boro area, we utilized MODIS data, as their 16-day composites provided cloud-free information. Our results indicated that both the cultivated and damaged boro area estimates based on satellite data had strong relationships with the ground-based estimates (i.e., r² values of approximately 0.92 in both cases, and RMSE of 18,374 and 9380 ha for cultivated and damaged areas, respectively). Finally, we believe that our study will be critical for planning and ensuring food security for the country.

  2. The use of direct analysis in real time (DART) to assess the levels of inhibitors co-extracted with DNA and the associated impact in quantification and amplification.

    Science.gov (United States)

    Moreno, Lilliana I; McCord, Bruce R

    2016-10-01

    The measure of quality in DNA sample processing starts with an effective nucleic acid isolation procedure. Most problems with DNA sample typing can be attributed to low-quantity DNA and/or to the presence of inhibitors in the sample. Therefore, establishing which isolation method is best at removing potential inhibitors may help overcome some of the problems analysts encounter, by providing useful information for determining the optimal approach for any given sample. Direct analysis in real time (DART) mass spectrometry was used in this study to investigate the ability of different extraction methods to remove PCR inhibitors. Methods investigated included both liquid/liquid (phenol-chloroform) and solid-phase-based robotic procedures (PrepFiler™ and EZ1 chemistries). Following extraction, samples were analyzed by DART in order to determine the level of remaining inhibitors, and then quantified and amplified to determine the effect any remaining inhibitor had on the overall results. The data suggest that organic extraction methods result in detrimental amounts of phenol carryover, while automated methods may produce carry-over of bile salts and other chemicals that preferentially bind the solid-phase matrix. Both of these effects can have a negative impact on downstream sample processing and genotyping by PCR. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Green-blue water in the city: quantification of impact of source control versus end-of-pipe solutions on sewer and river floods.

    Science.gov (United States)

    De Vleeschauwer, K; Weustenraad, J; Nolf, C; Wolfs, V; De Meulder, B; Shannon, K; Willems, P

    2014-01-01

    Urbanization and climate change trends put strong pressures on urban water systems. Temporal variations in rainfall, runoff and water availability increase, and need to be compensated for by innovative adaptation strategies. One of these is stormwater retention and infiltration in open and/or green spaces in the city (blue-green water integration). This study evaluated the efficiency of three adaptation strategies for the city of Turnhout in Belgium, namely source control as a result of blue-green water integration, retention basins located downstream of the stormwater sewers, and end-of-pipe solutions based on river flood control reservoirs. The efficiency of these options is quantified by the reduction in sewer and river flood frequencies and volumes, and sewer overflow volumes. This is done by means of long-term simulations (100-year rainfall simulations) using an integrated conceptual sewer-river model calibrated to full hydrodynamic sewer and river models. Results show that combining open, green zones in the city with stormwater retention and infiltration for only 1% of the total city runoff area would lead to a 30 to 50% reduction in sewer flood volumes for return periods in the range 10-100 years. This is due to the additional surface storage and infiltration and the consequent reduction in urban runoff. However, the impact of this source control option on downstream river floods is limited. Stormwater retention downstream of the sewer system gives a strong reduction in peak discharges to the receiving river. However, due to the difference in response time between the sewer and river systems, this does not lead to a strong reduction in river flood frequency. The paper shows the importance of improving the interface between urban design and water management, and between sewer and river flood management.

  4. Quantification and Mitigation of Long-Term Impacts of Urbanization and Climate Change in the Tropical Coastal City of San Juan, Puerto Rico

    Science.gov (United States)

    Comarazamy, Daniel; Gonzalez, Jorge E.; Luvall, Jeffrey C.

    2014-01-01

    Urbanization, along with other forms of land cover and land use change, has significant climate impacts in tropical regions, with the added complexity of occurring within the context of global warming. The individual and combined effects of these two factors on the surface energy balance of a tropical city are investigated by use of an integrated atmospheric modeling approach, taking the San Juan Metropolitan Area (SJMA), Puerto Rico as the test case. To achieve this goal, an ensemble of climate and weather simulations is performed, with the climate scenarios combining urban development and sprawl with regional climate change over the past 50 years, and the short-term simulations designed to test the sensitivity to different urban vegetation configurations as mitigating alternatives. As indicators of change, we use the thermal response number (TRN), which is a measure of the sensible heating relative to the thermal storage of a surface or region, and the Bowen ratio, which is defined as the ratio of sensible to latent heat fluxes. The TRN of the area occupied by the SJMA has decreased as a consequence of replacing the lowland coastal plain vegetation with man-made materials, indicating that it takes less energy to raise the surface temperature of the urban area, whereas the TRN of forested regions has remained virtually unchanged. The global warming signal also has effects on the thermal response of the SJMA, where drier current conditions generate lower TRN values. Differences due to global warming are more evident in the Bowen ratio pattern, mostly associated with the drier present conditions observed and their effects on sensible and latent heat fluxes. In terms of testing different mitigation strategies, the short-term simulations show that the urban area is more efficient in partitioning surface energy balance terms when green roofs are specified, as opposed to including vegetation inside the urban core.
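
    Both indicators used here are simple ratios. The sketch below encodes the Bowen ratio directly from its definition in the abstract, and a commonly used form of the TRN (radiative energy input per degree of surface temperature change); the surface flux numbers are invented for illustration.

```python
def bowen_ratio(sensible_w_m2, latent_w_m2):
    """Bowen ratio: sensible over latent heat flux."""
    return sensible_w_m2 / latent_w_m2

def thermal_response_number(net_radiation_j_m2, delta_surface_temp_k):
    """TRN as commonly defined: radiative energy input per degree of surface
    temperature change (J m-2 K-1); lower TRN = surface heats more easily."""
    return net_radiation_j_m2 / delta_surface_temp_k

# Illustrative midday hour: urban vs forested surface (hypothetical numbers).
print(bowen_ratio(250.0, 100.0))            # urban: 2.5 (sensible-dominated)
print(bowen_ratio(120.0, 300.0))            # forest: 0.4 (evaporation-dominated)
print(thermal_response_number(1.8e6, 6.0))  # urban: 3.0e5 J m-2 K-1
print(thermal_response_number(1.8e6, 2.0))  # forest: 9.0e5, more thermal inertia
```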

  5. Quantification of anthropogenic impact on groundwater-dependent terrestrial ecosystem using geochemical and isotope tools combined with 3-D flow and transport modelling

    Science.gov (United States)

    Zurek, A. J.; Witczak, S.; Dulinski, M.; Wachniew, P.; Rozanski, K.; Kania, J.; Postawa, A.; Karczewski, J.; Moscicki, W. J.

    2015-02-01

    Groundwater-dependent ecosystems (GDEs) have important functions in all climatic zones, as they contribute to biological and landscape diversity and provide important economic and social services. Steadily growing anthropogenic pressure on groundwater resources creates conflict between nature and man, which compete for clean and safe sources of water. Such conflicts are particularly noticeable in GDEs located in densely populated regions. A dedicated study was launched in 2010 with the main aim of better understanding the functioning of a groundwater-dependent terrestrial ecosystem (GDTE) located in southern Poland. The GDTE consists of a valuable forest stand (Niepolomice Forest) and an associated wetland (Wielkie Błoto fen). It relies mostly on groundwater from the shallow Quaternary aquifer and possibly from the deeper Neogene (Bogucice Sands) aquifer. In July 2009 a cluster of new pumping wells abstracting water from the Neogene aquifer was set up 1 km from the northern border of the fen. A conceptual model of the Wielkie Błoto fen area was developed for the natural, pre-exploitation state and for the envisaged future status resulting from intense abstraction of groundwater through the new well field. The main aim of the reported study was to probe the validity of the conceptual model and to quantify the expected anthropogenic impact on the studied GDTE. A wide range of research tools was used. The results obtained through combined geologic, geophysical, geochemical, hydrometric and isotope investigations provide strong evidence for the existence of upward seepage of groundwater from the deeper Neogene aquifer to the shallow Quaternary aquifer supporting the studied GDTE. Simulations of the groundwater flow field in the study area with the aid of a 3-D flow and transport model developed for the Bogucice Sands (Neogene) aquifer and calibrated using environmental tracer data and observations of hydraulic head at three different locations in the study area

  6. Quantification of climatic feedbacks on the Caspian Sea level variability and impacts from the Caspian Sea on the large-scale atmospheric circulation

    Science.gov (United States)

    Arpe, Klaus; Tsuang, Ben-Jei; Tseng, Yu-Heng; Liu, Xin-Yu; Leroy, Suzanne A. G.

    2018-05-01

    the water budget of the whole catchment area due to feedbacks with the precipitation. This suggests a high proportion of recycling of water within the CS catchment area. When using a model which does not have a correct CS size, the effect of a reduced CS area on the water budget for the whole CS catchment can be estimated by taking the evaporation over the sea multiplied by the proportional change in area. However, only 50% of that change ends up in the water balance of the total catchment of the CS. A formula is provided. This method has been applied to estimate the CSL during the Last Glacial Maximum to be at -30 to -33 m. The experiments also show that the CS has an impact on the large-scale atmospheric circulation, with a widened Aleutian 500 hPa height field trough for increasing CS sizes. It is possible to validate this aspect with observational data.
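
    The rule of thumb stated above can be written down directly: the change in the whole-catchment water balance is roughly half of the evaporation over the sea, scaled by the fractional area change. A one-function sketch with an assumed evaporation figure:

```python
def catchment_balance_change(evap_over_sea_km3, area_change_frac,
                             recycling_factor=0.5):
    """Change in the whole-catchment water balance (km3/yr) from a changed
    Caspian Sea area, per the abstract's rule: evaporation over the sea times
    the proportional area change, of which ~50% reaches the catchment balance."""
    return recycling_factor * evap_over_sea_km3 * area_change_frac

# Illustrative: ~380 km3/yr evaporation over the sea (assumed value),
# with the sea area reduced by 20%.
print(catchment_balance_change(380.0, -0.20))  # ~ -38 km3/yr
```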

  7. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In the PSA, accident sequence quantification calculates the core damage frequency and performs importance analysis and uncertainty analysis. Accident sequence quantification requires an understanding of the whole PSA model, because it has to combine all event tree and fault tree models, and it requires an efficient computer code because of the long computation times involved. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the method for using KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs
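
    At its core, accident sequence quantification evaluates the minimal cut sets produced by combining the event tree and fault tree models. The sketch below shows the usual point estimate via the min cut set upper bound and a Fussell-Vesely-style importance measure; the cut sets and probabilities are invented, and this is a schematic of the method, not KIRAP's code.

```python
from functools import reduce

# Minimal cut sets for one accident sequence (hypothetical basic events).
cut_sets = [{"DG_A_FAILS", "DG_B_FAILS"},
            {"PUMP_FAILS", "VALVE_STUCK"},
            {"OPERATOR_ERROR"}]
prob = {"DG_A_FAILS": 2e-2, "DG_B_FAILS": 2e-2,
        "PUMP_FAILS": 1e-3, "VALVE_STUCK": 5e-3, "OPERATOR_ERROR": 1e-4}

def cut_set_prob(cs):
    """Product of basic-event probabilities in one cut set."""
    return reduce(lambda acc, e: acc * prob[e], cs, 1.0)

def mcub(css):
    """Min cut set upper bound: 1 - prod(1 - P(cs_i))."""
    q = 1.0
    for cs in css:
        q *= 1.0 - cut_set_prob(cs)
    return 1.0 - q

seq_freq = mcub(cut_sets)
fv = {e: mcub([cs for cs in cut_sets if e in cs]) / seq_freq for e in prob}
print(f"sequence frequency ~ {seq_freq:.2e}")
print("Fussell-Vesely-style importance:",
      {e: round(v, 3) for e, v in fv.items()})
```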

  9. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011). 2. Agriculture and climate change mitigation The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to

  10. ENVIRONMENTAL IMPACT ASSESSMENT: Assessment of the potential for DNA transfer from transgenic plants to soil bacteria; EVALUATION DE L’IMPACT ENVIRONNEMENTAL : Evaluation des potentialités de transfert de l’ADN des plantes transgéniques vers les bactéries du sol

    Directory of Open Access Journals (Sweden)

    Simonet Pascal

    2000-07-01

    Full Text Available Bacteria are characterized by multiple evolutionary strategies, including point mutations, endogenous rearrangements through the movement of insertion sequences or transposons, deletions or amplifications of large DNA regions, and the acquisition of new genes by horizontal transfer of genetic information. The recent, and very first, analyses of the few completely sequenced bacterial genomes indicate the truly fundamental role that lateral gene transfer has played in bacterial evolution [1-3]. Much work is now being conducted to elucidate the role of the three transfer mechanisms (conjugation, transformation and transduction) in the evolution and adaptation of bacteria to the changing conditions of their environment. In particular, the worrying increase in the proportion of antibiotic-resistant germs may result from the dispersal of plasmids carrying resistance genes among the pathogenic microflora. In these circumstances, the question of whether transgenic plants, which very often carry such genes, could aggravate the dissemination of elements potentially threatening human health is entirely justified. Throughout evolution, however, nature has established a number of molecular filters to limit gene flow, notably between phylogenetically very distant organisms. The distinctive feature of transgenic plants is that some of the transgene's genes, in particular those conferring antibiotic resistance, are of prokaryotic origin, which may allow them to bypass these molecular barriers. However, other environmental factors, biotic and abiotic, will act to favour or, on the contrary, limit gene exchange between plants and microorganisms. Of the three mechanisms cited above, only genetic transformation can ensure DNA transfer between plants and bacteria [4]. Transformation is characterized by the uptake of extracellular DNA by a bacterial cell in a state of competence. Under natural conditions, such a process therefore requires, successively, the release of DNA by the plant and the persistence of these molecules in a hostile environment until they contact bacterial cells that are not only genetically equipped for transformation but also physiologically in a competent state. Finally, DNA that has entered a bacterial cell will persist there only if the sequences it carries and the mechanisms of the host cell allow it to integrate into the genome or, possibly, to replicate autonomously.

  11. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    Full Text Available Verb aspect, alternations and quantification. In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structures (reflecting the categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases also be interpreted as lexical quantifiers. Thus Bulgarian verb aspect is related, in different ways, both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that this scope remains constant in all derived alternations. The paper addresses the basic issues of these complex problems, while a detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  12. Quantification of informed opinion

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1985-01-01

    The objective of this session, Quantification of Informed Opinion, is to provide the statistician with a better understanding of this important area. The NRC uses informed opinion, sometimes called engineering judgment or subjective judgment, in many areas. Sometimes informed opinion is the only source of information that exists, especially in phenomenological areas, such as steam explosions, where experiments are costly and phenomena are very difficult to measure. There are many degrees of informed opinion. These vary from the weatherman, who makes predictions concerning relatively high-probability events with a large data base, to the phenomenological expert, who must use his intuition, tempered with basic knowledge and little or no measured data, to predict the behavior of events with a low probability of occurrence. The first paper in this session provides the reader with an overview of the subject area. The second paper presents some aspects that must be considered in the collection of informed opinion to improve the quality of the information. The final paper contains an example of the use of informed opinion in the area of seismic hazard characterization. These papers should be useful to researchers and statisticians who need to collect and use informed opinion in their work.

  13. Quantification In Neurology

    Directory of Open Access Journals (Sweden)

    Netravati M

    2005-01-01

    Full Text Available There has been a distinct shift of emphasis in clinical neurology in the last few decades. A few years ago, it was sufficient for a clinician to precisely record history, document signs, establish a diagnosis and write a prescription. In the present context, there has been a significant intrusion of scientific culture into clinical practice. Several criteria have been proposed, refined and redefined to ascertain accurate diagnoses for many neurological disorders. The introduction of the concepts of impairment, disability, handicap and quality of life has added a new dimension to the measurement of health and disease, and neurological disorders are no exception. "Best guess" treatment modalities are no longer accepted, and evidence-based medicine has become an integral component of medical care. Traditional treatments need validation and new therapies require rigorous trials. Thus, proper quantification has become essential, both in practice and in research methodology in neurology. While this aspect is widely acknowledged, access to a comprehensive document pertaining to measurements in neurology remains limited. The following description is a critical appraisal of various measurements and also presents some commonly used rating scales/scores in neurological practice.

  14. The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification

    Science.gov (United States)

    Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)

    2003-01-01

    We consider the impact of Aspect-Oriented Programming on Software Engineering and, in particular, analyze two AOP systems for their software engineering effects: one performs component wrapping, the other quantification over events.

  15. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
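
As a concrete illustration of the simplest of the approaches listed above, the sketch below propagates two poorly known parameters through a toy first-order decay model by Monte Carlo sampling. The model and the parameter distributions are invented for illustration and stand in for a real environmental model.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(k, theta):
    # Toy first-order decay: concentration remaining after 10 days.
    return 100.0 * np.exp(-k * 10.0) * theta

# Poorly known parameters expressed as prior distributions (both invented).
k = rng.lognormal(mean=np.log(0.2), sigma=0.3, size=10_000)   # decay rate
theta = rng.uniform(0.8, 1.2, size=10_000)                    # source factor

y = model(k, theta)
print(f"mean = {y.mean():.2f}, 95% interval = "
      f"({np.percentile(y, 2.5):.2f}, {np.percentile(y, 97.5):.2f})")
```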

  16. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating the overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
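
The combination step can be illustrated generically: for independent error sources, standard uncertainties are combined in quadrature, as in first-order Taylor propagation. The sketch below uses invented numbers and shows the general principle, not the paper's specific propagation equation.

```python
import math

# Illustrative standard uncertainties for one velocity component (invented).
sigma_planar = 0.08   # propagated planar (per-camera) uncertainty
sigma_calib = 0.05    # contribution from calibration/registration error

# First-order propagation for independent sources: combine in quadrature.
sigma_total = math.sqrt(sigma_planar**2 + sigma_calib**2)
print(f"combined uncertainty: {sigma_total:.3f} (same units as inputs)")
```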

  17. A "Toy" Model for Operational Risk Quantification using Credibility Theory

    OpenAIRE

    Hans Bühlmann; Pavel V. Shevchenko; Mario V. Wüthrich

    2009-01-01

    To meet the Basel II regulatory requirements for the Advanced Measurement Approaches in operational risk, the bank's internal model should make use of internal data, relevant external data, scenario analysis and factors reflecting the business environment and internal control systems. One of the unresolved challenges in operational risk is how to combine these data sources appropriately. In this paper we focus on quantification of the low frequency high impact losses exceeding some high thr...

  18. Leishmania parasite detection and quantification using PCR-ELISA

    Czech Academy of Sciences Publication Activity Database

    Kobets, Tetyana; Badalová, Jana; Grekov, Igor; Havelková, Helena; Lipoldová, Marie

    2010-01-01

    Roč. 5, č. 6 (2010), s. 1074-1080 ISSN 1754-2189 R&D Projects: GA ČR GA310/08/1697; GA MŠk(CZ) LC06009 Institutional research plan: CEZ:AV0Z50520514 Keywords : polymerase chain reaction * Leishmania major infection * parasite quantification Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 8.362, year: 2010

  19. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    Directory of Open Access Journals (Sweden)

    Carla B. Roces

    2016-09-01

    Full Text Available Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and ᴅ-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, direct linearity, and good consistency of the responses (R² > 0.993) for the four lipids tested. The corresponding limits of detection (LOD) and limits of quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.
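
The LOD and LOQ figures quoted above are conventionally derived from a calibration curve. The sketch below shows the standard ICH-style estimates (LOD = 3.3σ/S, LOQ = 10σ/S, with S the slope and σ the residual standard deviation) on an invented linear calibration data set; it is a generic illustration, not the authors' exact procedure.

```python
import numpy as np

# Invented linear calibration data: concentration (mg/mL) vs. peak area.
conc = np.array([0.05, 0.1, 0.25, 0.5, 1.0])
area = np.array([0.9, 2.1, 5.3, 10.2, 20.5])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)   # residual standard deviation of the fit

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD ~ {lod:.3f} mg/mL, LOQ ~ {loq:.3f} mg/mL")
```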

  20. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  1. Pitfalls in the analysis of volatile breath biomarkers: suggested solutions and SIFT-MS quantification of single metabolites

    Czech Academy of Sciences Publication Activity Database

    Smith, D.; Španěl, Patrik

    2015-01-01

    Roč. 9, č. 2 (2015), 022001 ISSN 1752-7155 Institutional support: RVO:61388955 Keywords : SIFT-MS * volatile biomarkers * quantifications Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 4.177, year: 2015

  2. CT quantification of pleuropulmonary lesions in severe thoracic trauma

    International Nuclear Information System (INIS)

    Kunisch-Hoppe, M.; Bachmann, G.; Weimar, B.; Bauer, T.; Rau, W.S.; Hoppe, M.; Zickmann, B.

    1997-01-01

    Purpose: Computed quantification of the extent of pleuropulmonary trauma by CT and comparison with conventional chest X-ray - impact on therapy and correlation with mechanical ventilation support and clinical outcome. Method: In a prospective trial, 50 patients with clinically suspected blunt chest trauma were evaluated using CT and conventional chest X-ray. The computed quantification of ventilated lung provided by CT volumetry was correlated with the subsequent artificial respiration parameters and the clinical outcome. Results: We found a high correlation between CT volumetry and artificial ventilation with respect to maximal pressures and inspiratory oxygen concentration (FiO2, Goris score) (r=0.89, Pearson). The graduation of thoracic trauma correlated highly with the duration of mechanical ventilation (r=0.98, Pearson). CT is superior to conventional chest X-ray, especially with regard to atelectases and lung contusions: only 32% and 43%, respectively, were identified by conventional chest X-ray. (orig./AJ) [de]

  3. The application of the ISO 14001 environmental management system to small hydropower plants; L'application de l'ISO 14001 systeme environnemental de gestion aux petites centrales hydro-electriques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    The ISO 14000 environmental management standards exist to help organisations minimise how their operations negatively affect the environment and to comply with applicable laws and regulations. More specifically, ISO 14001 is the international specification for an environmental management system (EMS). It specifies requirements for establishing an environmental policy, determining environmental aspects and impacts of products/activities/services, planning environmental objectives and measurable targets, implementation and operation of programs to meet objectives and targets, checking and corrective action, and management review. The overall idea is to establish an organized approach to systematically reduce the impact of the environmental aspects that an organization can control. Tools are available for the analysis of environmental aspects and for the generation of options for improvement. As with ISO 9000 (quality management), certification is performed by third-party organizations. Hydroelectricity enables the generation of clean energy with no direct emissions of greenhouse gases and without consuming fossil fuels. However, this activity takes place within a sensitive natural environment: watercourses are shared with several users, such as fishermen, kayakers, farmers and industry. Generating hydroelectricity produces very few discharges into the environment. Conversely, its implementation on watercourses can alter flow rates and the ecosystem: disrupting the free passage of fish, changing the hydrodynamics of a watercourse, emitting noise, producing waste, causing pollution through oil leaks, damaging the landscape, etc. These environmental impacts are the subject of several monitoring and control operations designed to limit them and to preserve the natural environment. Additionally, relations with the water users and the administrations are sometimes difficult and this would require a dialogue to be established to

  4. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than...... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA...... standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use...
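
For context, kit-based qPCR quantification of this kind rests on a standard curve in which Ct is linear in log10(concentration), so a biased calibration standard shifts every reported result. The sketch below shows the generic conversion with invented values; it is not the Quantifiler kit's internal algorithm.

```python
import numpy as np

# Invented standard dilution series: concentration (ng/uL) vs. measured Ct.
std_conc = np.array([50.0, 5.0, 0.5, 0.05])
std_ct = np.array([22.1, 25.5, 28.9, 32.3])

# Ct is linear in log10(concentration): Ct = m*log10(c) + b.
m, b = np.polyfit(np.log10(std_conc), std_ct, 1)
efficiency = 10 ** (-1.0 / m) - 1.0   # ~1.0 means 100% amplification efficiency

def quantify(ct):
    """Convert an unknown's Ct back to a concentration via the curve."""
    return 10 ** ((ct - b) / m)

print(f"slope = {m:.2f}, efficiency = {efficiency:.1%}, "
      f"Ct 27.0 -> {quantify(27.0):.2f} ng/uL")
```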

  5. [¹⁸F]Altanserin and small animal PET: impact of multidrug efflux transporters on ligand brain uptake and subsequent quantification of 5-HT₂A receptor densities in the rat brain.

    Science.gov (United States)

    Kroll, Tina; Elmenhorst, David; Matusch, Andreas; Celik, A Avdo; Wedekind, Franziska; Weisshaupt, Angela; Beer, Simone; Bauer, Andreas

    2014-01-01

    The selective 5-hydroxytryptamine type 2a receptor (5-HT(2A)R) radiotracer [(18)F]altanserin is a promising ligand for in vivo brain imaging in rodents. However, [(18)F]altanserin is a substrate of P-glycoprotein (P-gp) in rats. Its applicability might therefore be constrained by both a differential expression of P-gp under pathological conditions, e.g. epilepsy, and its relatively low cerebral uptake. The aim of the present study was therefore twofold: (i) to investigate whether inhibition of multidrug transporters (MDT) is suitable to enhance the cerebral uptake of [(18)F]altanserin in vivo and (ii) to test different pharmacokinetic, particularly reference tissue-based, models for the exact quantification of 5-HT(2A)R densities in the rat brain. Eighteen Sprague-Dawley rats, treated either with the MDT inhibitor cyclosporine A (CsA, 50 mg/kg, n=8) or vehicle (n=10), underwent 180-min PET scans with arterial blood sampling. Kinetic analyses of tissue time-activity curves (TACs) were performed to validate invasive and non-invasive pharmacokinetic models. CsA application led to a two- to threefold increase of [(18)F]altanserin uptake in different brain regions and showed a trend toward higher binding potentials (BP(ND)) of the radioligand. MDT inhibition led to an increased cerebral uptake of [(18)F]altanserin but did not improve the reliability of BP(ND) as a non-invasive estimate of 5-HT(2A)R. This finding is most probably caused by the heterogeneous distribution of P-gp in the rat brain and its incomplete blockade in the reference region (cerebellum). Differential MDT expression in experimental animal models or pathological conditions is therefore likely to influence the applicability of imaging protocols and has to be carefully evaluated. © 2013.

  6. Impacts

    NARCIS (Netherlands)

    Hellmuth, M.; Kabat, P.

    2003-01-01

    Even without the impacts of climate change, water managers face prodigious challenges in meeting sustainable development goals. Growing populations need affordable food, water and energy. Industrial development demands a growing share of water resources and contaminates those same resources with its

  7. Voltammetric Quantification of Paraquat and Glyphosate in Surface Waters

    Directory of Open Access Journals (Sweden)

    William Roberto Alza-Camacho

    2016-09-01

    Full Text Available The indiscriminate use of pesticides on crops has a negative environmental impact that affects organisms, soil and water resources essential for life. Therefore, it is necessary to evaluate the residual effect of these substances in water sources. A simple, affordable and accessible electrochemical method for Paraquat and Glyphosate quantification in water was developed. The study was conducted using Britton-Robinson buffer solution as the supporting electrolyte, a glassy carbon working electrode, Ag/AgCl as the reference electrode, and platinum as the auxiliary electrode. The differential pulse voltammetry (VDP) method was validated for both compounds. The linearity of the methods presented correlation coefficients of 0.9949 and 0.9919, and the limits of detection and quantification were 130 and 190 mg/L for Paraquat and 40 and 50 mg/L for Glyphosate. Comparison with the reference method showed that the electrochemical method provides superior results in the quantification of these analytes. In the samples tested, Paraquat concentrations were between 0.011 and 1.572 mg/L and Glyphosate concentrations between 0.201 and 2.777 mg/L, indicating that these compounds are present in water sources and may be causing serious problems to human health.

  8. Nuclear and mitochondrial DNA quantification of various forensic materials.

    Science.gov (United States)

    Andréasson, H; Nilsson, M; Budowle, B; Lundberg, H; Allen, M

    2006-12-01

    Due to the different types and quality of forensic evidence materials, their DNA content can vary substantially, and particularly low quantities can impact the results in an identification analysis. In this study, the quantity of mitochondrial and nuclear DNA was determined in a variety of materials using a previously described real-time PCR method. DNA quantification in the roots and distal sections of plucked and shed head hairs revealed large variations in DNA content particularly between the root and the shaft of plucked hairs. Also large intra- and inter-individual variations were found among hairs. In addition, DNA content was estimated in samples collected from fingerprints and accessories. The quantification of DNA on various items also displayed large variations, with some materials containing large amounts of nuclear DNA while no detectable nuclear DNA and only limited amounts of mitochondrial DNA were seen in others. Using this sensitive real-time PCR quantification assay, a better understanding was obtained regarding DNA content and variation in commonly analysed forensic evidence materials and this may guide the forensic scientist as to the best molecular biology approach for analysing various forensic evidence materials.

  9. Quantification bias caused by plasmid DNA conformation in quantitative real-time PCR assay.

    Science.gov (United States)

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification.
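
For context, plasmid standards of the kind discussed here are usually assigned copy numbers from their mass and length. The worked example below uses the standard conversion (about 650 g/mol per base pair of double-stranded DNA) with a hypothetical plasmid size.

```python
AVOGADRO = 6.022e23
BP_WEIGHT = 650.0      # average g/mol per base pair of double-stranded DNA

plasmid_bp = 3956      # hypothetical vector + insert length
mass_g = 1e-9          # 1 ng of plasmid standard

moles = mass_g / (plasmid_bp * BP_WEIGHT)
copies = moles * AVOGADRO
print(f"{copies:.2e} copies per ng")   # ~2.3e8 copies for this size
```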

  10. IMPACTS !

    CERN Multimedia

    2008-01-01

    (Photo courtesy of Don Davis / NASA) The University of Geneva (UNIGE) and the Ecole Polytechnique Fédérale de Lausanne (EPFL) are organising the 4th series of public lectures on astronomy, on the theme of "Impacts". The schedule is as follows: Il y a 100 ans : une explosion dans la Tunguska – Dr. Frédéric COURBIN, EPFL Les impacts sur Terre – Prof. Didier Queloz, UNIGE La fin des dinosaures – Dr. Stéphane Paltani, UNIGE Wednesday 7 May 2008, from 7.00 p.m. to 9.00 p.m. Auditoire CO1, EPFL, Ecublens Thursday 8 May 2008, from 7.00 p.m. to 9.00 p.m. Auditoire Rouiller, Uni-Dufour, Genève All 3 lectures will be given each evening! Admission free. Information: 022 379 22 00

  11. A refined methodology for modeling volume quantification performance in CT

    Science.gov (United States)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR, which is affected by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.

  12. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

    Optical molecular luminescence imaging is widely used for the detection and imaging of bio-photons emitted by luminescent luciferase activation. The photons measured in this method indicate the degree of molecular alteration or the cell number, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method that yields a linear response of the measured light signal to measurement time. We detected the luminescence signal using lab-made optical imaging equipment, the animal light imaging system (ALIS), and two different kinds of light sources. One consists of three bacterial light-emitting sources containing different numbers of bacteria; the other consists of three different non-bacterial sources emitting very weak light. Using the concepts of the candela and the flux, we derived a simplified linear quantification formula. After experimentally measuring the light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to the measurement time presents a constant value even when different light sources are applied. The quantification function for linear response could be applied to a standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment, exhibiting a linear response of constant light-emitting sources to measurement time.
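
The linear-response idea can be illustrated directly: for a constant source, cumulative photon counts grow linearly with measurement time, so the counts/time ratio estimates a constant flux. The sketch below fits synthetic data and is not the authors' ALIS processing code.

```python
import numpy as np

# Synthetic data: cumulative counts at increasing measurement times.
t = np.array([10.0, 20.0, 40.0, 80.0])                  # seconds
counts = np.array([1020.0, 2110.0, 4080.0, 8150.0])     # detected photons

flux, offset = np.polyfit(t, counts, 1)   # linear fit: counts = flux*t + offset
print(f"estimated flux ~ {flux:.1f} counts/s (offset {offset:.1f})")
print("counts/time ratios:", counts / t)  # approximately constant
```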

  13. Uncertainty quantification in ion–solid interaction simulations

    Energy Technology Data Exchange (ETDEWEB)

    Preuss, R., E-mail: preuss@ipp.mpg.de; Toussaint, U. von

    2017-02-15

    Within the framework of Bayesian uncertainty quantification we propose a non-intrusive reduced-order spectral approach (polynomial chaos expansion) to the simulation of ion–solid interactions. The method not only reduces the number of function evaluations but provides simultaneously a quantitative measure for which combinations of inputs have the most important impact on the result. It is applied to SDTRIM-simulations (Möller et al., 1988) with several uncertain and Gaussian distributed input parameters (i.e. angle, projectile energy, surface binding energy, target composition) and the results are compared to full-grid based approaches and sampling based methods with respect to reliability, efficiency and scalability.
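
The sketch below illustrates a non-intrusive polynomial chaos expansion for a single Gaussian input: sample the input, evaluate a simulation, fit probabilists' Hermite coefficients by least squares, and read mean and variance off the coefficients. The response function is an invented stand-in, not an SDTRIM calculation.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

def simulation(energy):
    # Stand-in smooth response, e.g. a yield vs. projectile energy (invented).
    return 1e-3 * energy + 5e-7 * energy**2

xi = rng.standard_normal(200)        # standardized Gaussian input samples
energy = 500.0 + 50.0 * xi           # physical input: mean 500 eV, sd 50 eV
y = simulation(energy)

# Fit PCE coefficients by least squares on the probabilists' Hermite basis.
V = hermevander(xi, deg=3)
coeffs, *_ = np.linalg.lstsq(V, y, rcond=None)

# Mean and variance follow from the coefficients: Var = sum_k c_k^2 * k!.
mean = coeffs[0]
var = sum(coeffs[k] ** 2 * math.factorial(k) for k in range(1, 4))
print(f"PCE mean = {mean:.4f}, std = {math.sqrt(var):.4f}")
```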

  14. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  15. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  16. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  17. Quantification of virus syndrome in chili peppers

    African Journals Online (AJOL)

    Jane

    2011-06-15

    Jun 15, 2011 ... alternative for the quantification of the disease's syndromes in regards to this crop. The result of these ...parison of treatments such as cultivars or control measures and ... Vascular discoloration and stem necrosis. 2.

  18. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost
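
The approach referenced here estimates the diagonal of a matrix inverse stochastically, from probing vectors and linear solves rather than a full factorization. The sketch below shows the basic estimator on a small synthetic SPD matrix; a production code would replace the dense solve with an iterative solver.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)          # SPD stand-in for a covariance matrix

num = np.zeros(n)
den = np.zeros(n)
for _ in range(200):
    v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probing vector
    x = np.linalg.solve(A, v)             # in practice: an iterative solver
    num += v * x
    den += v * v

diag_est = num / den                      # estimate of diag(A^{-1})
diag_true = np.diag(np.linalg.inv(A))
print("max relative error:", np.max(np.abs(diag_est - diag_true) / diag_true))
```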

  19. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. Localized quantification of anhydrous calcium carbonate polymorphs using micro-Raman spectroscopy

    Czech Academy of Sciences Publication Activity Database

    Ševčík, Radek; Mácová, Petra

    2018-01-01

    Roč. 95, March (2018), s. 1-6 ISSN 0924-2031 R&D Projects: GA MŠk(CZ) LO1219 Keywords : micro-Raman * quantification * calcite * vaterite * aragonite * nanolime Subject RIV: CA - Inorganic Chemistry Impact factor: 1.740, year: 2016 http://www.sciencedirect.com/science/article/pii/S0924203117302886

  1. Quantification methods of Black Carbon: Comparison of Rock-Eval analysis with traditional methods

    NARCIS (Netherlands)

    Poot, A.; Quik, J.T.K.; Veld, H.; Koelmans, A.A.

    2009-01-01

    Black Carbon (BC) quantification methods are reviewed, including new Rock-Eval 6 data on BC reference materials. BC has been reported to have major impacts on climate, human health and environmental quality. Especially for risk assessment of persistent organic pollutants (POPs) it is important to

  2. LC-MS3 quantification of O-glycopeptides in human serum

    Czech Academy of Sciences Publication Activity Database

    Sanda, M.; Pompach, Petr; Benicky, J.; Goldman, R.

    2013-01-01

    Roč. 34, č. 16 (2013), s. 2342-2349 ISSN 0173-0835 R&D Projects: GA MŠk LH13051 Institutional support: RVO:61388971 Keywords : LC-MS3 * O-glycosylation * Quantification of glycopeptides Subject RIV: CE - Biochemistry Impact factor: 3.161, year: 2013

  3. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, and gradient-enhanced versions of Kriging, radial basis functions and point-collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation of the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.

  4. Stochastic approach for radionuclides quantification

    Science.gov (United States)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods using empirical calibration with a standard to quantify the activity of nuclear materials by determining a calibration coefficient are useless on non-reproducible, complex and unique nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another, resulting in a high variability between objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as geometry or composition. These data are density, material, screen, geometric shape, matrix composition, and matrix and source distribution. Some of them depend strongly on the knowledge of the package data and on the operator's background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment or knowledge of the internal package configuration. This method combines a global stochastic approach, which uses, among others, surrogate models available to simulate the gamma attenuation behaviour; a Bayesian approach, which considers conditional probability densities of the problem inputs; and Markov chain Monte Carlo (MCMC) algorithms, which solve inverse problems, with the gamma-ray emission radionuclide spectrum and the outside dimensions of the objects of interest. The methodology is being tested by quantifying actinide activity in standards of different matrix kinds, compositions and source configurations, in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
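
A minimal sketch of the Bayesian/MCMC ingredient: a random-walk Metropolis sampler inferring a source activity from a single gamma count with an uncertain attenuation factor. The forward model, priors and all numbers are invented for illustration and are far simpler than the CEA methodology described above.

```python
import numpy as np

rng = np.random.default_rng(3)

measured = 1200.0      # net counts in a gamma line (invented)
sigma = 60.0           # counting uncertainty (1 sigma)
efficiency = 0.002     # detector efficiency, assumed known here

def forward(activity, attenuation):
    return activity * efficiency * attenuation

def log_post(activity, attenuation):
    # Flat priors inside physical bounds; Gaussian likelihood on the counts.
    if activity <= 0.0 or not 0.1 < attenuation < 1.0:
        return -np.inf
    resid = measured - forward(activity, attenuation)
    return -0.5 * (resid / sigma) ** 2

# Random-walk Metropolis over (activity, attenuation).
state = np.array([1e6, 0.5])
lp = log_post(*state)
samples = []
for _ in range(20_000):
    proposal = state + rng.normal(0.0, [2e4, 0.02])
    lp_prop = log_post(*proposal)
    if np.log(rng.uniform()) < lp_prop - lp:
        state, lp = proposal, lp_prop
    samples.append(state.copy())

activity = np.array(samples)[5_000:, 0]   # discard burn-in
print(f"activity ~ {activity.mean():.3e} +/- {activity.std():.3e} (arb. units)")
```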

  5. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.
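
For the linear Bayesian update mentioned at the end, the conditional-expectation formula reduces to the familiar Kalman form x_post = x_prior + K(y - H x_prior). The sketch below evaluates it sampling-free for a Gaussian prior and a linear observation; all numbers are invented.

```python
import numpy as np

# Gaussian prior on a 2-vector of parameters (invented numbers).
x_prior = np.array([1.0, 0.0])
C_prior = np.array([[1.0, 0.3],
                    [0.3, 0.5]])

# Linear observation y = H x + noise, noise covariance R.
H = np.array([[1.0, 2.0]])
R = np.array([[0.1]])
y = np.array([2.4])

S = H @ C_prior @ H.T + R                  # innovation covariance
K = C_prior @ H.T @ np.linalg.inv(S)       # gain
x_post = x_prior + (K @ (y - H @ x_prior)).ravel()
C_post = C_prior - K @ H @ C_prior

print("posterior mean:", x_post)
print("posterior covariance:\n", C_post)
```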

  6. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.

    2014-01-01

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  7. Inverse problems and uncertainty quantification

    KAUST Repository

    Litvinenko, Alexander

    2013-12-18

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  8. Oil palm: the environmental management of plantations; Palmier à huile : le management environnemental des plantations

    Directory of Open Access Journals (Sweden)

    Caliman Jean-Pierre

    2011-05-01

    Full Text Available PT. Smart, an Indonesian oil palm plantation company, technically manages all plantings of Golden Agri-Resources (GAR). Recently challenged by several environmental NGOs accusing it of violating sustainability commitments despite its membership of the Roundtable on Sustainable Palm Oil (RSPO), the company has reaffirmed its commitment, strengthening its governance with the goal of becoming a leader not only in its field of production but also in environmental and social sustainability, and implementing coordination and exchange activities to find workable solutions to problems. Recognizing that some mistakes were made, the company insists, however, that many economic, social and environmental initiatives have marked its action since the early 1980s, following the advance of scientific knowledge and the availability of operational tools to implement them. This is what the company calls "the road towards sustainability".

  9. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess TB's evolution. Methods for the quantification of chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for the quantification of lung damage caused by TB through chest radiographs. An algorithm for the computational processing of exams was developed in Matlab; it creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit ratio for the patient and cost-benefit ratio for the institution. (author)
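
The Bland-Altman analysis mentioned above has a simple computational core: the bias is the mean of the paired differences and the 95% limits of agreement are bias ± 1.96 times their standard deviation. The sketch below uses invented paired measurements, not the study's data.

```python
import numpy as np

# Invented paired measurements of lesion burden (% of lung volume).
radiograph = np.array([12.0, 25.0, 33.0, 8.0, 41.0, 19.0])
ct = np.array([14.0, 23.0, 36.0, 9.0, 38.0, 21.0])

diff = radiograph - ct
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.2f}; 95% limits of agreement = "
      f"({bias - half_width:.2f}, {bias + half_width:.2f})")
```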

  10. Quantification model for energy consumption in edification

    Directory of Open Access Journals (Sweden)

    Mercader, Mª P.

    2012-12-01

    Full Text Available The research conducted in this paper focuses on the generation of a model for the quantification of energy consumption in buildings. This is done through one of the most relevant environmental impact indicators, associated with the weight per m2 of construction: the energy consumption resulting from the manufacturing process of the materials used in building construction. The practical application of the proposed model to different building typologies in Seville will provide information regarding the most significant building materials, subsystems and construction elements, making it possible to observe the impact the built surface has on the environment. The results obtained aim to serve as a reference for the scientific community, providing quantitative data comparable to those of other building types and geographical areas. Furthermore, the model may also allow the analysis and characterization of feasible solutions to reduce the environmental impact generated by the different materials, subsystems and construction elements commonly used in the building types defined in this study.

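
The model's core arithmetic can be sketched as a bill-of-quantities calculation: material mass per m2 built multiplied by per-material embodied-energy coefficients. The coefficients and quantities below are invented placeholders, not the paper's data.

```python
# Invented per-material embodied-energy coefficients (MJ/kg) and
# bill of quantities (kg per m2 built) for a notional dwelling.
embodied_mj_per_kg = {"concrete": 1.1, "steel": 24.0, "ceramic brick": 3.0}
kg_per_m2 = {"concrete": 950.0, "steel": 55.0, "ceramic brick": 180.0}

total_mj = sum(kg_per_m2[m] * embodied_mj_per_kg[m] for m in kg_per_m2)
print(f"embodied energy ~ {total_mj:,.0f} MJ per m2 built")
```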

  11. Nuclear Magnetic Resonance: new applications in the quantification and assessment of polysaccharide-based vaccine intermediates

    International Nuclear Information System (INIS)

    Garrido, Raine; Velez, Herman; Verez, Vicente

    2013-01-01

    Nuclear Magnetic Resonance has become the method of choice for structural studies, identity assays and the simultaneous quantification of the active pharmaceutical ingredient of different polysaccharide-based vaccines. In the last two decades, the application of quantitative Nuclear Magnetic Resonance has had an increasing impact in supporting several quantification needs. The technique involves experiments with several modified parameters in order to obtain spectra with quantifiable signals. The present review is supported by recent relevant reports, and it discusses several applications of NMR to carbohydrate-based vaccines. Moreover, it emphasizes and describes several parameters and applications of quantitative Nuclear Magnetic Resonance.
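
For reference, the internal-standard relation underlying most quantitative NMR work is m_a = m_std · (I_a/I_std) · (N_std/N_a) · (M_a/M_std) · P_std, where I are the signal integrals, N the numbers of contributing nuclei, M the molar masses and P the standard's certified purity. The sketch below evaluates it with invented values.

```python
# Internal-standard qNMR relation with invented values.
I_a, N_a, M_a = 3.05, 2, 342.30        # analyte: integral, nuclei, g/mol
I_std, N_std, M_std = 1.00, 9, 178.23  # internal standard: integral, nuclei, g/mol
m_std, purity_std = 5.0, 0.999         # mg weighed in, certified purity

m_a = m_std * (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * purity_std
print(f"analyte mass ~ {m_a:.2f} mg")
```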

  12. Biomass to energy : GHG reduction quantification protocols and case study

    International Nuclear Information System (INIS)

    Reusing, G.; Taylor, C.; Nolan, W.; Kerr, G.

    2009-01-01

    With the growing concerns over greenhouse gases and their contribution to climate change, it is necessary to find ways of reducing environmental impacts by diversifying energy sources to include non-fossil fuel energy sources. Among the fastest growing green energy sources are energy-from-waste facilities that use biomass that would otherwise be landfilled or stockpiled. The quantification of greenhouse gas reductions through the use of biomass-to-energy systems can be calculated using various protocols and methodologies. This paper described each of these methodologies and presented a case study comparing some of these quantification methodologies. A summary and comparison of biomass-to-energy greenhouse gas reduction protocols in use or under development by the United Nations, the European Union, the Province of Alberta and Environment Canada was presented. It was concluded that regulatory and environmental pressures and public policy will continue to impact the practices associated with biomass processing or landfill operations, such as composting or, in the case of landfills, gas collection systems, thus reducing the amount of potential credit available for biomass-to-energy facility offset projects. 10 refs., 2 tabs., 6 figs.
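
The protocols compared here share a common arithmetic core: emission reductions are baseline emissions minus project emissions, with avoided landfill methane converted to CO2-equivalents via its global warming potential. The sketch below uses invented activity data and factors (a GWP-100 of 28 for methane, per IPCC AR5); it is not any specific protocol's calculation.

```python
# Invented activity data and emission factors for a biomass-to-energy project.
biomass_t = 10_000                     # tonnes of biomass diverted from landfill

baseline_ch4_t = biomass_t * 0.05      # t CH4 from landfill decay (hypothetical)
baseline_co2e = baseline_ch4_t * 28    # GWP-100 of methane (IPCC AR5)

project_co2e = 1_500                   # combustion/process emissions, t CO2e
displaced_grid_co2e = 4_200            # avoided fossil generation, t CO2e

reduction = baseline_co2e + displaced_grid_co2e - project_co2e
print(f"net reduction ~ {reduction:,.0f} t CO2e")
```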

  13. Biomass to energy : GHG reduction quantification protocols and case study

    Energy Technology Data Exchange (ETDEWEB)

    Reusing, G.; Taylor, C. [Conestoga - Rovers and Associates, Waterloo, ON (Canada); Nolan, W. [Liberty Energy, Hamilton, ON (Canada); Kerr, G. [Index Energy, Ajax, ON (Canada)

    2009-07-01

    With the growing concerns over greenhouse gases and their contribution to climate change, it is necessary to find ways of reducing environmental impacts by diversifying energy sources to include non-fossil fuel energy sources. Among the fastest growing green energy sources are energy-from-waste facilities that use biomass that would otherwise be landfilled or stockpiled. The quantification of greenhouse gas reductions through the use of biomass-to-energy systems can be calculated using various protocols and methodologies. This paper described each of these methodologies and presented a case study comparing some of these quantification methodologies. A summary and comparison of biomass-to-energy greenhouse gas reduction protocols in use or under development by the United Nations, the European Union, the Province of Alberta and Environment Canada was presented. It was concluded that regulatory and environmental pressures and public policy will continue to impact the practices associated with biomass processing or landfill operations, such as composting or, in the case of landfills, gas collection systems, thus reducing the amount of potential credit available for biomass-to-energy facility offset projects. 10 refs., 2 tabs., 6 figs.

  14. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  15. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-01-01

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  16. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

    Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP), but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction, extrapolation and forecasting are conducted largely within a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry with reference to South Africa. The theory and methodology of quantification is briefly reviewed, and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience (often personal) of the researcher, and this scholarly endeavour is, therefore, not inclusive of all possible identified indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourism industry.

  17. Adjusting for unrecorded consumption in survey and per capita sales data: quantification of impact on gender- and age-specific alcohol-attributable fractions for oral and pharyngeal cancers in Great Britain.

    Science.gov (United States)

    Meier, Petra Sylvia; Meng, Yang; Holmes, John; Baumberg, Ben; Purshouse, Robin; Hill-McManus, Daniel; Brennan, Alan

    2013-01-01

    Large discrepancies are typically found between per capita alcohol consumption estimated via survey data compared with sales, excise or production figures. This may lead to significant inaccuracies when calculating levels of alcohol-attributable harms. Using British data, we demonstrate an approach to adjusting survey data to give more accurate estimates of per capita alcohol consumption. First, sales and survey data are adjusted to account for potential biases (e.g. self-pouring, under-sampled populations) using evidence from external data sources. Secondly, survey and sales data are aligned using different implementations of the method of Rehm et al. [(2010) Statistical modeling of volume of alcohol exposure for epidemiological studies of population health: the US example. Pop Health Metrics 8, 1-12]. Thirdly, the impact of our approaches is tested by using our revised survey dataset to calculate alcohol-attributable fractions (AAFs) for oral and pharyngeal cancers. British sales data under-estimate per capita consumption by 8%, primarily due to illicit alcohol. Adjustments to survey data increase per capita consumption estimates by 35%, primarily due to under-sampling of dependent drinkers and under-estimation of home-poured spirits volumes. Before aligning sales and survey data, the revised survey estimate remains 22% lower than the revised sales estimate. Revised AAFs for oral and pharyngeal cancers are substantially larger with our preferred method for aligning data sources, increasing the AAF from the original survey dataset from 0.47 to 0.60 (males) and from 0.28 to 0.35 (females). It is possible to use external data sources to adjust survey data to reduce the under-estimation of alcohol consumption and then account for residual under-estimation using a statistical calibration technique. These revisions lead to markedly higher estimated levels of alcohol-attributable harm.
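
    A minimal arithmetic sketch of the alignment logic described above (Python; the raw sales figure is a placeholder, and only the abstract's headline percentages are used):

        # Sketch of the survey/sales alignment using the abstract's headline
        # figures; sales_raw is an illustrative placeholder, not UK data.
        sales_raw = 10.0                             # per capita consumption from sales
        sales_revised = sales_raw * 1.08             # sales under-estimate by 8 %

        survey_revised = sales_revised * (1 - 0.22)  # revised survey still 22 % lower
        survey_raw = survey_revised / 1.35           # adjustments raised survey by 35 %

        # Residual under-estimation is removed by calibrating survey consumption
        # up to the revised sales total before recomputing the AAFs.
        calibration_factor = sales_revised / survey_revised   # ~1.28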

  18. QUANTIFICATION OF GENETICALLY MODIFIED MAIZE MON 810 IN PROCESSED FOODS

    Directory of Open Access Journals (Sweden)

    Peter Siekel

    2012-12-01

    Maize MON 810 (Zea mays L.) represents the majority of genetically modified food crops. It is the only transgenic cultivar grown in the EU (European Union) countries, and food products containing more than 0.9 % of it must be labelled. This study examined the impact of food processing (temperature, pH and pressure) on DNA degradation and on the quantification of the genetically modified maize MON 810. The transgenic DNA was quantified by the real-time polymerase chain reaction method. Processing at high temperature (121 °C), elevated pressure (0.1 MPa) and low pH (2.25) fragmented the DNA. Because the species-specific gene is present at a copy number about two orders of magnitude higher than the transgenic DNA in the plant materials used, this led to false negative results in the quantification of transgenic DNA. Maize containing 4.2 % of the transgene appeared after processing to contain as little as 3.0 % (100 °C) and 1.9 % (121 °C, 0.1 MPa). A 2.1 % amount of transgene dropped to 1.0 % at 100 °C and to 0.6 % at 121 °C, 0.1 MPa. Under these conditions the apparent transgenic content decreased two- to three-fold, a consequence of the unequal abundance of the two targets: the disparity shows up as a considerable decrease of the measured transgenic content while the decrease of the species-specific gene content remains unnoticed. Based on our findings we conclude that a high degree of processing might lead to false negative results in the quantification of the transgenic constituent. Determination of GMO content in processed foods may thus lead to incorrect statements, and labelling in these cases could mislead consumers. doi:10.5219/212
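
    The false-negative mechanism hinges on the copy-number ratio used in relative quantification. A minimal sketch (Python; all copy numbers are hypothetical illustrations, not the study's raw data):

        # GMO content as the ratio of transgene copies to species-specific
        # reference-gene copies, as measured by real-time PCR.
        def gmo_content_percent(transgene_copies, reference_copies):
            return 100.0 * transgene_copies / reference_copies

        # If processing degrades the scarce transgene target faster than the
        # abundant reference gene, the measured ratio falls even though the
        # true GMO fraction of the ingredient is unchanged.
        print(gmo_content_percent(4200, 100000))   # 4.2 % before processing
        print(gmo_content_percent(1900, 100000))   # 1.9 % after 121 °C, 0.1 MPa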

  19. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D- and DL-penicillamine were demonstrated by terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low frequency resonant modes, density functional theory (DFT) was adopted for theoretical calculation. It was found that the collective THz frequency motions were determined by the intramolecular and intermolecular hydrogen bond interactions. Moreover, the quantification of mixtures of penicillamine enantiomers was demonstrated by a THz spectra fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in the pharmaceutical industry. (authors)

  20. Overview of Environmental Impact Assessment of Oil and Gas ...

    African Journals Online (AJOL)

    The environmental impact assessment (EIA) of oil and gas projects in Nigeria ... natural, social and health components of the environment; Determination of issues ... of impact quantification through which the Environmental Management Plan ...

  1. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  2. A critical view on microplastic quantification in aquatic organisms

    International Nuclear Information System (INIS)

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R.; Marques, Antonio; Granby, Kit; Fait, Gabriella; Kotterman, Michiel J.J.; Diogène, Jorge; Bekaert, Karen; Robbens, Johan; Devriese, Lisa

    2015-01-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different “hotspot” locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  3. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
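
    A minimal sketch of the frequency-wavenumber step (Python/NumPy; the synthetic wavefield and sampling values are assumptions for illustration, not the paper's data):

        import numpy as np

        def frequency_wavenumber(u, dt, dx):
            # u[t, x]: wavefield on one scan line; a 2D FFT maps it to the
            # f-k domain, where waves trapped in a delamination appear as
            # additional wavenumber content.
            U = np.fft.fftshift(np.fft.fft2(u))
            f = np.fft.fftshift(np.fft.fftfreq(u.shape[0], d=dt))
            k = np.fft.fftshift(np.fft.fftfreq(u.shape[1], d=dx))
            return f, k, np.abs(U)

        # Synthetic travelling wave with frequency f0 and wavenumber k0.
        dt, dx = 1e-7, 1e-3
        t, x = np.arange(256) * dt, np.arange(128) * dx
        f0, k0 = 200e3, 400.0
        u = np.sin(2 * np.pi * (f0 * t[:, None] - k0 * x[None, :]))
        f, k, U = frequency_wavenumber(u, dt, dx)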

  4. An EPGPT-based approach for uncertainty quantification

    International Nuclear Information System (INIS)

    Wang, C.; Abdel-Khalik, H. S.

    2012-01-01

    Generalized Perturbation Theory (GPT) has been widely used by many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT theory, collectively referred to as Exact-to-Precision Generalized Perturbation Theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameter distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest; it tries to capture, via the adjoint solution, the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of what are referred to as the 'active' responses, which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply an EPGPT methodology to propagate cross-section variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameter uncertainties is not valid and when nonlinear behavior must be considered. To allow this demonstration, exaggerated variations will be employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)

  5. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.

  6. Recurrence quantification analysis in Liu's attractor

    International Nuclear Information System (INIS)

    Balibrea, Francisco; Caballero, M. Victoria; Molera, Lourdes

    2008-01-01

    Recurrence Quantification Analysis is used to detect transitions chaos to periodical states or chaos to chaos in a new dynamical system proposed by Liu et al. This system contains a control parameter in the second equation and was originally introduced to investigate the forming mechanism of the compound structure of the chaotic attractor which exists when the control parameter is zero
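
    For readers unfamiliar with the technique, the simplest recurrence quantification measure, the recurrence rate, can be sketched as follows (Python/NumPy; a generic scalar time series is assumed rather than Liu's system itself):

        import numpy as np

        def recurrence_rate(x, eps):
            # Fraction of point pairs (i, j) whose states lie closer than eps;
            # transitions of the dynamics show up as changes in this rate and
            # in the structures of the underlying recurrence plot R.
            d = np.abs(x[:, None] - x[None, :])
            R = d < eps
            return R.mean()

        x = np.sin(np.linspace(0, 20 * np.pi, 500))   # periodic toy signal
        print(recurrence_rate(x, eps=0.1))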

  7. Quantification of coating aging using impedance measurements

    NARCIS (Netherlands)

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting

  8. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on Slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis). (author)

  9. Quantification analysis of CT for aphasic patients

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, S.; Ooyama, H.; Hojo, K.; Tasaki, H.; Hanazono, T.; Sato, T.; Metoki, H.; Totsuka, M.; Oosumi, N.

    1987-02-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis).

  10. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
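
    A minimal sketch of the band-based quantification idea (Python/scikit-learn; the reflectance and THC values are fabricated placeholders for illustration only):

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Hypothetical training data: leaf reflectance at the 695 nm band
        # (the optimal band reported above) versus measured THC content (%).
        reflectance_695 = np.array([[0.12], [0.15], [0.18], [0.22], [0.27], [0.31]])
        thc_percent = np.array([3.1, 2.6, 2.2, 1.6, 1.1, 0.7])

        model = LinearRegression().fit(reflectance_695, thc_percent)
        print(model.predict([[0.20]]))   # estimated THC content of a new sample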

  11. Quantification of glycyrrhizin biomarker in Glycyrrhiza glabra ...

    African Journals Online (AJOL)

    Background: A simple and sensitive thin-layer chromatographic method has been established for quantification of glycyrrhizin in Glycyrrhiza glabra rhizome and baby herbal formulations by validated Reverse Phase HPTLC method. Materials and Methods: RP-HPTLC Method was carried out using glass coated with RP-18 ...

  12. Noninvasive Quantification of Pancreatic Fat in Humans

    OpenAIRE

    Lingvay, Ildiko; Esser, Victoria; Legendre, Jaime L.; Price, Angela L.; Wertz, Kristen M.; Adams-Huet, Beverley; Zhang, Song; Unger, Roger H.; Szczepaniak, Lidia S.

    2009-01-01

    Objective: To validate magnetic resonance spectroscopy (MRS) as a tool for non-invasive quantification of pancreatic triglyceride (TG) content and to measure the pancreatic TG content in a diverse human population with a wide range of body mass index (BMI) and glucose control.

  13. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  14. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...

  15. Quantification of the Impact of Integrated Soil and Water ...

    African Journals Online (AJOL)

    Bheema

    The volume of surface runoff that can be retained by these physical measures was computed using the measured dimensions of the structures (Table 1: physical soil and water conservation measures and the actual volume of surface runoff that can be contained by the structures, as measured in April 2012).

  16. Impact of iterative reconstruction on CT coronary calcium quantification

    DEFF Research Database (Denmark)

    Kurata, Akira; Dharampal, Anoeshka; Dedic, Admir

    2013-01-01

    We evaluated the influence of sinogram-affirmed iterative reconstruction (SAFIRE) on the coronary artery calcium (CAC) score by computed tomography (CT).

  17. Quantification of the impact of data in reservoir modeling

    NARCIS (Netherlands)

    Krymskaya, M.V.

    2013-01-01

    Global energy use is increasing. As societies advance, they will continue to need energy to power residential and commercial buildings, in the industrial sector, for transportation and other vital services. To satisfy this rising demand, liquid, natural gas, coal, nuclear power and renewable fuel

  18. A critical view on microplastic quantification in aquatic organisms

    DEFF Research Database (Denmark)

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R.

    2015-01-01

    Microplastics, plastic particles and fragments smaller than 5mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics...... to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature...... review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus...

  19. Metal Stable Isotope Tagging: Renaissance of Radioimmunoassay for Multiplex and Absolute Quantification of Biomolecules.

    Science.gov (United States)

    Liu, Rui; Zhang, Shixi; Wei, Chao; Xing, Zhi; Zhang, Sichun; Zhang, Xinrong

    2016-05-17

    The unambiguous quantification of biomolecules is of great significance in fundamental biological research as well as practical clinical diagnosis. Due to the lack of a detectable moiety, the direct and highly sensitive quantification of biomolecules is often a "mission impossible". Consequently, tagging strategies to introduce detectable moieties for labeling target biomolecules were invented, which had a long and significant impact on studies of biomolecules in the past decades. For instance, immunoassays have been developed with radioisotope tagging by Yalow and Berson in the late 1950s. The later languishment of this technology can be almost exclusively ascribed to the use of radioactive isotopes, which led to the development of nonradioactive tagging strategy-based assays such as enzyme-linked immunosorbent assay, fluorescent immunoassay, and chemiluminescent and electrochemiluminescent immunoassay. Despite great success, these strategies suffered from drawbacks such as limited spectral window capacity for multiplex detection and inability to provide absolute quantification of biomolecules. After recalling the sequences of tagging strategies, an apparent question is why not use stable isotopes from the start? A reasonable explanation is the lack of reliable means for accurate and precise quantification of stable isotopes at that time. The situation has changed greatly at present, since several atomic mass spectrometric measures for metal stable isotopes have been developed. Among the newly developed techniques, inductively coupled plasma mass spectrometry is an ideal technique to determine metal stable isotope-tagged biomolecules, for its high sensitivity, wide dynamic linear range, and more importantly multiplex and absolute quantification ability. Since the first published report by our group, metal stable isotope tagging has become a revolutionary technique and gained great success in biomolecule quantification. An exciting research highlight in this area

  20. Sedimentary Processes. Quantification Using Radionuclides

    International Nuclear Information System (INIS)

    Carroll, J.; Lerche, I.

    2003-01-01

    The advent of radionuclide methods in geochronology has revolutionized our understanding of modern sedimentary processes in aquatic systems. This book examines the principles of the method and its use as a quantitative tool in marine geology, with emphasis on the Pb-210 method. The assumptions and consequences of models and their behaviour are described, providing the necessary background to assess the advantages and trade-offs involved when choosing a particular model for application. One of the purposes of this volume is to disentangle the influences of complicating factors, such as sediment flux variations, post-depositional diffusion of radionuclides, and bio-irrigation of sediments, to arrive at sediment ages and to properly assess the attendant data uncertainty. Environmental impacts of chemical, nuclear, or other waste material are of concern in a variety of areas around the world today. A number of relevant examples are included, demonstrating how dating models are useful for determining sources of contaminants and interpreting their influence on the environment. The book is set at a level so that an able student or professional should have no difficulty in following the procedures and methods developed. Each chapter includes case histories showing the strengths and weaknesses of a given procedure with respect to a data example. Included with this volume, in an appendix, is the computer source code of a first generation of modelling tools based on inverse numerical analysis techniques, along with detailed instructions and examples for its use.
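
    As a concrete example of the kind of model the book develops, the simplest constant-initial-concentration (CIC) Pb-210 age equation can be sketched as follows (Python; the function names are illustrative, and the book's more elaborate models relax these assumptions):

        import math

        PB210_HALF_LIFE_YEARS = 22.3
        DECAY_CONSTANT = math.log(2) / PB210_HALF_LIFE_YEARS

        def cic_age_years(surface_activity, layer_activity):
            # A(t) = A0 * exp(-lambda * t) for unsupported Pb-210, so a layer's
            # age follows from its activity ratio to the surface layer.
            return math.log(surface_activity / layer_activity) / DECAY_CONSTANT

        print(cic_age_years(100.0, 50.0))   # one half-life: ~22.3 years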

  1. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging: from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  2. Uncertainty Quantification in Alchemical Free Energy Methods.

    Science.gov (United States)

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-05-02

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation (an ensemble of independent MD simulations), which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of time of the molecular dynamics simulations performed.
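
    A minimal sketch of the ensemble error measure (Python/NumPy; the replica free energies are fabricated numbers standing in for independent MD runs):

        import numpy as np

        # Hypothetical binding free-energy estimates (kcal/mol) from an
        # ensemble of independent MD replicas of the same alchemical setup.
        replica_dG = np.array([-7.9, -8.4, -8.1, -7.6, -8.3, -8.0, -7.8, -8.2])

        mean = replica_dG.mean()
        # Standard error of the mean across replicas as the reported error bar.
        sem = replica_dG.std(ddof=1) / np.sqrt(replica_dG.size)
        print(f"dG = {mean:.2f} +/- {sem:.2f} kcal/mol")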

  3. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  4. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, together with the assessment of the type of core/concrete attack that would occur. The evaluation of ex-vessel debris coolability by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event, are considered. The headings in the DET, selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel, are also discussed.
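
    In its simplest form, quantifying a CET heading with a decomposition event tree reduces to a probability-weighted sum over DET paths. A toy sketch (Python; the branch probabilities are illustrative, not taken from the SBWR study):

        # Each DET path pairs the probability of a plant physical state with
        # the conditional probability of a coolable debris configuration.
        det_paths = [
            (0.7, 0.9),   # low-pressure vessel failure, well-spread debris
            (0.3, 0.4),   # high-pressure failure, deep or dispersed debris bed
        ]
        p_coolable = sum(p_state * p_cool for p_state, p_cool in det_paths)
        print(p_coolable)   # 0.75 for these illustrative numbers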

  5. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1986-01-01

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  6. Uncertainty quantification for hyperbolic and kinetic equations

    CERN Document Server

    Pareschi, Lorenzo

    2017-01-01

    This book explores recent advances in uncertainty quantification for hyperbolic, kinetic, and related problems. The contributions address a range of different aspects, including: polynomial chaos expansions, perturbation methods, multi-level Monte Carlo methods, importance sampling, and moment methods. The interest in these topics is rapidly growing, as their applications have now expanded to many areas in engineering, physics, biology and the social sciences. Accordingly, the book provides the scientific community with a topical overview of the latest research efforts.

  7. Quantification of heterogeneity observed in medical images

    OpenAIRE

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    Background There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities.

  8. Real-time PCR quantification of arbuscular mycorrhizal fungi: does the use of nuclear or mitochondrial markers make a difference?

    Czech Academy of Sciences Publication Activity Database

    Voříšková, Alena; Jansa, J.; Püschel, David; Krüger, Manuela; Cajthaml, T.; Vosátka, Miroslav; Janoušková, Martina

    2017-01-01

    Roč. 27, č. 6 (2017), s. 577-585 ISSN 0940-6360 R&D Projects: GA ČR GA15-05466S Institutional support: RVO:67985939 Keywords : real-time PCR * quantification * arbuscular mycorrhizal fungi Subject RIV: EF - Botanics OBOR OECD: Plant sciences, botany Impact factor: 3.047, year: 2016

  9. Real-time PCR quantification of arbuscular mycorrhizal fungi: does the use of nuclear or mitochondrial markers make a difference?

    Czech Academy of Sciences Publication Activity Database

    Voříšková, A.; Jansa, J.; Püschel, D.; Krüger, Manuela; Cajthaml, T.; Vosátka, M.; Janoušková, M.

    2017-01-01

    Roč. 27, č. 6 (2017), s. 577-585 ISSN 0940-6360 Institutional support: RVO:61389030 Keywords : Arbuscular mycorrhizal fungi * Isolate discrimination * Microsymbiont screening * Mitochondrial DNA * Molecular genetic quantification * Nuclear ribosomal DNA * plfa * Real-time PCR Subject RIV: EA - Cell Biology OBOR OECD: Cell biology Impact factor: 3.047, year: 2016

  10. Artifacts Quantification of Metal Implants in MRI

    Science.gov (United States)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image gradient based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. The artifact is then quantified in terms of its extent, as an image area percentage, by an automated cross entropy thresholding method. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman’s rho = 0.62 and 0.802 for the titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
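
    A minimal sketch of the described pipeline, gradient computation followed by automated cross-entropy thresholding (Python; not the authors' implementation, and the use of scikit-image's minimum cross-entropy threshold here is an assumption):

        import numpy as np
        from skimage.filters import threshold_li   # minimum cross-entropy threshold

        def artifact_area_percent(image):
            # Gradient magnitude captures the abrupt signal voids and pile-ups;
            # the thresholded area gives the artifact extent in percent.
            gy, gx = np.gradient(image.astype(float))
            magnitude = np.hypot(gx, gy)
            t = threshold_li(magnitude)
            return 100.0 * np.count_nonzero(magnitude > t) / magnitude.size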

  11. Impact assessment revisited

    DEFF Research Database (Denmark)

    Thiele, Jan; Kollmann, Johannes Christian; Markussen, Bo

    2010-01-01

    The theoretical underpinnings of the assessment of invasive alien species impacts need to be improved. At present most approaches are unreliable to quantify impact at regional scales and do not allow for comparison of different invasive species. There are four basic problems that need … ; among these, (4) the total invaded range is an inappropriate measure for quantifying regional impact because the habitat area available for invasion can vary markedly among invasive species. Mathematical models and empirical data using an invasive alien plant species (Heracleum mantegazzianum) indicate … and we discuss the quantification of the invaded range. These improvements are crucial for impact assessment with the overall aim of prioritizing management of invasive species.

  12. A critical view on microplastic quantification in aquatic organisms

    Energy Technology Data Exchange (ETDEWEB)

    Vandermeersch, Griet, E-mail: griet.vandermeersch@ilvo.vlaanderen.be [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium); Van Cauwenberghe, Lisbeth; Janssen, Colin R. [Ghent University, Laboratory of Environmental Toxicology and Aquatic Ecology, Environmental Toxicology Unit (GhEnToxLab), Jozef Plateaustraat 22, 9000 Ghent (Belgium); Marques, Antonio [Division of Aquaculture and Upgrading (DivAV), Portuguese Institute for the Sea and Atmosphere (IPMA), Avenida de Brasília s/n, 1449-006 Lisboa (Portugal); Granby, Kit [Technical University of Denmark, National Food Institute, Mørkhøj Bygade 19, 2860 Søborg (Denmark); Fait, Gabriella [Aeiforia Srl, 29027 Gariga di Podenzano (PC) (Italy); Kotterman, Michiel J.J. [Institute for Marine Resources and Ecosystem Studies (IMARES), Wageningen University and Research Center, Ijmuiden (Netherlands); Diogène, Jorge [Institut de la Recerca i Tecnologia Agroalimentàries (IRTA), Ctra. Poble Nou km 5,5, Sant Carles de la Ràpita E-43540 (Spain); Bekaert, Karen; Robbens, Johan [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium); Devriese, Lisa, E-mail: lisa.devriese@ilvo.vlaanderen.be [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium)

    2015-11-15

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different “hotspot” locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  13. Quantification of the genetic risk of environmental mutagens

    International Nuclear Information System (INIS)

    Ehling, U.H.

    1988-01-01

    Screening methods are used for hazard identification. Assays for heritable mutations in mammals are used for the confirmation of short-term test results and for the quantification of the genetic risk. There are two main approaches to making genetic risk estimates. One of these, termed the direct method, expresses risk in terms of the expected frequency of genetic changes induced per unit of exposure. The other, referred to as the doubling dose method or the indirect method, expresses risk in relation to the observed incidence of genetic disorders now present in man. The indirect method uses experimental data only for the calculation of the doubling dose. The quality of the risk estimation depends on the assumption of persistence of the induced mutations and on the ability to determine the current incidence of genetic diseases. The difficulty of improving the estimates of the current incidence of genetic diseases, or of the persistence of the genes in the population, led to the development of an alternative method: the direct estimation of the genetic risk. The direct estimation uses experimental data on the induced frequency of dominant mutations in mice. For the verification of these quantifications one can use the data of Hiroshima and Nagasaki. According to the estimation with the direct method, one would expect less than 1 radiation-induced dominant cataract in 19,000 children with one or both parents exposed. The expected overall frequency of dominant mutations in the first generation would be 20-25, based on radiation-induced dominant cataract mutations. It is estimated that 10 times more recessive than dominant mutations are induced. The same approaches can be used to determine the impact of chemical mutagens.
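
    The two estimation routes described above can be summarised in a short sketch (Python; the function signatures are my abstraction of the two methods, and all arguments are placeholders rather than the paper's rates):

        def direct_risk(induced_rate_per_unit, exposure, multiplier=1.0):
            # Direct method: expected frequency of induced dominant mutations,
            # extrapolated from mouse data (e.g. dominant cataract mutations)
            # and scaled by a factor for the full set of dominant loci.
            return induced_rate_per_unit * exposure * multiplier

        def doubling_dose_risk(current_incidence, dose, doubling_dose):
            # Indirect method: extra cases relative to the current incidence
            # of genetic disease, scaled by dose / doubling dose.
            return current_incidence * dose / doubling_dose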

  14. ENVIRONMENTAL IMPACT ASSESSMENT: towards an understanding of the dynamics of oilseed rape populations 'escaped' from crops at the scale of an agricultural region [EVALUATION DE L'IMPACT ENVIRONNEMENTAL : Vers une compréhension de la dynamique des populations de colza « échappées » des cultures à l'échelle d'une région agricole]

    Directory of Open Access Journals (Sweden)

    Pessel Fabrice-Del

    2000-07-01

    In the context of agro-environmental risk studies, as in the design of management strategies for transgenic crops (for example, the segregation of GM and non-GM production chains), it is essential to quantify the various processes by which transgenes escape from cultivated fields (pollen and/or seed flow), but it is also necessary to determine their potential for persistence outside cultivated fields (the ecological fitness of interspecific hybrids and/or of the plant itself in environments outside cultivated fields). Populations carrying one or more transgenes could behave as reservoirs and/or relays of genetic contamination between GM and non-GM production chains. In the longer term, knowledge of the processes governing the dynamics and persistence of these populations is essential for effective management of the various possible crisis situations (sanitary, ecological or agronomic), and in particular for determining the duration of monitoring after a possible halt to transgenic cropping.

  15. ENVIRONMENTAL IMPACT ASSESSMENT: GeneSys-Colza, a model of the medium- and long-term effects of cropping systems on gene flow between oilseed rape fields and volunteers in an agricultural landscape [EVALUATION DE L'IMPACT ENVIRONNEMENTAL : GeneSys-Colza : un modèle des effets à moyen et à long terme des systèmes de culture sur les flux de gènes entre champs de colza et repousses dans un espace agricole]

    Directory of Open Access Journals (Sweden)

    Colbach Nathalie

    2000-07-01

    During the examination of applications for authorization to grow transgenic oilseed rape varieties, much was said about the risks of interspecific hybridization between oilseed rape and wild crucifers, and of introgression of transgenes into weed species [1, 2]. Intraspecific gene flow has attracted less public attention, yet it is far more likely. Transgenes can be disseminated in time, via oilseed rape volunteers appearing in the crops that follow transgenic varieties, owing to the loss of part of the seed produced by oilseed rape crops before or during harvest [3, 4], and in space, via seeds and pollen dispersed by wind or by other vectors such as birds or insects [5-8]. Dispersed seeds can directly produce oilseed rape volunteers in neighbouring fields, while pollen can transmit the transgene by fertilizing oilseed rape plants present in those other fields. This gene flow can give rise to various problems, such as the appearance of herbicide-resistant oilseed rape volunteers that are difficult to eliminate when the transgene confers herbicide resistance, or the contamination of conventional oilseed rape harvests by the transgene, whatever its nature, making it impossible to sell those harvests in a 'non-GM' chain. It therefore appeared necessary and urgent to better evaluate this risk and to identify the means of controlling it. The first elements of an answer were provided by the 'transgenic plant' platforms of the technical institutes [9, 10]; however, these platforms are limited in time and space, they cannot account for the regional variability of cropping systems and its effects on the fate of oilseed rape volunteers, and it is not possible to wait for the platform results to estimate long-term risks. For these reasons, we undertook to build a model accounting for the spatial distribution of cropping systems as well as for their effects on the dissemination, in time and space, of a transgene (for example a herbicide-resistance gene or a gene coding for a fatty acid) and on its persistence in oilseed rape volunteer populations, in fields that have or have not been sown with the transgenic oilseed rape variety. Other risks associated with transgenic crops (antibiotic resistance, food allergies, etc.) are not considered in this study. In this article, we present the main principles of the model of demographic and genetic evolution of oilseed rape volunteers operating at the level of a cultivated field, then the integration and operation of this model at the regional level. The examples described in the presentation of the model and of the simulations generally relate to a transgenic herbicide-resistant oilseed rape, but the model can also be used to evaluate gene flow from new varieties obtained by conventional breeding and/or for genes coding for other traits such as fatty acid content.

  16. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images, and to identify the conditions that must be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography, definition and challenges; quantification biasing phenomena; 2 - Quantification in SPECT, problems and correction methods: attenuation, scattering, non-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement.

  17. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.
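
    The calibration-free character of dPCR comes from Poisson statistics on partition counts; a minimal sketch (Python; the partition count and volume are generic illustrative values, not tied to a specific platform):

        import math

        def dpcr_concentration(positive, total, partition_volume_ul):
            # Mean copies per partition from the fraction of positive
            # partitions, then copies per microlitre of reaction.
            p = positive / total
            return -math.log(1.0 - p) / partition_volume_ul

        # 4,000 positive droplets out of 20,000 at 0.00085 uL per droplet:
        print(dpcr_concentration(4000, 20000, 0.00085))   # ~262 copies/uL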

  18. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against...... human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  19. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures which are available now and of the kinds of experimental situations and analytical problems they address. The last point is extended by the description of a development of our own for the fundamental parameter method, which makes the inclusion of non-parallel beam geometries possible. Finally, open problems for the quantification procedures are discussed

  20. Quantification of competitive value of documents

    Directory of Open Access Journals (Sweden)

    Pavel Šimek

    2009-01-01

    The majority of Internet users use the global network to search for information using fulltext search engines such as Google, Yahoo!, or Seznam. Web presentation operators try, with the help of different optimization techniques, to reach the top places in the results of fulltext search engines. This is where Search Engine Optimization and Search Engine Marketing gain their importance, because users usually follow links only on the first few pages of fulltext search results for given keywords, and in catalogs they use primarily the hierarchically higher-placed links in each category. Key to success is the application of optimization methods which deal with the issue of keywords, the structure and quality of content, domain names, individual sites, and the quantity and reliability of backward links. The process is demanding, long-lasting and without a guaranteed outcome. A website operator without advanced analytical tools cannot identify the contribution of the individual documents of which the entire web site consists. If web presentation operators want an overview of their documents and of the web site as a whole, it is appropriate to quantify these positions in a specific way, depending on specific keywords. This purpose is served by the quantification of the competitive value of documents, which in turn determines the global competitive value of a web site. Quantification of competitive values is performed on a specific fulltext search engine; different engines can, and often do, give different results. According to reports published by the ClickZ agency and Market Share, Google is, by the number of searches by English-speaking users, the most widely used search engine, with a market share of more than 80%. The overall procedure for the quantification of competitive values is the same for all engines; however, the initial step, the analysis of keywords, depends on the choice of fulltext search engine.

  1. Advances in forensic DNA quantification: a review.

    Science.gov (United States)

    Lee, Steven B; McCord, Bruce; Buel, Eric

    2014-11-01

    This review focuses upon a critical step in forensic biology: detection and quantification of human DNA from biological samples. Determination of the quantity and quality of human DNA extracted from biological evidence is important for several reasons. Firstly, depending on the source and extraction method, the quality (purity and length) and quantity of the resultant DNA extract can vary greatly. This affects the downstream method, as the quantity of input DNA and its relative length can determine which genotyping procedure to use: standard short tandem repeat (STR) typing, mini-STR typing or mitochondrial DNA sequencing. Secondly, because it is important in forensic analysis to preserve as much of the evidence as possible for retesting, it is important to determine the total DNA amount available prior to utilizing any destructive analytical method. Lastly, results from initial quantitative and qualitative evaluations permit a more informed interpretation of downstream analytical results. Newer quantitative techniques involving real-time PCR can reveal the presence of degraded DNA and PCR inhibitors, which provide potential reasons for poor genotyping results and may indicate which methods to use for downstream typing success. In general, the more information available, the easier it is to interpret and process the sample, resulting in a higher likelihood of successful DNA typing. The history of the development of quantitative methods has involved two main goals: improving the precision of the analysis and increasing the information content of the result. This review covers advances in forensic DNA quantification methods and recent developments in RNA quantification. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Quantification of viral DNA during HIV-1 infection: A review of relevant clinical uses and laboratory methods.

    Science.gov (United States)

    Alidjinou, E K; Bocket, L; Hober, D

    2015-02-01

    Effective antiretroviral therapy usually leads to undetectable HIV-1 RNA in the plasma. However, the virus persists in some cells of infected patients as various DNA forms, both integrated and unintegrated. This reservoir represents the greatest challenge to the complete cure of HIV-1 infection, and its characteristics strongly impact the course of the disease. The quantification of HIV-1 DNA in blood samples currently constitutes the most practical approach to measuring this residual infection. Real-time quantitative PCR (qPCR) is the most common method used for HIV-DNA quantification, and many strategies have been developed to measure the different forms of HIV-1 DNA. In the literature, several "in-house" PCR methods have been used, and standardization is needed to obtain comparable results. In addition, qPCR is limited by background noise in the precise quantification of low levels. Among new assays in development, digital PCR was shown to allow an accurate quantification of HIV-1 DNA. Total HIV-1 DNA is most commonly measured in clinical routine. The absolute quantification of proviruses and unintegrated forms is more often used for research purposes. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  3. Quantification of thermal damage in skin tissue

    Institute of Scientific and Technical Information of China (English)

    Xu Feng; Wen Ting; Lu Tianjian; Seffen Keith

    2008-01-01

    Skin thermal damage, or skin burns, is the most commonly encountered type of trauma in civilian and military communities. In addition, advances in laser, microwave and similar technologies have led to recent developments of thermal treatments for disease and damage involving skin tissue, where the objective is to induce thermal damage precisely within targeted tissue structures without affecting the surrounding healthy tissue. Further, the extended pain sensation induced by thermal damage also poses a great problem for burn patients. It is thus of great importance to quantify thermal damage in skin tissue. In this paper, the available models and experimental methods for quantification of thermal damage in skin tissue are discussed.

  4. Automated Quantification of Pneumothorax in CT

    Science.gov (United States)

    Do, Synho; Salvaggio, Kristen; Gupta, Supriya; Kalra, Mannudeep; Ali, Nabeel U.; Pien, Homer

    2012-01-01

    An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothorax size were obtained and the overall performance of the automated method shows an average error of just below 1%. PMID:23082091

  5. Uncertainty quantification for PZT bimorph actuators

    Science.gov (United States)

    Bravo, Nikolas; Smith, Ralph C.; Crews, John

    2018-03-01

    In this paper, we discuss the development of a high-fidelity model for a PZT bimorph actuator used for micro-air vehicles, including the Robobee. We developed the model using the homogenized energy model (HEM) framework, which quantifies the nonlinear, hysteretic, and rate-dependent behavior inherent to PZT in dynamic operating regimes. We then discuss an inverse problem on the model and include local and global sensitivity analyses of the parameters in the high-fidelity model. Finally, we discuss the results of Bayesian inference and uncertainty quantification on the HEM.

  6. Linking probe thermodynamics to microarray quantification

    International Nuclear Information System (INIS)

    Li, Shuzhao; Pozhitkov, Alexander; Brouwer, Marius

    2010-01-01

    Understanding the difference in probe properties holds the key to absolute quantification of DNA microarrays. So far, Langmuir-like models have failed to link sequence-specific properties to hybridization signals in the presence of a complex hybridization background. Data from washing experiments indicate that the post-hybridization washing has no major effect on the specifically bound targets, which give the final signals. Thus, the amount of specific targets bound to probes is likely determined before washing, by the competition against nonspecific binding. Our competitive hybridization model is a viable alternative to Langmuir-like models. (comment)
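
    For reference, the Langmuir-like models that this comment argues against tie signal intensity to target concentration through the standard isotherm (a textbook form, not an equation quoted from the abstract):

        \theta = \frac{c}{c + K_d}, \qquad I \propto \theta

    Here θ is the fractional occupancy of probe sites, c the free target concentration and K_d the probe-target dissociation constant; the competitive hybridization model instead lets nonspecific background species compete for the same probe sites, so θ depends on the whole background composition rather than on c alone.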

  7. Image cytometry: nuclear and chromosomal DNA quantification.

    Science.gov (United States)

    Carvalho, Carlos Roberto; Clarindo, Wellington Ronildo; Abreu, Isabella Santiago

    2011-01-01

    Image cytometry (ICM) associates microscopy, digital image and software technologies, and has been particularly useful in spatial and densitometric cytological analyses, such as DNA ploidy and DNA content measurements. Basically, ICM integrates methodologies of optical microscope calibration, standard density filters, a digital CCD camera, and image analysis software for quantitative applications. Beyond system calibration and setup, cytological protocols must provide good slide preparations for efficient and reliable ICM analysis. In this chapter, procedures for ICM applications employed in our laboratory are described. The protocols shown here for human DNA ploidy determination and for quantification of nuclear and chromosomal DNA content in plants can be used as described, or adapted for other studies.

  8. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
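
    For context, adjoint-based UQ of this kind usually folds the derived sensitivities into the standard "sandwich" propagation rule (a textbook relation; the notation below is ours, not the report's):

        \frac{\sigma_R^2}{R^2} = \mathbf{S}^{\mathsf{T}} \mathbf{C}_{\alpha} \mathbf{S},
        \qquad S_i = \frac{\alpha_i}{R} \frac{\partial R}{\partial \alpha_i}

    where R is the figure of merit, α the nuclear data parameters, C_α their relative covariance matrix, and S the vector of relative sensitivities obtained from the adjoint calculation.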

  9. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does not …

  10. Review of the quantification techniques for polycyclic aromatic hydrocarbons (PAHs) in food products.

    Science.gov (United States)

    Bansal, Vasudha; Kumar, Pawan; Kwon, Eilhann E; Kim, Ki-Hyun

    2017-10-13

    There is a growing need for accurate detection of trace-level PAHs in food products due to the numerous detrimental effects caused by their contamination (e.g., toxicity, carcinogenicity, and teratogenicity). This review discusses up-to-date knowledge of the measurement techniques available for PAHs contained in food and related products, with the aim of helping to reduce their deleterious impacts on human health through accurate quantification. The main part of this review is dedicated to the opportunities and practical options for the treatment of various food samples and for accurate quantification of the PAHs contained in those samples. Basic information regarding all available analytical measurement techniques for PAHs in food samples is also evaluated with respect to their performance in terms of quality assurance.

  11. Quantifying "apparent" impact and distinguishing impact from invasiveness in multispecies plant invasions

    Science.gov (United States)

    Dean E. Pearson; Yvette K. Ortega; Ozkan Eren; Jose L. Hierro

    2015-01-01

    The quantification of invader impacts remains a major hurdle to understanding and managing invasions. Here, we demonstrate a method for quantifying the community-level impact of multiple plant invaders by applying Parker et al.'s (1999) equation (impact = range × local abundance × per capita effect or per unit effect) using data from 620 survey plots from 31 …

  12. Quantification of complex modular architecture in plants.

    Science.gov (United States)

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.

  13. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, data about their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were combined to quantify the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with degrees of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formula and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Seed shape quantification in the order Cucurbitales

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2018-02-01

    Full Text Available Seed shape quantification in diverse species of the families belonging to the order Cucurbitales is performed by comparing seed images with geometric figures. Quantification of seed shape is a useful tool in plant description for phenotypic characterization and taxonomic analysis. The J index gives the percentage similarity between the image of a seed and a geometric figure, and is useful in taxonomy for the study of relationships between plant groups. The geometric figures used as models in the Cucurbitales are the ovoid, two ellipses with different x/y ratios, and the outline of the Fibonacci spiral. Seed images were compared with these figures and values of the J index obtained. The results obtained for 29 species in the family Cucurbitaceae support a relationship between seed shape and species ecology. Simple seed shape, with images resembling simple geometric figures such as the ovoid, an ellipse or the Fibonacci spiral, may be a feature of the basal clades of taxonomic groups.

  15. Quantification of abdominal aortic deformation after EVAR

    Science.gov (United States)

    Demirci, Stefanie; Manstad-Hulaas, Frode; Navab, Nassir

    2009-02-01

    Quantification of abdominal aortic deformation is an important requirement for the evaluation of endovascular stenting procedures and the further refinement of stent graft design. During endovascular aortic repair (EVAR) treatment, the aortic shape is subject to severe deformation imposed by medical instruments such as guide wires, catheters, and the stent graft. This deformation can affect the flow characteristics and morphology of the aorta, which have been shown to be elicitors of stent graft failure and a reason for the reappearance of aneurysms. We present a method for quantifying the deformation of an aneurysmatic aorta imposed by an inserted stent graft device. The outline of the procedure includes initial rigid alignment of the two abdominal scans, segmentation of abdominal vessel trees, and automatic reduction of their centerline structures to one specified region of interest around the aorta. This is accomplished by preprocessing and remodeling of the pre- and postoperative aortic shapes before performing a non-rigid registration. We further narrow the resulting displacement fields to include only local non-rigid deformation and thereby eliminate all remaining global rigid transformations. Finally, deformations for specified locations can be calculated from the resulting displacement fields. In order to evaluate our method, experiments for the extraction of aortic deformation fields were conducted on 15 patient datasets from endovascular aortic repair (EVAR) treatment. A visual assessment of the registration results and an evaluation of the use of deformation quantification were performed by two vascular surgeons and one interventional radiologist, all of whom are experts in EVAR procedures.

  16. Virus detection and quantification using electrical parameters

    Science.gov (United States)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.
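
    A worked rendering of the empirical counting relation described above (our notation, inferred from the wording of the abstract, so the symbols should be read as assumptions):

        N_{\mathrm{virus}} \approx \left| \frac{\Delta n}{\Delta V_D} \right|
        = \left| \frac{n_{\mathrm{virus}} - n_{\mathrm{mock}}}{V_{D,\mathrm{virus}} - V_{D,\mathrm{mock}}} \right|

    where n is the apparent dopant concentration extracted from the electrical measurement and V_D the Debye volume, each measured for the virus suspension and for the mock (virus-free) control.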

  17. CT quantification of central airway in tracheobronchomalacia

    Energy Technology Data Exchange (ETDEWEB)

    Im, Won Hyeong; Jin, Gong Yong; Han, Young Min; Kim, Eun Young [Dept. of Radiology, Chonbuk National University Hospital, Jeonju (Korea, Republic of)

    2016-05-15

    To determine which factors help to diagnose tracheobronchomalacia (TBM) using CT quantification of the central airway. From April 2013 to July 2014, 19 patients (68.0 ± 15.0 years; 6 male, 13 female) were diagnosed with TBM on CT. As case matching, 38 normal subjects (65.5 ± 21.5 years; 6 male, 13 female) were selected. All 57 subjects underwent CT at end-inspiration and end-expiration. Airway parameters of the trachea and both main bronchi were assessed using software (VIDA Diagnostics). Airway parameters of TBM patients and normal subjects were compared using the Student t-test. At expiration, both wall perimeter and wall thickness in TBM patients were significantly smaller than in normal subjects (wall perimeter: trachea, 43.97 mm vs. 49.04 mm, p = 0.020; right main bronchus, 33.52 mm vs. 42.69 mm, p < 0.001; left main bronchus, 26.76 mm vs. 31.88 mm, p = 0.012; wall thickness: trachea, 1.89 mm vs. 2.22 mm, p = 0.017; right main bronchus, 1.64 mm vs. 1.83 mm, p = 0.021; left main bronchus, 1.61 mm vs. 1.75 mm, p = 0.016). Wall thinning and a decreased perimeter of the central airway at expiration, measured by CT quantification, could serve as new diagnostic indicators of TBM.
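
    The group comparisons reported above are plain two-sample t-tests; a minimal sketch with SciPy, using hypothetical per-subject values whose means echo the abstract (illustrative only, not the study's raw data):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical expiratory tracheal wall perimeters (mm).
        tbm = rng.normal(loc=43.97, scale=4.0, size=19)     # TBM patients
        normal = rng.normal(loc=49.04, scale=4.0, size=38)  # matched normal subjects

        t, p = stats.ttest_ind(tbm, normal)                 # Student t-test, as in the study
        print(f"t = {t:.2f}, p = {p:.3f}")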

  18. Development of hydrate risk quantification in oil and gas production

    Science.gov (United States)

    Chaudhari, Piyush N.

    … order to reduce the parametric study, which may otherwise require a long duration of time, using the Colorado School of Mines Hydrate Kinetic Model (CSMHyK). The evolution of the hydrate plugging risk along flowline-riser systems is modeled for steady-state and transient operations, considering the effect of several critical parameters such as oil-hydrate slip, duration of shut-in, and water droplet size on a subsea tieback system. This research presents a novel platform for quantification of the hydrate plugging risk, which in turn will play an important role in improving and optimizing current hydrate management strategies. The predictive strength of the hydrate risk quantification and hydrate prediction models will have a significant impact on flow assurance engineering and design with respect to building safe and efficient hydrate management techniques for future deep-water developments.

  19. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification … PCR strategy is a highly specific and sensitive way to monitor recombinant DNA in environments like the efflux of a biotechnological plant.

  20. Quantification in Kabiye: a linguistic approach | Pali ...

    African Journals Online (AJOL)

    ... which is denoted by lexical quantifiers. Quantification with specific reference is provided by different types of linguistic units (nouns, numerals, adjectives, adverbs, ideophones and verbs) in arguments/noun phrases and in the predicative phrase in the sense of Chomsky. Keywords: quantification, class, number, reference, ...

  1. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    The in vivo quantification of metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra is the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), have been presented lately with good results, but they show several drawbacks regarding quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerable difficulty under real conditions. Appropriate experiments have proved the efficiency of the introduced methodology on artificial MRS data, establishing it as a generic metabolite quantification procedure.
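
    To make the optimization formulation concrete, here is a toy genetic algorithm in the spirit described above: it fits the amplitudes and positions of two overlapping Lorentzian peaks to a synthetic spectrum (all numbers and the fixed linewidth are illustrative assumptions, not the paper's configuration):

        import numpy as np

        def lorentzian(f, a, f0, w):
            return a * w**2 / ((f - f0)**2 + w**2)

        f = np.linspace(0.0, 10.0, 500)
        target = lorentzian(f, 1.0, 4.8, 0.3) + lorentzian(f, 0.6, 5.2, 0.3)

        def fitness(theta):
            a1, f1, a2, f2 = theta
            model = lorentzian(f, a1, f1, 0.3) + lorentzian(f, a2, f2, 0.3)
            return -np.sum((model - target) ** 2)  # higher is better

        rng = np.random.default_rng(1)
        pop = rng.uniform([0, 4, 0, 4], [2, 6, 2, 6], size=(60, 4))
        for _ in range(200):
            scores = np.array([fitness(p) for p in pop])
            parents = pop[np.argsort(scores)[-20:]]                        # truncation selection
            children = parents[rng.integers(0, 20, 40)] + rng.normal(0, 0.05, (40, 4))
            pop = np.vstack([parents, children])                           # elitism + mutation
        best = pop[np.argmax([fitness(p) for p in pop])]
        print("estimated (a1, f1, a2, f2):", np.round(best, 3))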

  2. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)

    Papakostas, G A; Mertzios, B G; Karras, D A; Van Ormondt, D; Graveron-Demilly, D

    2011-01-01

    The in vivo quantification of metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra is the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), have been presented lately with good results, but they show several drawbacks regarding quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerable difficulty under real conditions. Appropriate experiments have proved the efficiency of the introduced methodology on artificial MRS data, establishing it as a generic metabolite quantification procedure.

  3. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon …

  4. Accurate Digital Polymerase Chain Reaction Quantification of Challenging Samples Applying Inhibitor-Tolerant DNA Polymerases.

    Science.gov (United States)

    Sidstedt, Maja; Romsos, Erica L; Hedell, Ronny; Ansell, Ricky; Steffen, Carolyn R; Vallone, Peter M; Rådström, Peter; Hedman, Johannes

    2017-02-07

    Digital PCR (dPCR) enables absolute quantification of nucleic acids by partitioning the sample into hundreds or thousands of minute reactions. By assuming a Poisson distribution for the number of DNA fragments present in each chamber, the DNA concentration is determined without the need for a standard curve. However, when analyzing nucleic acids from complex matrixes such as soil and blood, dPCR quantification can be biased due to the presence of inhibitory compounds. In this study, we evaluated the impact of varying the DNA polymerase in chamber-based dPCR for both pure and impure samples, using the common PCR inhibitor humic acid (HA) as a model. We compared the TaqMan Universal PCR Master Mix with two alternative DNA polymerases: ExTaq HS and Immolase. Using Bayesian modeling, we show that there is no difference among the tested DNA polymerases in terms of accuracy of absolute quantification for pure template samples, i.e., without HA present. For samples containing HA, there were great differences in performance: the TaqMan Universal PCR Master Mix failed to correctly quantify DNA with more than 13 pg/nL HA, whereas Immolase (1 U) could handle up to 375 pg/nL HA. Furthermore, we found that BSA had a moderate positive effect for the TaqMan Universal PCR Master Mix, enabling accurate quantification at 25 pg/nL HA. Increasing the amount of DNA polymerase from 1 to 5 U had a strong effect for ExTaq HS, raising HA tolerance fourfold. We also show that the average Cq values of positive reactions may be used as a measure of inhibition effects, e.g., to determine whether or not a dPCR quantification result is reliable. The statistical models developed to objectively analyze the data may also be applied in quality control. We conclude that the choice of DNA polymerase in dPCR is crucial for the accuracy of quantification when analyzing challenging samples.
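
    The Poisson step mentioned above is compact enough to state explicitly; a minimal sketch of the generic dPCR calculation (standard arithmetic, not code from this paper; the example counts and partition volume are invented):

        import numpy as np

        def dpcr_concentration(n_positive, n_total, partition_volume_nl):
            """Absolute dPCR quantification: fraction of positive partitions
            -> mean copies per partition via Poisson -> copies per nanolitre."""
            p = n_positive / n_total
            lam = -np.log(1.0 - p)            # copies per partition
            return lam / partition_volume_nl  # copies per nl

        # Example: 312 of 765 chambers positive, 0.91 nl partitions (illustrative).
        print(f"{dpcr_concentration(312, 765, 0.91):.2f} copies/nl")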

  5. Accurate episomal HIV 2-LTR circles quantification using optimized DNA isolation and droplet digital PCR.

    Science.gov (United States)

    Malatinkova, Eva; Kiselinova, Maja; Bonczkowski, Pawel; Trypsteen, Wim; Messiaen, Peter; Vermeire, Jolien; Verhasselt, Bruno; Vervisch, Karen; Vandekerckhove, Linos; De Spiegelaere, Ward

    2014-01-01

    In HIV-infected patients on combination antiretroviral therapy (cART), the detection of episomal HIV 2-LTR circles is a potential marker for ongoing viral replication. Quantification of 2-LTR circles is based on quantitative PCR or, more recently, on digital PCR assessment, but is hampered by their low abundance. Sample pre-PCR processing is a critical step for 2-LTR circle quantification that has not yet been sufficiently evaluated in patient-derived samples. We compared two sample processing procedures to more accurately quantify 2-LTR circles using droplet digital PCR (ddPCR). Episomal HIV 2-LTR circles were isolated either by genomic DNA isolation or by a modified plasmid DNA isolation, to separate the small episomal circular DNA from chromosomal DNA. This was performed in a dilution series of HIV-infected cells and in HIV-1-infected patient-derived samples (n=59). Samples for the plasmid DNA isolation method were spiked with an internal control plasmid. Genomic DNA isolation enables robust 2-LTR circle quantification. However, in the lower ranges of detection, PCR inhibition caused by the high genomic DNA load substantially limits the amount of sample input, and this impacts sensitivity and accuracy. Moreover, total genomic DNA isolation resulted in a lower recovery of 2-LTR templates per isolate, further reducing its sensitivity. The modified plasmid DNA isolation with a spiked reference for normalization was more accurate in these low ranges compared to genomic DNA isolation. A linear correlation of both methods was observed in the dilution series (R2=0.974) and in the patient-derived samples with 2-LTR numbers above 10 copies per million peripheral blood mononuclear cells (PBMCs) (R2=0.671). Furthermore, Bland-Altman analysis revealed an average agreement between the methods within the 27 samples in which 2-LTR circles were detectable with both methods (bias: 0.3875±1.2657 log10). 2-LTR circle quantification in HIV-infected patients proved to be more …

  6. Quantification Methods of Management Skills in Shipping

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2012-04-01

    Full Text Available Romania cannot overcome the financial crisis without business growth, without finding opportunities for economic development, and without attracting investment into the country. Successful managers find ways to overcome situations of uncertainty. The purpose of this paper is to determine the managerial skills developed by the Romanian fluvial shipping company NAVROM (hereinafter CNFR NAVROM SA), compared with ten other major competitors in the same domain, using financial information on these companies during the years 2005-2010. To carry out this work, methods for quantifying managerial skills are applied to CNFR NAVROM SA Galati, Romania, for example the analysis of financial performance management based on profitability ratios, net profit margin, supplier management, and turnover.

  7. Recurrence quantification analysis of global stock markets

    Science.gov (United States)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets is characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
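
    As a concrete illustration of the RQA measures referred to above, here is a minimal sketch computing a recurrence matrix together with the recurrence rate and determinism of a return series (standard textbook definitions, not the authors' code; for brevity the trivially recurrent main diagonal is kept, which RQA practice normally excludes):

        import numpy as np

        def rqa(x, eps, lmin=2):
            # Recurrence matrix: pairs of points closer than eps are recurrent.
            R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
            rr = R.mean()  # recurrence rate
            # Determinism: share of recurrent points on diagonal lines of length >= lmin.
            n, det_points = len(x), 0
            for k in range(-(n - lmin), n - lmin + 1):
                run = 0
                for v in np.append(np.diagonal(R, k), 0):  # sentinel closes the last run
                    if v:
                        run += 1
                    else:
                        if run >= lmin:
                            det_points += run
                        run = 0
            return rr, det_points / max(R.sum(), 1)

        rng = np.random.default_rng(2)
        returns = rng.normal(0.0, 1.0, 400)  # placeholder for index returns
        print(rqa(returns, eps=0.5))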

  8. Recurrence quantification analysis theory and best practices

    CERN Document Server

    Webber Jr., Charles L.; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book comprehensively introduces and showcases recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.

  9. Quantification practices in the nuclear industry

    International Nuclear Information System (INIS)

    1986-01-01

    In this chapter the risk quantification practices adopted by the nuclear industries in Germany, Britain and France are examined as representative of the practices adopted throughout Europe. From this examination a number of conclusions are drawn about the common features of the practices adopted. In making this survey, the views expressed in the report of the Task Force on Safety Goals/Objectives appointed by the Commission of the European Communities are taken into account. For each country considered, the legal requirements for presentation of quantified risk assessment as part of the licensing procedure are examined, followed by the way in which those requirements have been developed for practical application. (author)

  10. Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model

    Science.gov (United States)

    Nikbay, Melike; Heeg, Jennifer

    2017-01-01

    This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel of the NATO Science and Technology Organization, Task Group AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework and implemented reduced-order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.
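
    For orientation, the Polynomial Chaos Expansion used in the reduced-order modeling step represents an output quantity q as a spectral series in the random inputs ξ (the standard textbook form, in our notation):

        q(\boldsymbol{\xi}) \approx \sum_{i=0}^{P} q_i \, \Psi_i(\boldsymbol{\xi})

    where the Ψ_i are polynomials orthogonal with respect to the input distributions and the coefficients q_i are estimated from a modest number of sampled aeroelastic analyses; Proper Orthogonal Decomposition plays the analogous compression role for field outputs.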

  11. Identification and Quantification of Celery Allergens Using Fiber Optic Surface Plasmon Resonance PCR.

    Science.gov (United States)

    Daems, Devin; Peeters, Bernd; Delport, Filip; Remans, Tony; Lammertyn, Jeroen; Spasic, Dragana

    2017-07-31

    Abstract: Accurate identification and quantification of allergens is key in healthcare, biotechnology, and food quality and safety. Celery (Apium graveolens) is one of the most important elicitors of food allergic reactions in Europe. Currently, the gold standards to identify, quantify and discriminate celery in a biological sample are immunoassays and two-step molecular detection assays in which quantitative PCR (qPCR) is followed by a high-resolution melting analysis (HRM). In order to provide a DNA-based, rapid and simple detection method suitable for one-step quantification, a fiber optic PCR melting assay (FO-PCR-MA) was developed to determine different concentrations of celery DNA (1 pM-0.1 fM). The presented method is based on the hybridization and melting of DNA-coated gold nanoparticles on the FO sensor surface in the presence of the target gene (mannitol dehydrogenase, Mtd). The concept was not only able to reveal the presence of celery DNA, but also allowed the cycle-to-cycle quantification of the target sequence through melting analysis. Furthermore, the developed bioassay was benchmarked against qPCR followed by HRM, showing excellent agreement (R² = 0.96). In conclusion, this innovative and sensitive diagnostic test could further improve food quality control and thus have a large impact on allergen-induced healthcare problems.

  12. Identification and Quantification of Celery Allergens Using Fiber Optic Surface Plasmon Resonance PCR

    Directory of Open Access Journals (Sweden)

    Devin Daems

    2017-07-01

    Full Text Available Abstract: Accurate identification and quantification of allergens is key in healthcare, biotechnology, and food quality and safety. Celery (Apium graveolens) is one of the most important elicitors of food allergic reactions in Europe. Currently, the gold standards to identify, quantify and discriminate celery in a biological sample are immunoassays and two-step molecular detection assays in which quantitative PCR (qPCR) is followed by a high-resolution melting analysis (HRM). In order to provide a DNA-based, rapid and simple detection method suitable for one-step quantification, a fiber optic PCR melting assay (FO-PCR-MA) was developed to determine different concentrations of celery DNA (1 pM–0.1 fM). The presented method is based on the hybridization and melting of DNA-coated gold nanoparticles on the FO sensor surface in the presence of the target gene (mannitol dehydrogenase, Mtd). The concept was not only able to reveal the presence of celery DNA, but also allowed the cycle-to-cycle quantification of the target sequence through melting analysis. Furthermore, the developed bioassay was benchmarked against qPCR followed by HRM, showing excellent agreement (R² = 0.96). In conclusion, this innovative and sensitive diagnostic test could further improve food quality control and thus have a large impact on allergen-induced healthcare problems.

  13. Convex geometry of quantum resource quantification

    Science.gov (United States)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, a measure of k-coherence which generalises the …
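
    As one concrete instance of the robustness-type quantifiers unified by this framework, the generalised robustness of a state ρ with respect to a convex set F of free states reads (a standard definition from the resource-theory literature, quoted here for orientation rather than taken from the paper):

        R_{\mathcal{F}}(\rho) = \min \left\{ s \ge 0 \;:\; \exists \text{ state } \sigma, \ \frac{\rho + s\,\sigma}{1+s} \in \mathcal{F} \right\}

    i.e. the least amount of mixing with an arbitrary state that destroys the resource; it is faithful (zero exactly on F) and monotone under the relevant free operations.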

  14. Kinetic quantification of plyometric exercise intensity.

    Science.gov (United States)

    Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J

    2011-12-01

    Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011. Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phase of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects for exercise type and all outcome variables (p ≤ 0.05) and for the interaction between gender and peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program.
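
    One of the kinetic derivations implicit above can be made explicit: under projectile assumptions, jump height follows from flight time alone, a standard force-platform relation (not a formula quoted from this paper):

        JH = \frac{g \, t_f^{2}}{8}, \qquad g \approx 9.81\ \mathrm{m\,s^{-2}}

    where t_f is the airborne time between takeoff and landing; peak power and rate of force development are likewise derived from the measured ground reaction force history.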

  15. Quantification of heterogeneity observed in medical images

    International Nuclear Information System (INIS)

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity

  16. Quantification of heterogeneity observed in medical images.

    Science.gov (United States)

    Brooks, Frank J; Grigsby, Perry W

    2013-03-02

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity.

  17. Serendipity: Global Detection and Quantification of Plant Stress

    Science.gov (United States)

    Schimel, D.; Verma, M.; Drewry, D.

    2016-12-01

    Detecting and quantifying plant stress is a grand challenge for remote sensing, and is important for understanding climate impacts on ecosystems broadly and also for early warning systems supporting food security. The long record from moderate-resolution sensors providing frequent data has allowed using phenology to detect stress in forest and agroecosystems, but this approach can fail or give ambiguous results when stress occurs during later phases of growth and in high leaf area systems. The recent recognition that greenhouse gas satellites such as GOSAT and OCO-2 observe solar-induced fluorescence (SIF) has added a new and complementary tool for the quantification of stress, but algorithms to detect and quantify stress using SIF are in their infancy. Here we report new results showing a more complex response of SIF to stress, obtained by evaluating spaceborne SIF against in situ eddy covariance data. The response observed is as predicted by theory, and shows that SIF, used in conjunction with moderate-resolution remote sensing, can detect and likely quantify stress by indexing the nonlinear part of the SIF-GPP relationship using the photochemical reflectance index and remotely observed light absorption. There are several exciting opportunities on the near horizon for the implementation of SIF, together with synergistic measurements such as PRI and evapotranspiration, suggesting that the next few years will be a golden age for global ecology. Advancing the science and associated algorithms now is essential to fully exploiting the next wave of missions.

  18. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    International Nuclear Information System (INIS)

    Seebauer, Matthias

    2014-01-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was carried out using the VCS SALM methodology complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation in the magnitude of the estimated GHG emissions per ha both between different smallholder farm typologies and between the two accounting tools applied. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reduction and removals, with mitigation benefits ranging between 4 and 6.5 tCO₂ ha⁻¹ yr⁻¹ and differing significantly with the typology of the crop-livestock system, its agricultural practices, and the adoption rate of improved practices. However, the inherent uncertainty related to the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty related to activity data, the assessment confirms the high variability within different farm types as well as between the different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms. (paper)

  19. Uncertainty Quantification Bayesian Framework for Porous Media Flows

    Science.gov (United States)

    Demyanov, V.; Christie, M.; Erbas, D.

    2005-12-01

    Uncertainty quantification is an increasingly important aspect of many areas of applied science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of fluids through subsurface reservoirs is an example of a complex system where accuracy in prediction is needed (e.g. in the oil industry it is essential for financial reasons). Simulation of fluid flow in oil reservoirs is usually carried out using large, commercially written finite difference simulators solving conservation equations that describe the multi-phase flow through the porous reservoir rocks, which is a highly computationally expensive task. This work examines a Bayesian framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed time series data. The framework is flexible for a wide range of general physical/statistical parametric models, which are used to describe the underlying hydro-geological process in its temporal dynamics. The approach is based on exploration of the parameter space and updating of the prior beliefs about the most likely model definitions. Optimization problems for highly parametric physical models usually have multiple solutions, which affect the uncertainty of the predictions made. A stochastic search algorithm (e.g. a genetic algorithm) makes it possible to identify multiple "good enough" models in the parameter space. Furthermore, inference over the generated model ensemble via an MCMC-based algorithm evaluates the posterior probability of the generated models and quantifies the uncertainty of the predictions. A machine learning algorithm, artificial neural networks (ANNs), is used to speed up the identification of regions in parameter space where good matches to observed data can be found. The adaptive nature of ANNs allows different ways of integrating them into the Bayesian framework: as direct time …
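
    A minimal random-walk Metropolis sketch of the MCMC inference step described above, with a toy one-parameter "simulator" and a synthetic datum (all names and values are hypothetical stand-ins for the reservoir model, not the authors' code):

        import numpy as np

        rng = np.random.default_rng(3)
        obs = 2.5  # hypothetical observed datum

        def simulate(theta):
            return theta**2 / 3.0  # stand-in for the flow simulator

        def log_post(theta):
            # Gaussian misfit to the observation plus a wide Gaussian prior.
            return -0.5 * ((simulate(theta) - obs) / 0.1) ** 2 - 0.5 * (theta / 10.0) ** 2

        samples = []
        theta = 1.0
        lp = log_post(theta)
        for _ in range(5000):
            prop = theta + rng.normal(0.0, 0.2)       # random-walk proposal
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis acceptance
                theta, lp = prop, lp_prop
            samples.append(theta)
        print("posterior mean:", np.mean(samples[1000:]))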

  20. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    International Nuclear Information System (INIS)

    Jackson, Christopher B.; Gallati, Sabina; Schaller, André

    2012-01-01

    Highlights: ► Serial qPCR accurately determines the fragmentation state of any given DNA sample. ► Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genomes. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and mostly manifests in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure reliable measurement of abnormal DNA content (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be specified, we compared fresh-frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of the measured decay constants for nuclear DNA (λ_nDNA) and mtDNA (λ_mtDNA), we present an approach to possibly correct measurements in degraded samples in the future. To our knowledge this is the first time the different degradation impact of the two …
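
    The decay-constant extrapolation mentioned above can be stated compactly (our reading of the abstract; the symbols are assumptions): the number of amplifiable templates is modeled as decaying exponentially with amplicon length L,

        N(L) = N_0 \, e^{-\lambda L}

    so fitting the serial qPCR data yields λ_nDNA and λ_mtDNA separately, and the difference between the two decay constants indicates how strongly degradation biases the measured mtDNA/nDNA ratio.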

  1. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Gallati, Sabina, E-mail: sabina.gallati@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Schaller, Andre, E-mail: andre.schaller@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland)

    2012-07-06

    degraded samples in the future. To our knowledge this is the first time the different degradation behaviour of the two genomes has been demonstrated, and the first systematic evaluation of the impact of DNA degradation on quantification of mtDNA copy number.

  2. Risks and environmental impacts of mountain reservoirs for artificial snow production in a context of climate change

    Directory of Open Access Journals (Sweden)

    Stéphanie Gaucherand

    2011-10-01

    Full Text Available Mountain reservoirs are hydraulic structures built in recreational mountain resorts to create a water reserve dedicated mainly to the production of artificial snow. Their high-altitude setting makes them highly specific reservoirs, both subject to and inducing risks and impacts on their human and ecological environment. Cemagref has launched a research project on the safety of mountain reservoirs, from which the present article is drawn. The article aims to establish the current state of the risks related to mountain reservoirs and of their impacts on the environment. It places the development of mountain reservoirs in its societal, social and environmental contexts. It then develops the risks and impacts of mountain reservoirs, focusing the analysis on the specific risks and hazards to which these structures are exposed, and on the environmental impacts related to the construction and management of the reservoirs.

  3. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as the computational simulation codes to which they are applied.
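
    The rearrangement idea, propagating uncertainty inside the lowest-level operations instead of looping over samples, can be illustrated (very loosely, and not as the report's stochastic Galerkin implementation) with an "ensemble scalar" type that carries all samples through each arithmetic operation:

```python
import numpy as np

class Ensemble:
    """A scalar that carries an ensemble of sample values through
    arithmetic, so uncertainty is propagated inside the kernels."""
    def __init__(self, values):
        self.v = np.asarray(values, dtype=float)
    def __add__(self, other):
        return Ensemble(self.v + (other.v if isinstance(other, Ensemble) else other))
    def __mul__(self, other):
        return Ensemble(self.v * (other.v if isinstance(other, Ensemble) else other))

# Uncertain decay rate: 128 samples travel through the time loop together,
# turning many scalar operations into vector operations with contiguous
# memory access, instead of 128 separate runs of the same loop.
kappa = Ensemble(np.random.default_rng(1).lognormal(0.0, 0.2, 128))
u = Ensemble(np.ones(128))
dt = 0.01
for _ in range(100):
    u = u + kappa * u * (-dt)      # forward Euler for u' = -kappa * u
print("mean:", u.v.mean(), "std:", u.v.std())
```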

  4. Application of Fuzzy Comprehensive Evaluation Method in Trust Quantification

    Directory of Open Access Journals (Sweden)

    Shunan Ma

    2011-10-01

    Full Text Available Trust can play an important role in the sharing of resources and information in open network environments. Trust quantification is thus an important issue in dynamic trust management. Considering the fuzziness and uncertainty of trust, in this paper we propose a fuzzy comprehensive evaluation method to quantify trust, along with a trust quantification algorithm. Simulation results show that the proposed algorithm can effectively quantify trust, and that the quantified value of an entity's trust is consistent with the entity's behavior.
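
    Fuzzy comprehensive evaluation combines a factor weight vector with a membership matrix over remark grades, then defuzzifies the result. A minimal sketch with invented factors, weights, and membership values (not the paper's algorithm or parameters):

```python
import numpy as np

# Hypothetical trust factors for an entity and their weights,
# e.g. reliability, honesty, timeliness.
weights = np.array([0.40, 0.35, 0.25])

# Membership matrix R: row i gives factor i's membership degrees in the
# remark grades (high, medium, low trust), obtained in practice from
# fuzzy membership functions over observed behaviour.
R = np.array([[0.6, 0.3, 0.1],
              [0.5, 0.4, 0.1],
              [0.7, 0.2, 0.1]])

# Weighted-average fuzzy operator M(*, +): B = w . R
B = weights @ R
B = B / B.sum()                            # normalise the evaluation vector

grade_scores = np.array([1.0, 0.5, 0.0])   # defuzzify grades to one value
trust = float(B @ grade_scores)
print("evaluation vector:", np.round(B, 3), "quantified trust:", round(trust, 3))
```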

  5. Recommendations for adaptation and validation of commercial kits for biomarker quantification in drug development.

    Science.gov (United States)

    Khan, Masood U; Bowsher, Ronald R; Cameron, Mark; Devanarayan, Viswanath; Keller, Steve; King, Lindsay; Lee, Jean; Morimoto, Alyssa; Rhyne, Paul; Stephen, Laurie; Wu, Yuling; Wyant, Timothy; Lachno, D Richard

    2015-01-01

    Increasingly, commercial immunoassay kits are used to support drug discovery and development. Longitudinally consistent kit performance is crucial, but the degree to which kits and reagents are characterized by manufacturers is not standardized, nor are the approaches by which users adapt them and evaluate their performance through validation prior to use. These factors can negatively impact data quality. This paper offers a systematic approach to the assessment, method adaptation and validation of commercial immunoassay kits for quantification of biomarkers in drug development, expanding upon previous publications and guidance. These recommendations aim to standardize and harmonize user practices, contributing to reliable biomarker data from commercial immunoassays, thus enabling properly informed decisions during drug development.

  6. Quantification of Uncertainties in Integrated Spacecraft System Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...
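
    Non-intrusive polynomial chaos treats the model as a black box: sample the uncertain input, run the model, and regress the outputs onto an orthogonal polynomial basis. A minimal one-dimensional sketch with an invented toy model, since the record gives no implementation details:

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(0)

def model(xi):                 # stand-in for an expensive spacecraft model
    return np.exp(0.3 * xi)    # single uncertain input xi ~ N(0, 1)

order = 4
xi = rng.standard_normal(200)            # non-intrusive: just sample and run
y = model(xi)

V = hermevander(xi, order)               # probabilists' Hermite basis He_k
coef, *_ = np.linalg.lstsq(V, y, rcond=None)

# Orthogonality under the Gaussian weight: E[He_j * He_k] = k! * delta_jk,
# so the PCE mean is c_0 and the variance is sum over k >= 1 of c_k^2 * k!.
var = sum(coef[k] ** 2 * factorial(k) for k in range(1, order + 1))
print(f"PCE mean = {coef[0]:.4f}, variance = {var:.4f}")
print(f"exact mean = {np.exp(0.3**2 / 2):.4f}")   # lognormal moment, for comparison
```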

  7. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  8. Direct quantification of negatively charged functional groups on membrane surfaces

    KAUST Repository

    Tiraferri, Alberto; Elimelech, Menachem

    2012-01-01

    groups at the surface of dense polymeric membranes. Both techniques consist of associating the membrane surface moieties with chemical probes, followed by quantification of the bound probes. Uranyl acetate and toluidine blue O dye, which interact

  9. A Micropillar Compression Methodology for Ductile Damage Quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  10. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens; Hoteit, Ibrahim; Sun, Shuyu

    2015-01-01

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets such as production, seismic, electromagnetic, gravimetric and surface deformation data for improving

  11. A micropillar compression methodology for ductile damage quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  12. The value of serum Hepatitis B surface antigen quantification in ...

    African Journals Online (AJOL)

    The value of serum Hepatitis B surface antigen quantification in determining viral activity in chronic Hepatitis B virus infection. ... of CHB and also higher in hepatitis e antigen positive patients compared to hepatitis e antigen negative patients.

  13. an expansion of the aboveground biomass quantification model for ...

    African Journals Online (AJOL)

    Research Note BECVOL 3: an expansion of the aboveground biomass quantification model for ... African Journal of Range and Forage Science ... encroachment and estimation of food to browser herbivore species, was proposed during 1989.

  14. (1) H-MRS processing parameters affect metabolite quantification

    DEFF Research Database (Denmark)

    Bhogal, Alex A; Schür, Remmelt R; Houtepen, Lotte C

    2017-01-01

    Proton magnetic resonance spectroscopy ((1) H-MRS) can be used to quantify in vivo metabolite levels, such as lactate, γ-aminobutyric acid (GABA) and glutamate (Glu). However, there are considerable analysis choices which can alter the accuracy or precision of (1) H-MRS metabolite quantification... We investigated the influence of model parameters and spectral quantification software on fitted metabolite concentration values. Sixty spectra in 30 individuals (repeated measures) were acquired using a 7-T MRI scanner. Data were processed by four independent research groups with the freedom to choose their own... + NAAG/Cr + PCr and Glu/Cr + PCr, respectively. Metabolite quantification using identical (1) H-MRS data was influenced by processing parameters, basis sets and software choice. Locally preferred processing choices affected metabolite quantification, even when using identical software. Our results...

  15. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Zhang, Min; Yang, Le; Zhao, Huizhong; Zhang, Leijie; Zhong, Zhiyou; Liu, Yanling; Chen, Jianhua

    2009-01-01

    International audience; A quantification model of transient heat conduction was provided to simulate apple fruit temperature distribution in the cooling process. The model was based on the energy variation of the apple fruit at different points. It took into account the heat exchange of a representative elemental volume, metabolic heat and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of apple fruit temperature dis...

  16. Quantification of aortic regurgitation by magnetic resonance velocity mapping

    DEFF Research Database (Denmark)

    Søndergaard, Lise; Lindvig, K; Hildebrandt, P

    1993-01-01

    The use of magnetic resonance (MR) velocity mapping in the quantification of aortic valvular blood flow was examined in 10 patients with angiographically verified aortic regurgitation. MR velocity mapping succeeded in identifying and quantifying the regurgitation in all patients, and the regurgit...

  17. FRANX. Application for analysis and quantification of the APS fire

    International Nuclear Information System (INIS)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-01-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantification and updating of the fire APS (probabilistic safety assessment; it also covers floods and earthquakes). With this application, fire scenarios are quantified in the plant, integrating the tasks performed during the fire APS. This paper describes the main features of the program that allow quantification of a fire APS. (Author)

  18. Tissue quantification for development of pediatric phantom

    International Nuclear Information System (INIS)

    Alves, A.F.F.; Miranda, J.R.A.; Pina, D.R.

    2013-01-01

    The optimization of the risk-benefit ratio is a major concern in pediatric radiology, due to the greater vulnerability of children, compared with adults, to the late somatic and genetic effects of exposure to radiation. In Brazil, head trauma is estimated to account for 18% of deaths in the age group of 1-5 years, and the radiograph is the primary diagnostic test for the detection of skull fracture. Knowing that image quality is essential to ensure the identification of anatomical structures and to minimize errors of diagnostic interpretation, this paper proposes the development and construction of homogeneous skull phantoms for the age group of 1-5 years. The construction of the homogeneous phantoms was performed using the classification and quantification of the tissues present in the skulls of pediatric patients. In this procedure, computational algorithms implemented in Matlab were used to quantify the distinct biological tissues present in the anatomical regions studied, using retrospective CT images. Preliminary data obtained from measurements show that, between the ages of 1 and 5 years, assuming an average anteroposterior diameter of the pediatric skull region of 145.73 ± 2.97 mm, the skull can be represented by 92.34 ± 5.22 mm of Lucite and 1.75 ± 0.21 mm of aluminum plates in a patient-equivalent phantom (PEP) arrangement. After their construction, the phantoms will be used for image and dose optimization in pediatric computed radiography examination protocols

  19. Uncertainty Quantification in High Throughput Screening ...

    Science.gov (United States)

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration-response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration-response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration-response data. Bootstrap resampling determines confidence intervals for
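
    As a rough illustration of the resampling idea, the sketch below bootstraps a potency estimate for a toy concentration-response data set. The Hill model, data, and replicate count are invented for illustration and are not ToxCast's actual pipeline:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def hill(c, top, ac50):                     # simple Hill model, slope fixed at 1
    return top * c / (ac50 + c)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
resp = hill(conc, 80.0, 5.0) + rng.normal(0.0, 6.0, conc.size)  # noisy assay data

# Bootstrap: resample concentration-response points with replacement,
# refit the curve, and collect the potency (AC50) estimates.
ac50s = []
for _ in range(1000):
    idx = rng.integers(0, conc.size, conc.size)
    try:
        p, _ = curve_fit(hill, conc[idx], resp[idx], p0=[100.0, 10.0], maxfev=2000)
        ac50s.append(p[1])
    except RuntimeError:
        continue                             # a resample may fail to converge

lo, hi = np.percentile(ac50s, [2.5, 97.5])
print(f"AC50 95% bootstrap interval: [{lo:.2f}, {hi:.2f}]")
```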

  20. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  1. Quantification of the vocal folds’ dynamic displacements

    International Nuclear Information System (INIS)

    Hernández-Montes, María del Socorro; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-01-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ∼100–1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues. (paper)

  2. Tentacle: distributed quantification of genes in metagenomes.

    Science.gov (United States)

    Boulund, Fredrik; Sjögren, Anders; Kristiansson, Erik

    2015-01-01

    In metagenomics, microbial communities are sequenced at increasingly high resolution, generating datasets with billions of DNA fragments. Novel methods that can efficiently process the growing volumes of sequence data are necessary for the accurate analysis and interpretation of existing and upcoming metagenomes. Here we present Tentacle, which is a novel framework that uses distributed computational resources for gene quantification in metagenomes. Tentacle is implemented using a dynamic master-worker approach in which DNA fragments are streamed via a network and processed in parallel on worker nodes. Tentacle is modular, extensible, and comes with support for six commonly used sequence aligners. It is easy to adapt Tentacle to different applications in metagenomics and easy to integrate into existing workflows. Evaluations show that Tentacle scales very well with increasing computing resources. We illustrate the versatility of Tentacle on three different use cases. Tentacle is written for Linux in Python 2.7 and is published as open source under the GNU General Public License (v3). Documentation, tutorials, installation instructions, and the source code are freely available online at: http://bioinformatics.math.chalmers.se/tentacle.

  3. Cross recurrence quantification for cover song identification

    Energy Technology Data Exchange (ETDEWEB)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G [Department of Information and Communication Technologies, Universitat Pompeu Fabra, Roc Boronat 138, 08018 Barcelona (Spain)], E-mail: joan.serraj@upf.edu

    2009-09-15

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.
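
    A cross recurrence plot is a thresholded distance matrix between the delay-embedded state spaces of two signals; the quantification measure then scores (possibly curved) traces in that matrix. A minimal sketch with toy signals, where the embedding parameters and threshold are invented rather than taken from the paper:

```python
import numpy as np

def embed(x, dim=3, tau=2):
    """Delay embedding of a 1-D series into state-space vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def cross_recurrence(x, y, eps):
    """CRP[i, j] = 1 where embedded states of x and y are closer than eps."""
    X, Y = embed(x), embed(y)
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return (d < eps).astype(int)

t = np.linspace(0.0, 8.0 * np.pi, 400)
song = np.sin(t)
cover = np.sin(1.02 * t + 0.3)        # a slightly time-warped "rendition"
crp = cross_recurrence(song, cover, eps=0.3)
# Curved diagonal traces in crp indicate matching, time-warped dynamics;
# a quantification measure tracks these traces to score similarity.
print("recurrence rate:", crp.mean())
```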

  4. Verification Validation and Uncertainty Quantification for CGS

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kamm, James R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    The overall conduct of verification, validation and uncertainty quantification (VVUQ) is discussed through the construction of a workflow relevant to computational modeling including the turbulence problem in the coarse grained simulation (CGS) approach. The workflow contained herein is defined at a high level and constitutes an overview of the activity. Nonetheless, the workflow represents an essential activity in predictive simulation and modeling. VVUQ is complex and necessarily hierarchical in nature. The particular characteristics of VVUQ elements depend upon where the VVUQ activity takes place in the overall hierarchy of physics and models. In this chapter, we focus on the differences between and interplay among validation, calibration and UQ, as well as the difference between UQ and sensitivity analysis. The discussion in this chapter is at a relatively high level and attempts to explain the key issues associated with the overall conduct of VVUQ. The intention is that computational physicists can refer to this chapter for guidance regarding how VVUQ analyses fit into their efforts toward conducting predictive calculations.

  5. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
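
    The core computation is Bayes' rule followed by the binary entropy of the disease probability, evaluated over a range of pre-test estimates rather than a single point. A minimal sketch with invented sensitivity, specificity, and priors:

```python
import numpy as np

def entropy(p):
    """Diagnostic uncertainty in bits for disease probability p."""
    q = np.clip(p, 1e-12, 1 - 1e-12)
    return -(q * np.log2(q) + (1 - q) * np.log2(1 - q))

def post_test(p, sens, spec, positive=True):
    """Bayes' rule: update the disease probability after a test result."""
    if positive:
        return sens * p / (sens * p + (1 - spec) * (1 - p))
    return (1 - sens) * p / ((1 - sens) * p + spec * (1 - p))

sens, spec = 0.90, 0.85
for p_pre in (0.05, 0.30, 0.70):      # a range instead of a point estimate
    p_post = post_test(p_pre, sens, spec, positive=True)
    print(f"pre={p_pre:.2f} H={entropy(p_pre):.2f} bits -> "
          f"post={p_post:.2f} H={entropy(p_post):.2f} bits")
```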

  6. [Quantification of acetabular coverage in normal adult].

    Science.gov (United States)

    Lin, R M; Yang, C Y; Yu, C Y; Yang, C R; Chang, G L; Chou, Y L

    1991-03-01

    Quantification of acetabular coverage is important and can be expressed by superimposition of cartilage tracings on the maximum cross-sectional area of the femoral head. A practical Autolisp program on PC AutoCAD has been developed by us to quantify the acetabular coverage through numerical expression of computed tomography images. Thirty adults (60 hips) with normal center-edge angle and acetabular index on plain X-ray were randomly selected for serial sections. These slices were prepared with a fixed coordinate system and in continuous sections of 5 mm in thickness. The contours of the cartilage of each section were digitized on a PC and processed by AutoCAD programs to quantify and characterize the acetabular coverage of normal and dysplastic adult hips. We found that a total coverage ratio of greater than 80%, an anterior coverage ratio of greater than 75% and a posterior coverage ratio of greater than 80% can be categorized as a normal group. Polar edge distance is a good indicator for the evaluation of preoperative and postoperative coverage conditions. For standardization and evaluation of acetabular coverage, the most suitable parameters are the total coverage ratio, anterior coverage ratio, posterior coverage ratio and polar edge distance. However, medial and lateral coverage ratios are indispensable in cases of dysplastic hip because variations between them are so great that acetabuloplasty may be impossible. This program can also be used to classify precisely the type of dysplastic hip.

  7. On uncertainty quantification in hydrogeology and hydrogeophysics

    Science.gov (United States)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.

  8. Quantification of the vocal folds’ dynamic displacements

    Science.gov (United States)

    del Socorro Hernández-Montes, María; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-05-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ~100-1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues.

  9. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost which quickly becomes intractable with the current explosion of data sizes. In this work we reduce this complexity to quadratic with the synergy of two algorithms that gracefully complement each other and lead to a radically different approach. First, we turned to stochastic estimation of the diagonal. This allowed us to cast the problem as a linear system with a relatively small number of multiple right hand sides. Second, for this linear system we developed a novel, mixed precision, iterative refinement scheme, which uses iterative solvers instead of matrix factorizations. We demonstrate that the new framework not only achieves the much needed quadratic cost but in addition offers excellent opportunities for scaling at massively parallel environments. We based our implementation on BLAS 3 kernels that ensure very high processor performance. We achieved a peak performance of 730 TFlops on 72 BG/P racks, with a sustained performance 73% of theoretical peak. We stress that the techniques presented in this work are quite general and applicable to several other important applications. Copyright © 2009 ACM.
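
    The two ingredients named above, stochastic estimation of the diagonal plus iterative solves in place of matrix factorizations, can be sketched as follows. The matrix is a small stand-in and the solver is plain conjugate gradients rather than the paper's mixed-precision iterative refinement scheme:

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
n = 200
B = rng.standard_normal((n, n))
A = B @ B.T / n + np.eye(n)          # stand-in SPD "covariance" matrix

# Stochastic estimator of diag(A^-1): with Rademacher probes v, the
# elementwise product v * (A^-1 v) has the diagonal as its expectation,
# and each solve is a linear system handled by an iterative method.
num = np.zeros(n)
den = np.zeros(n)
for _ in range(200):
    v = rng.choice([-1.0, 1.0], size=n)
    x, _ = cg(A, v)                   # x ~ A^-1 v, factorization-free
    num += v * x
    den += v * v
estimate = num / den

exact = np.diag(np.linalg.inv(A))     # only affordable for this toy size
print("median relative error:", np.median(np.abs(estimate - exact) / exact))
```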

  10. Standardless quantification methods in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Trincavelli, Jorge, E-mail: trincavelli@famaf.unc.edu.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Limandri, Silvina, E-mail: s.limandri@conicet.gov.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Bonetto, Rita, E-mail: bonetto@quimica.unlp.edu.ar [Centro de Investigación y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Facultad de Ciencias Exactas, de la Universidad Nacional de La Plata, Calle 47 N° 257, 1900 La Plata (Argentina)

    2014-11-01

    The elemental composition of a solid sample can be determined by electron probe microanalysis with or without the use of standards. The standardless algorithms are considerably faster than the methods that require standards; they are useful when a suitable set of standards is not available or for rough samples, and they also help to solve the problem of current variation, for example, in equipment with a cold field-emission gun. Due to significant advances in the accuracy achieved during the last years, a product of the successive efforts made to improve the description of the generation, absorption and detection of X-rays, the standardless methods have increasingly become an interesting option for the user. Nevertheless, up to now, algorithms that use standards are still more precise than standardless methods. It is important to remark that care must be taken with results provided by standardless methods that normalize the calculated concentration values to 100%, unless an estimate of the errors is reported. In this work, a comprehensive discussion of the key features of the main standardless quantification methods, as well as the level of accuracy achieved by them, is presented. - Highlights: • Standardless methods are a good alternative when no suitable standards are available. • Their accuracy reaches 10% for 95% of the analyses when traces are excluded. • Some of them are suitable for the analysis of rough samples.

  11. Quantification of variability in trichome patterns

    Directory of Open Access Journals (Sweden)

    Bettina Greese

    2014-11-01

    Full Text Available While pattern formation is studied in various areas of biology, little is known about the intrinsic noise leading to variations between individual realizations of the pattern. One prominent example of de novo pattern formation in plants is the patterning of trichomes on Arabidopsis leaves, which involves genetic regulation and cell-to-cell communication. These processes are potentially variable due to, e.g., the abundance of cell components or environmental conditions. To deepen the understanding of the regulatory processes underlying pattern formation, it is crucial to quantitatively analyze the variability in naturally occurring patterns. Here, we review recent approaches towards the characterization of noise in trichome initiation. We present methods for the quantification of spatial patterns, which are the basis for data-driven mathematical modeling and enable the analysis of noise from different sources. Besides the insight gained on trichome formation, the examination of observed trichome patterns also shows that highly regulated biological processes can be substantially affected by variability.

  12. Cross recurrence quantification for cover song identification

    International Nuclear Information System (INIS)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G

    2009-01-01

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.

  13. Quality Quantification of Evaluated Cross Section Covariances

    International Nuclear Information System (INIS)

    Varet, S.; Dossantos-Uzarralde, P.; Vayatis, N.

    2015-01-01

    Presently, several methods are used to estimate the covariance matrix of evaluated nuclear cross sections. Because the resulting covariance matrices can differ according to the method used and to its assumptions, we propose a general and objective approach to quantify the quality of the covariance estimation for evaluated cross sections. The first step consists in defining an objective criterion. The second step is the computation of the criterion. In this paper the Kullback-Leibler distance is proposed for quantifying the quality of a covariance matrix estimate and of its inverse. It is based on the distance to the true covariance matrix. A method based on the bootstrap is presented for the estimation of this criterion, which can be applied with most methods for covariance matrix estimation and without knowledge of the true covariance matrix. The full approach is illustrated on evaluations of the 85Rb nucleus, and the results are then used for a discussion of scoring and Monte Carlo approaches to covariance matrix estimation in cross-section evaluations
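
    For zero-mean Gaussians the Kullback-Leibler distance between an estimated and a "true" covariance matrix has a closed form. A small sketch of the criterion itself (the matrices are invented, and the paper's bootstrap estimation of the criterion is not reproduced):

```python
import numpy as np

def kl_gaussian(S1, S2):
    """KL distance between zero-mean Gaussians with covariances S1, S2:
    0.5 * (tr(S2^-1 S1) - n + ln det S2 - ln det S1)."""
    n = S1.shape[0]
    S2inv = np.linalg.inv(S2)
    _, ld1 = np.linalg.slogdet(S1)
    _, ld2 = np.linalg.slogdet(S2)
    return 0.5 * (np.trace(S2inv @ S1) - n + ld2 - ld1)

true_cov = np.array([[1.0, 0.3],
                     [0.3, 0.5]])
# Two hypothetical estimates of the same cross-section covariance:
est_a = true_cov + 0.05 * np.eye(2)          # slightly inflated variances
est_b = np.diag(np.diag(true_cov))           # correlations dropped entirely
print("KL(true, est_a):", kl_gaussian(true_cov, est_a))
print("KL(true, est_b):", kl_gaussian(true_cov, est_b))
```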

  14. Quantification of nanowire uptake by live cells

    KAUST Repository

    Margineanu, Michael B.

    2015-05-01

    Nanostructures fabricated by different methods have become increasingly important for various applications at the cellular level. In order to understand how these nanostructures “behave” and for studying their internalization kinetics, several attempts have been made at tagging and investigating their interaction with living cells. In this study, magnetic iron nanowires with an iron oxide layer are coated with (3-Aminopropyl)triethoxysilane (APTES), and subsequently labeled with a fluorogenic pH-dependent dye pHrodo™ Red, covalently bound to the aminosilane surface. Time-lapse live imaging of human colon carcinoma HCT 116 cells interacting with the labeled iron nanowires is performed for 24 hours. As the pHrodo™ Red conjugated nanowires are non-fluorescent outside the cells but fluoresce brightly inside, internalized nanowires are distinguished from non-internalized ones and their behavior inside the cells can be tracked for the respective time length. A machine learning-based computational framework dedicated to automatic analysis of live cell imaging data, Cell Cognition, is adapted and used to classify cells with internalized and non-internalized nanowires and subsequently determine the uptake percentage by cells at different time points. An uptake of 85 % by HCT 116 cells is observed after 24 hours incubation at NW-to-cell ratios of 200. While the approach of using pHrodo™ Red for internalization studies is not novel in the literature, this study reports for the first time the utilization of a machine-learning based time-resolved automatic analysis pipeline for quantification of nanowire uptake by cells. This pipeline has also been used for comparison studies with nickel nanowires coated with APTES and labeled with pHrodo™ Red, and another cell line derived from the cervix carcinoma, HeLa. It has thus the potential to be used for studying the interaction of different types of nanostructures with potentially any live cell types.

  15. Quantification of water in hydrous ringwoodite

    Directory of Open Access Journals (Sweden)

    Sylvia-Monique Thomas

    2015-01-01

    Full Text Available Ringwoodite, γ-(Mg,Fe)2SiO4, in the lower 150 km of Earth's mantle transition zone (410-660 km depth) can incorporate up to 1.5-2 wt% H2O as hydroxyl defects. We present a mineral-specific IR calibration for the absolute water content in hydrous ringwoodite by combining results from Raman spectroscopy, secondary ion mass spectrometry (SIMS) and proton-proton (pp) scattering on a suite of synthetic Mg- and Fe-bearing hydrous ringwoodites. H2O concentrations in the crystals studied here range from 0.46 to 1.7 wt% H2O (absolute methods), with the maximum H2O in the same sample giving 2.5 wt% by SIMS calibration. Anchoring our spectroscopic results to absolute H-atom concentrations from pp-scattering measurements, we report frequency-dependent integrated IR-absorption coefficients for water in ringwoodite ranging from 78180 to 158880 L mol-1 cm-2, depending upon the frequency of the OH absorption. We further report a linear wavenumber IR calibration for H2O quantification in hydrous ringwoodite across the Mg2SiO4-Fe2SiO4 solid solution, which will lead to more accurate estimations of the water content in both laboratory-grown and naturally occurring ringwoodites. Re-evaluation of the IR spectrum of a natural hydrous ringwoodite inclusion in diamond from the study of Pearson et al. (2014) indicates that the crystal contains 1.43 ± 0.27 wt% H2O, thus confirming near-maximum amounts of H2O for this sample from the transition zone.
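
    Such integrated absorption coefficients feed into ordinary Beer-Lambert arithmetic. The sketch below shows only that last step: the integrated absorbance and thickness are invented, the coefficient is merely chosen inside the range quoted above, and the density is an approximate literature value for ringwoodite, not from this record:

```python
# Beer-Lambert with an integrated molar absorption coefficient:
#   c [mol H2O / L] = A_int / (eps_int * d)
A_int = 1500.0       # integrated OH absorbance, cm^-2 (hypothetical spectrum)
d = 0.0050           # sample thickness, cm (50 micrometres, assumed)
eps_int = 100000.0   # L mol^-1 cm^-2, within the reported 78180-158880 range

c = A_int / (eps_int * d)            # molar concentration of H2O
rho = 3.90                           # g/cm^3, approximate ringwoodite density
wt_percent = 100.0 * c * 18.015 / (rho * 1000.0)   # mol/L -> g/L -> wt%
print(f"{wt_percent:.2f} wt% H2O")
```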

  16. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  17. Rapid quantification and sex determination of forensic evidence materials.

    Science.gov (United States)

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
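
    An externally standardized kinetic analysis amounts to fitting a standard curve of Ct against log copy number and inverting it for unknowns. A minimal sketch with invented standards and Ct values (not the published assay's calibration data):

```python
import numpy as np

# Hypothetical standard curve: Ct values for known input copy numbers.
copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
ct_std = np.array([35.1, 31.8, 28.4, 25.0, 21.7])

# Ct is linear in log10(copies): Ct = m * log10(N0) + b
m, b = np.polyfit(np.log10(copies), ct_std, 1)
efficiency = 10 ** (-1.0 / m) - 1.0      # amplification efficiency from slope
print(f"slope = {m:.2f}, efficiency = {efficiency:.1%}")

def quantify(ct):
    """Invert the standard curve to estimate the input copy number."""
    return 10 ** ((ct - b) / m)

print(f"casework sample at Ct = 33.0 -> {quantify(33.0):.1f} copies")
```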

  18. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.

  19. On the complex quantification of risk: systems-based perspective on terrorism.

    Science.gov (United States)

    Haimes, Yacov Y

    2011-08-01

    This article highlights the complexity of the quantification of the multidimensional risk function, develops five systems-based premises on quantifying the risk of terrorism to a threatened system, and advocates the quantification of vulnerability and resilience through the states of the system. The five premises are: (i) There exists interdependence between a specific threat to a system by terrorist networks and the states of the targeted system, as represented through the system's vulnerability, resilience, and criticality-impact. (ii) A specific threat, its probability, its timing, the states of the targeted system, and the probability of consequences can be interdependent. (iii) The two questions in the risk assessment process: "What is the likelihood?" and "What are the consequences?" can be interdependent. (iv) Risk management policy options can reduce both the likelihood of a threat to a targeted system and the associated likelihood of consequences by changing the states (including both vulnerability and resilience) of the system. (v) The quantification of risk to a vulnerable system from a specific threat must be built on a systemic and repeatable modeling process, by recognizing that the states of the system constitute an essential step to construct quantitative metrics of the consequences based on intelligence gathering, expert evidence, and other qualitative information. The fact that the states of all systems are functions of time (among other variables) makes the time frame pivotal in each component of the process of risk assessment, management, and communication. Thus, risk to a system, caused by an initiating event (e.g., a threat) is a multidimensional function of the specific threat, its probability and time frame, the states of the system (representing vulnerability and resilience), and the probabilistic multidimensional consequences. © 2011 Society for Risk Analysis.

  20. Large differences in land use emission quantifications implied by definition discrepancies

    Science.gov (United States)

    Stocker, B. D.; Joos, F.

    2015-03-01

    The quantification of CO2 emissions from anthropogenic land use and land use change (eLUC) is essential to understand the drivers of the atmospheric CO2 increase and to inform climate change mitigation policy. Reported values in synthesis reports are commonly derived from different approaches (observation-driven bookkeeping and process-modelling) but recent work has emphasized that inconsistencies between methods may imply substantial differences in eLUC estimates. However, a consistent quantification is lacking and no concise modelling protocol for the separation of primary and secondary components of eLUC has been established. Here, we review the conceptual differences of eLUC quantification methods and apply an Earth System Model to demonstrate that what is claimed to represent total eLUC differs by up to ~20% when quantified from ESM vs. offline vegetation models. Under a future business-as-usual scenario, differences tend to increase further due to slowing land conversion rates and an increasing impact of altered environmental conditions on land-atmosphere fluxes. We establish how coupled Earth System Models may be applied to separate component fluxes of eLUC arising from the replacement of potential C sinks/sources and the land use feedback and show that secondary fluxes derived from offline vegetation models are conceptually and quantitatively not identical to either, nor their sum. Therefore, we argue that synthesis studies and global carbon budget accountings should resort to the "least common denominator" of different methods, following the bookkeeping approach where only primary land use emissions are quantified under the assumption of constant environmental boundary conditions.

  1. Iron overload in the liver diagnostic and quantification

    International Nuclear Information System (INIS)

    Alustiza, Jose M.; Castiella, Agustin; Juan, Maria D. de; Emparanza, Jose I.; Artetxe, Jose; Uranga, Maite

    2007-01-01

    Hereditary Hemochromatosis is the most frequent modality of iron overload. Since 1996 genetic tests have facilitated significantly the non-invasive diagnosis of the disease. There are however many cases of negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not yet been reached regarding the technique or the possibility of reproducing the same calculation method on different machines. This article reviews the state of the art on the question and outlines possible future lines of work to standardise this non-invasive method of hepatic iron quantification

  2. A highly sensitive method for quantification of iohexol

    DEFF Research Database (Denmark)

    Schulz, A.; Boeringer, F.; Swifka, J.

    2014-01-01

    -chromatography-electrospray-mass spectrometry (LC-ESI-MS) approach using the multiple reaction monitoring mode for iohexol quantification. In order to test whether a significantly decreased amount of iohexol is sufficient for reliable quantification, a LC-ESI-MS approach was assessed. We analyzed the kinetics of iohexol in rats after application of different amounts of iohexol (15 mg to 150 µg per rat). Blood sampling was conducted at four time points, at 15, 30, 60, and 90 min after iohexol injection. The analyte (iohexol) and the internal standard (iothalamic acid) were separated from serum proteins using a centrifugal filtration device with a cut-off of 3 kDa. The chromatographic separation was achieved on an analytical Zorbax SB C18 column. The detection and quantification were performed on a high-capacity trap mass spectrometer using positive ion ESI in the multiple reaction monitoring (MRM) mode. Furthermore, using real-time polymerase

  3. Iron overload in the liver diagnostic and quantification

    Energy Technology Data Exchange (ETDEWEB)

    Alustiza, Jose M. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)]. E-mail: jmalustiza@osatek.es; Castiella, Agustin [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Juan, Maria D. de [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Emparanza, Jose I. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Artetxe, Jose [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Uranga, Maite [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)

    2007-03-15

    Hereditary Hemochromatosis is the most frequent modality of iron overload. Since 1996 genetic tests have facilitated significantly the non-invasive diagnosis of the disease. There are however many cases of negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not yet been reached regarding the technique or the possibility of reproducing the same calculation method on different machines. This article reviews the state of the art on the question and outlines possible future lines of work to standardise this non-invasive method of hepatic iron quantification.

  4. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T

    1999-01-01

    Mast cells participate in both the acute allergic reaction as well as in chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human synovium using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of, in this case, human synovium. In 10 patients suffering from osteoarthritis a median of 3.6 mast cells/mm2 synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm2 (median). The mast cells constituted 0.8% of all the cell profiles...

  5. Superlattice band structure: New and simple energy quantification condition

    Energy Technology Data Exchange (ETDEWEB)

    Maiz, F., E-mail: fethimaiz@gmail.com [University of Cartage, Nabeul Engineering Preparatory Institute, Merazka, 8000 Nabeul (Tunisia); King Khalid University, Faculty of Science, Physics Department, P.O. Box 9004, Abha 61413 (Saudi Arabia)

    2014-10-01

    Assuming an approximated effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure for periodic semiconducting heterostructures. Our method consists in deriving and solving the energy quantification condition (EQC); this is a simple real equation, composed of trigonometric and hyperbolic functions, that requires no programming effort or sophisticated machine to solve. For heterostructures with fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of the GaAs/Ga{sub 0.5}Al{sub 0.5}As heterostructure, and to build its subband point by point for 4 and 20 wells. Our findings show good agreement with previously published results.

  6. Recent Developments in Quantification Methods for Metallothionein

    Czech Academy of Sciences Publication Activity Database

    Dabrio, M.; Rodriquez, A. R.; Bordin, G.; Bebiano, M. J.; De Ley, M.; Šestáková, Ivana; Vašák, M.; Nordberg, M.

    2002-01-01

    Roč. 88, č. 2 (2002), s. 123-134 ISSN 0162-0134 R&D Projects: GA MŠk OC D21.002; GA MŠk OC D8.10 Institutional research plan: CEZ:AV0Z4040901 Keywords : electrochemistry * metallothionein * mass spectrometry Subject RIV: CG - Electrochemistry Impact factor: 2.204, year: 2002

  7. Molecular quantification of environmental DNA using microfluidics and digital PCR.

    Science.gov (United States)

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2012-09-01

    Real-time PCR has been widely used to evaluate gene abundance in natural microbial habitats. However, PCR-inhibitory substances often reduce the efficiency of PCR, leading to the underestimation of target gene copy numbers. Digital PCR using microfluidics is a new approach that allows absolute quantification of DNA molecules. In this study, digital PCR was applied to environmental samples, and the effect of PCR inhibitors on DNA quantification was tested. In the control experiment using λ DNA and humic acids, underestimation of λ DNA at 1/4400 of the theoretical value was observed with 6.58 ng μL⁻¹ humic acids. In contrast, digital PCR provided accurate quantification data with a concentration of humic acids up to 9.34 ng μL⁻¹. The inhibitory effect of paddy field soil extract on quantification of the archaeal 16S rRNA gene was also tested. By diluting the DNA extract, quantified copy numbers from real-time PCR and digital PCR became similar, indicating that dilution was a useful way to remedy PCR inhibition. The dilution strategy was, however, not applicable to all natural environmental samples. For example, when marine subsurface sediment samples were tested, the copy number of archaeal 16S rRNA genes was 1.04×10³ copies/g-sediment by digital PCR, whereas real-time PCR only resulted in 4.64×10² copies/g-sediment, which was most likely due to an inhibitory effect. The data from this study demonstrated that inhibitory substances had little effect on DNA quantification using microfluidics and digital PCR, and showed the great advantages of digital PCR in accurate quantifications of DNA extracted from various microbial habitats. Copyright © 2012 Elsevier GmbH. All rights reserved.
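
    The absolute quantification that digital PCR provides rests on a Poisson correction: each chamber receives a random number of template molecules, so the observed fraction of positive chambers is inverted to a mean copy number per chamber. A minimal sketch of that calculation (the partition counts and chamber volume below are invented for illustration, not taken from the study):

```python
import math

def digital_pcr(positive, total, partition_volume_ul):
    """Absolute quantification from a digital PCR readout.

    Each partition receives k ~ Poisson(lam) template molecules, so the
    negative fraction is exp(-lam); inverting gives lam from the data."""
    p = positive / total                   # fraction of positive partitions
    lam = -math.log(1.0 - p)               # mean copies per partition
    copies_loaded = lam * total
    concentration = lam / partition_volume_ul
    return copies_loaded, concentration

# Invented example: 312 of 765 chambers positive, 6 nL (0.006 uL) each
copies, conc = digital_pcr(312, 765, partition_volume_ul=0.006)
print(f"{copies:.0f} copies loaded, {conc:.0f} copies/uL")
```

    Because only the positive/negative status of each partition enters the estimate, partial inhibition that merely weakens amplification does not bias the result, which is consistent with the robustness to humic acids reported above.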

  8. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-01-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement

  9. 2D histomorphometric quantification from 3D computerized tomography

    International Nuclear Information System (INIS)

    Lima, Inaya; Oliveira, Luis Fernando de; Lopes, Ricardo T.; Jesus, Edgar Francisco O. de; Alves, Jose Marcos

    2002-01-01

    In the present article, preliminary results are presented showing the application of the three-dimensional computerized microtomography technique (3D-μCT) to bone tissue characterization, through histomorphometric quantification based on stereologic concepts. Two samples of human bone were prepared and submitted to the tomographic system, a radiographic system with a microfocus X-ray tube. Through the three processing steps of acquisition, reconstruction and quantification, it was possible to obtain good results, consistent with literature data. The next step is to compare these results with those obtained by the conventional method, i.e., conventional histomorphometry. (author)

  10. Quantification and Negation in Event Semantics

    Directory of Open Access Journals (Sweden)

    Lucas Champollion

    2010-12-01

    Full Text Available Recently, it has been claimed that event semantics does not go well together with quantification, especially if one rejects syntactic, LF-based approaches to quantifier scope. This paper shows that such fears are unfounded, by presenting a simple, variable-free framework which combines a Neo-Davidsonian event semantics with a type-shifting based account of quantifier scope. The main innovation is that the event variable is bound inside the verbal denotation, rather than at sentence level by existential closure. Quantifiers can then be interpreted in situ. The resulting framework combines the strengths of event semantics and type-shifting accounts of quantifiers and thus does not force the semanticist to posit either a default underlying word order or a syntactic LF-style level. It is therefore well suited for applications to languages where word order is free and quantifier scope is determined by surface order. As an additional benefit, the system leads to a straightforward account of negation, which has also been claimed to be problematic for event-based frameworks.

  11. Stimulation and quantification of Babesia divergens gametocytogenesis

    Czech Academy of Sciences Publication Activity Database

    Jalovecká, Marie; Bonsergent, C.; Hajdušek, Ondřej; Kopáček, Petr; Malandrin, L.

    2016-01-01

    Roč. 9, č. 1 (2016), č. článku 439. ISSN 1756-3305 R&D Projects: GA ČR GA13-11043S; GA ČR GP13-27630P; GA ČR GJ15-12006Y Institutional support: RVO:60077344 Keywords : Babesia divergens * gametocytes * transmission * bdccp genes * qRT-PCR Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 3.080, year: 2016

  12. HPLC for simultaneous quantification of total ceramide, glucosylceramide, and ceramide trihexoside concentrations in plasma

    NARCIS (Netherlands)

    Groener, Johanna E. M.; Poorthuis, Ben J. H. M.; Kuiper, Sijmen; Helmond, Mariette T. J.; Hollak, Carla E. M.; Aerts, Johannes M. F. G.

    2007-01-01

    BACKGROUND: Simple, reproducible assays are needed for the quantification of sphingolipids, ceramide (Cer), and sphingoid bases. We developed an HPLC method for simultaneous quantification of total plasma concentrations of Cer, glucosylceramide (GlcCer), and ceramide trihexoside (CTH). METHODS:

  13. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    Science.gov (United States)

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...

  14. Two-Phase Microfluidic Systems for High Throughput Quantification of Agglutination Assays

    KAUST Repository

    Castro, David

    2018-01-01

    assay, with a minimum detection limit of 50 ng/mL using optical image analysis. We compare optical image analysis and light scattering as quantification methods, and demonstrate the first light scattering quantification of agglutination assays in a two

  15. Quantification of Gravel Rural Road Sediment Production

    Science.gov (United States)

    Silliman, B. A.; Myers Toman, E.

    2014-12-01

    Unbound rural roads are thought to be one of the largest anthropogenic sources of sediment reaching stream channels in small watersheds. This sediment deposition can reduce water quality in streams, negatively affecting aquatic habitat as well as municipal drinking water sources. These roads are expected to see increased construction and use in southeast Ohio due to the expansion of shale gas development in the region. This study set out to quantify the amount of sediment these rural roads can produce. A controlled rain event of 12.7 millimeters over a half-hour period was used to drive sediment production on 0.03-kilometer sections of gravel rural road. The 8 segments varied in many characteristics and produced from 2.0 to 8.4 kilograms of sediment per 0.03 kilometers of road, with an average production of 5.5 kilograms across the 8 segments. Sediment production was not strongly correlated with road segment slope, but traffic was found to increase sediment production by a factor of 1.1 to 3.9. These results will help inform watershed-scale sediment budgeting and best management practices for road maintenance and construction. This study also adds to the understanding of the impacts of rural road use and construction associated with the changing land use from agricultural to natural gas extraction.

  16. Reliable quantification of phthalates in environmental matrices (air, water, sludge, sediment and soil): a review.

    Science.gov (United States)

    Net, Sopheak; Delmont, Anne; Sempéré, Richard; Paluselli, Andrea; Ouddane, Baghdad

    2015-05-15

    Because of their widespread application, phthalates or phthalic acid esters (PAEs) are ubiquitous in the environment. Their presence has attracted considerable attention due to their potential impacts on ecosystem functioning and on public health, so their quantification has become a necessity. Various extraction procedures as well as gas/liquid chromatography and mass spectrometry detection techniques have been found suitable for reliable detection of such compounds. However, PAEs are also ubiquitous in the laboratory environment, including ambient air, reagents, sampling equipment and various analytical devices, which makes the analysis of real samples with a low PAE background difficult. Accurate PAE analysis in environmental matrices is therefore a challenging task. This paper reviews the extensive literature on techniques for PAE quantification in natural media. Sampling, sample extraction/pretreatment and detection methods for quantifying PAEs in different environmental matrices (air, water, sludge, sediment and soil) are reviewed and compared. The concept of "green analytical chemistry" for PAE determination is also discussed. Moreover, useful information about material preparation and quality control and quality assurance procedures is presented, to overcome the problems of sample contamination and those arising from matrix effects, and thus avoid overestimating PAE concentrations in the environment. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology
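
    The Bayesian step sketched above — turning effective event counts and observation times into plant-specific posterior rates — has a simple closed form if one assumes a gamma prior with Poisson-distributed events (a standard conjugate choice, not necessarily the paper's exact formalism). An illustrative sketch with invented numbers:

```python
def gamma_poisson_posterior(alpha0, beta0, n_events, t_obs):
    """Update a Gamma(alpha0, beta0) prior for a constant CCF rate with
    n_events observed over t_obs hours; returns posterior mean rate."""
    alpha = alpha0 + n_events
    beta = beta0 + t_obs
    return alpha, beta, alpha / beta

# Generic prior plus plant-specific evidence (all numbers invented)
alpha, beta, rate = gamma_poisson_posterior(alpha0=0.5, beta0=1.0e5,
                                            n_events=2, t_obs=4.4e5)
print(f"posterior mean CCF rate: {rate:.2e} per hour")

# Time-average basic-event probability for a periodically tested
# component, using the standard lambda*T/2 approximation
T = 730.0                                  # test interval in hours (assumed)
print(f"basic-event probability: {rate * T / 2.0:.2e}")
```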

  18. Quantification of viable spray-dried potential probiotic lactobacilli using real-time PCR

    Directory of Open Access Journals (Sweden)

    Radulović Zorica

    2012-01-01

    Full Text Available The basic requirement for probiotic bacteria to exert their expected positive effects is that they remain alive. Appropriate quantification methods are therefore crucial. Bacterial quantification based on nucleic acid detection is increasingly used. Spray-drying (SD) is one possibility for improving the survival of probiotic bacteria against negative environmental effects. The aim of this study was to investigate the survival of spray-dried Lactobacillus plantarum 564 and Lactobacillus paracasei Z-8, and the impact of SD on some probiotic properties of both tested strains. Besides the plate count technique, the aim was to examine the possibility of using propidium monoazide (PMA) in combination with real-time polymerase chain reaction (PCR) for quantifying the spray-dried tested strains. The number of intact cells of Lb. plantarum 564 and Lb. paracasei Z-8 determined by real-time PCR with PMA was similar to the number obtained by the plate count method. Spray-dried Lb. plantarum 564 and Lb. paracasei Z-8 demonstrated very good probiotic ability. It may be concluded that PMA real-time PCR determination of the viability of probiotic bacteria could complement the plate count method, and that SD may be a cost-effective way to produce large quantities of some probiotic cultures. [Project of the Ministry of Science of the Republic of Serbia, No. 046010
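
    The real-time PCR quantification used here relies on a log-linear standard curve relating quantification cycle (Cq) to starting copy number. A minimal sketch of that calibration step, with invented Cq values rather than the study's data:

```python
import numpy as np

# Standard dilution series: known copies vs measured quantification cycle
copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3])
cq = np.array([14.8, 18.2, 21.6, 25.1, 28.4])       # invented values

# Fit Cq = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0           # 1.0 means 100 %

def copies_from_cq(sample_cq):
    return 10.0 ** ((sample_cq - intercept) / slope)

print(f"slope {slope:.2f}, PCR efficiency {efficiency:.0%}")
print(f"unknown at Cq 23.0 -> {copies_from_cq(23.0):.2e} copies")
```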

  19. Identification of flow paths and quantification of return flow volumes and timing at field scale

    Science.gov (United States)

    Claes, N.; Paige, G. B.; Parsekian, A.

    2017-12-01

    Flood irrigation, which constitutes a large part of agricultural water use, accounts for a significant amount of the water diverted from western streams. Return flow, the portion of the water applied to irrigated areas that returns to the stream, is important for maintaining base flows in streams and the ecological function of riparian zones and wetlands hydrologically linked to streams. Predicting the timing and volume of return flow during and after flood irrigation is challenging due to the heterogeneity of the pedogenic and soil-physical factors that influence vadose zone processes. In this study, we quantify volumes of return flow and potential pathways in the subsurface with a vadose zone flow model that is informed by both hydrological and geophysical observations in a Bayesian setting. Through a Bayesian Markov chain Monte Carlo approach, we couple a two-dimensional vadose zone flow model with time-lapse ERT and borehole NMR datasets collected during and after flood irrigation experiments, and with soil physical lab analyses. The combination of synthetic models and field observations leads to flow path identification and allows quantification of the volumes and timing, with associated uncertainties, of the subsurface return flow stemming from flood irrigation. Quantifying the impact of soil heterogeneity enables us to transfer these results to other sites and to predict return flow under different soil physical settings. This is key for managing irrigation water resources, where the outcomes of different scenarios have to be predicted and evaluated.
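
    The Bayesian Markov chain Monte Carlo coupling can be sketched generically: a random-walk Metropolis sampler proposes a soil-hydraulic parameter, runs the forward model, and accepts or rejects based on the misfit to the observations. The one-parameter forward model below is a stand-in for the two-dimensional vadose zone model, and all values are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_model(k, times):
    # Stand-in for the 2-D vadose-zone model: drainage ~ exponential recession
    return np.exp(-k * times)

times = np.linspace(0.1, 5.0, 20)
observed = forward_model(0.8, times) + rng.normal(0.0, 0.02, times.size)

def log_posterior(k, sigma=0.02):
    if k <= 0.0:                         # flat prior on k > 0
        return -np.inf
    resid = observed - forward_model(k, times)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampler
k, logp, chain = 0.5, log_posterior(0.5), []
for _ in range(5000):
    k_prop = k + rng.normal(0.0, 0.05)
    logp_prop = log_posterior(k_prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        k, logp = k_prop, logp_prop
    chain.append(k)

samples = np.array(chain[1000:])        # discard burn-in
print(f"posterior k: {samples.mean():.3f} +/- {samples.std():.3f}")
```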

  20. A reduction approach to improve the quantification of linked fault trees through binary decision diagrams

    International Nuclear Information System (INIS)

    Ibanez-Llano, Cristina; Rauzy, Antoine; Melendez, Enrique; Nieto, Francisco

    2010-01-01

    Over the last two decades, binary decision diagrams (BDDs) have been applied successfully to improve Boolean reliability models. In contrast to the classical approach based on the computation of minimal cut sets (MCS), the BDD approach involves no approximation in the quantification of the model and can correctly handle negative logic. However, when models are sufficiently large and complex, as for example those coming from the PSA studies of the nuclear industry, it becomes infeasible to compute the BDD within a reasonable amount of time and computer memory. Therefore, simplification or reduction of the full model has to be considered in some way to adapt the application of BDD technology to the assessment of such models in practice. This paper proposes a reduction process that uses information provided by the set of the most relevant minimal cut sets of the model in order to perform the reduction directly on it. This allows controlling the degree of reduction and therefore the impact of the simplification on the final quantification results. The reduction is integrated in an incremental procedure that is compatible with the dynamic generation of the event trees and therefore adaptable to the recent dynamic developments and extensions of PSA studies. The proposed method has been applied to a real case study, and the results obtained confirm that the reduction enables the BDD computation while maintaining accuracy.
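
    The relevance-guided reduction can be illustrated with a toy ranking step: minimal cut sets are ordered by probability and the model is pruned to the subset carrying almost all of the risk, after which the exact BDD is built on the reduced model. A sketch with invented basic-event probabilities (the ranking and truncation criterion are illustrative, not the paper's exact procedure):

```python
# Basic-event probabilities of a toy fault tree (invented values)
p = {"A": 1e-2, "B": 5e-3, "C": 2e-3, "D": 1e-3, "E": 5e-4}

# Minimal cut sets of the toy model
mcs = [{"A", "B"}, {"C"}, {"A", "D"}, {"B", "E"}, {"D", "E"}]

def mcs_prob(cut):
    prob = 1.0
    for event in cut:
        prob *= p[event]
    return prob

# Rank cut sets and keep the smallest subset covering ~99 % of the risk
ranked = sorted(mcs, key=mcs_prob, reverse=True)
total = sum(mcs_prob(c) for c in ranked)       # rare-event approximation
kept, accumulated = [], 0.0
for cut in ranked:
    kept.append(cut)
    accumulated += mcs_prob(cut)
    if accumulated >= 0.99 * total:
        break

print(f"kept {len(kept)} of {len(mcs)} cut sets")
print(f"rare-event estimate on reduced model: {accumulated:.3e}")
```

    The exact BDD of the pruned model then yields the final quantification, so the only approximation introduced is the truncation itself.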

  1. A reduction approach to improve the quantification of linked fault trees through binary decision diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez-Llano, Cristina, E-mail: cristina.ibanez@iit.upcomillas.e [Instituto de Investigacion Tecnologica (IIT), Escuela Tecnica Superior de Ingenieria ICAI, Universidad Pontificia Comillas, C/Santa Cruz de Marcenado 26, 28015 Madrid (Spain); Rauzy, Antoine, E-mail: Antoine.RAUZY@3ds.co [Dassault Systemes, 10 rue Marcel Dassault CS 40501, 78946 Velizy Villacoublay, Cedex (France); Melendez, Enrique, E-mail: ema@csn.e [Consejo de Seguridad Nuclear (CSN), C/Justo Dorado 11, 28040 Madrid (Spain); Nieto, Francisco, E-mail: nieto@iit.upcomillas.e [Instituto de Investigacion Tecnologica (IIT), Escuela Tecnica Superior de Ingenieria ICAI, Universidad Pontificia Comillas, C/Santa Cruz de Marcenado 26, 28015 Madrid (Spain)

    2010-12-15

    Over the last two decades, binary decision diagrams (BDDs) have been applied successfully to improve Boolean reliability models. In contrast to the classical approach based on the computation of minimal cut sets (MCS), the BDD approach involves no approximation in the quantification of the model and can correctly handle negative logic. However, when models are sufficiently large and complex, as for example those coming from the PSA studies of the nuclear industry, it becomes infeasible to compute the BDD within a reasonable amount of time and computer memory. Therefore, simplification or reduction of the full model has to be considered in some way to adapt the application of BDD technology to the assessment of such models in practice. This paper proposes a reduction process that uses information provided by the set of the most relevant minimal cut sets of the model in order to perform the reduction directly on it. This allows controlling the degree of reduction and therefore the impact of the simplification on the final quantification results. The reduction is integrated in an incremental procedure that is compatible with the dynamic generation of the event trees and therefore adaptable to the recent dynamic developments and extensions of PSA studies. The proposed method has been applied to a real case study, and the results obtained confirm that the reduction enables the BDD computation while maintaining accuracy.

  2. Optimization of SPECT calibration for quantification of images applied to dosimetry with iodine-131

    International Nuclear Information System (INIS)

    Carvalho, Samira Marques de

    2018-01-01

    Calibration of SPECT systems plays an essential role in the accuracy of image quantification. In the first stage of this work, an optimized SPECT calibration method was proposed for 131 I studies, considering the partial volume effect (PVE) and the position of the calibration source. In the second stage, the study investigated the impact of count density and reconstruction parameters on the determination of the calibration factor and on image quantification in dosimetry studies, considering the reality of clinical practice in Brazil. In the final stage, the study evaluated the influence of several calibration factors on absorbed dose calculation using Monte Carlo (MC) simulations with the GATE code. Calibration was performed by determining a calibration curve (sensitivity versus volume) obtained by applying different thresholds; the calibration factors were then determined with an exponential function adjustment. Images were acquired with high and low count densities for several source positions within the phantom. To validate the calibration method, the calibration factors were used for absolute quantification of the total reference activities. The images were reconstructed following two approaches with different parameters, as usually used for patient images. The methodology developed for the calibration of the tomographic system was easier and faster to implement than other procedures suggested to improve the accuracy of the results. The study also revealed the influence of the location of the calibration source, demonstrating better precision in the absolute quantification when the location of the target region is considered during system calibration. Applied to the Brazilian thyroid protocol, the study suggests revising the calibration of the SPECT system to include different positions for the reference source, as well as acquisitions that consider the signal-to-noise ratio (SNR) of the images. Finally, the doses obtained with the
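
    The calibration-curve step can be sketched as a curve fit of measured sensitivity against source volume; the saturating-exponential form below is one plausible choice for a recovery-type curve affected by the PVE, and all numbers are invented rather than taken from the thesis:

```python
import numpy as np
from scipy.optimize import curve_fit

# System sensitivity (cps/MBq) measured for 131I sources of different
# volumes (mL); all values are invented for illustration.
volume = np.array([5.0, 10.0, 30.0, 60.0, 120.0, 250.0])
sens = np.array([3.0, 4.5, 5.8, 5.9, 5.9, 5.9])

def recovery(v, s_inf, b):
    # Sensitivity approaches s_inf as the source grows and the PVE vanishes
    return s_inf * (1.0 - np.exp(-v / b))

(s_inf, b), _ = curve_fit(recovery, volume, sens, p0=(6.0, 10.0))
print(f"asymptotic calibration factor: {s_inf:.2f} cps/MBq")

# Absolute quantification of a region with known volume (hypothetical)
count_rate, voi_volume = 145.0, 40.0      # cps, mL
activity = count_rate / recovery(voi_volume, s_inf, b)
print(f"estimated activity: {activity:.1f} MBq")
```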

  3. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    Science.gov (United States)

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main
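
    The record does not state which kinetic model the tool uses, but Patlak graphical analysis is a common route from a dynamic FDG study plus input function to an influx (Ki) map. A self-checking sketch on synthetic curves:

```python
import numpy as np

def cumtrapz0(y, t):
    """Cumulative trapezoidal integral starting at zero."""
    return np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (y[1:] + y[:-1]))))

def patlak_ki(t, c_tissue, c_plasma, t_star=20.0):
    """Influx constant Ki from Patlak graphical analysis:
    C_t(t)/C_p(t) = Ki * (integral of C_p)/C_p(t) + V0 for t >= t*."""
    x = cumtrapz0(c_plasma, t) / c_plasma
    y = c_tissue / c_plasma
    late = t >= t_star
    ki, v0 = np.polyfit(x[late], y[late], 1)
    return ki, v0

# Synthetic decay-corrected curves (minutes, arbitrary activity units)
t = np.linspace(0.5, 60.0, 40)
c_p = 100.0 * np.exp(-0.1 * t) + 5.0               # plasma input function
ki_true, v0_true = 0.03, 0.4
c_t = ki_true * cumtrapz0(c_p, t) + v0_true * c_p  # irreversible uptake

print(patlak_ki(t, c_t, c_p))                      # recovers ~(0.03, 0.4)
```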

  4. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    Science.gov (United States)

    Akram, Muhammad Farooq Bin

    The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject-matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation are critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that, in the case of a lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, where experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and Dempster-Shafer theory of evidence are better suited for the quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced by using deterministic or traditional probabilistic approaches for
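
    Interval analysis, the first of the two techniques proposed, replaces each expert-elicited point estimate with a lower/upper bound pair and propagates the bounds through the valuation. A minimal sketch with invented technology impacts (the additive combination is itself an assumption; the thesis points out that cumulative impacts are often non-linear):

```python
# Expert-elicited impact of each technology on a system metric, kept as
# an interval (lower, upper) instead of a point estimate; values invented.
impacts = {
    "tech_A": (0.02, 0.05),    # e.g. fractional fuel-burn reduction
    "tech_B": (0.01, 0.04),
    "tech_C": (-0.01, 0.02),   # experts disagree even on the sign
}

def interval_sum(intervals):
    """Propagate additive impacts: the bounds add monotonically."""
    low = sum(lo for lo, _ in intervals)
    high = sum(hi for _, hi in intervals)
    return low, high

low, high = interval_sum(impacts.values())
print(f"portfolio impact lies in [{low:.2f}, {high:.2f}]")
```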

  5. Semi-automated quantification of living cells with internalized nanostructures

    KAUST Repository

    Margineanu, Michael B.; Julfakyan, Khachatur; Sommer, Christoph; Perez, Jose E.; Contreras, Maria F.; Khashab, Niveen M.; Kosel, Jü rgen; Ravasi, Timothy

    2016-01-01

    novel method for the quantification of cells that internalize a specific type of nanostructures. This approach is suitable for high-throughput and real-time data analysis and has the potential to be used to study the interaction of different types

  6. Quantification of the sequestration of indium 111 labelled platelets

    International Nuclear Information System (INIS)

    Najean, Y.; Picard, N.; Dufour, V.; Rain, J.D.

    1988-01-01

    A simple method is proposed for accurate quantification of the splenic and/or hepatic sequestration of 111 In-labelled platelets. It could allow a better prediction of the efficiency of splenectomy in idiopathic thrombocytopenic purpura [fr

  7. Data-driven Demand Response Characterization and Quantification

    DEFF Research Database (Denmark)

    Le Ray, Guillaume; Pinson, Pierre; Larsen, Emil Mahler

    2017-01-01

    Analysis of load behavior in demand response (DR) schemes is important to evaluate the performance of participants. Very few real-world experiments have been carried out and quantification and characterization of the response is a difficult task. Nevertheless it will be a necessary tool for portf...

  8. MRI-based quantification of brain damage in cerebrovascular disorders

    NARCIS (Netherlands)

    de Bresser, J.H.J.M.

    2011-01-01

    Brain diseases can lead to diverse structural abnormalities that can be assessed on magnetic resonance imaging (MRI) scans. These abnormalities can be quantified by (semi-)automated techniques. The studies described in this thesis aimed to optimize and apply cerebral quantification techniques in

  9. Automated image analysis for quantification of filamentous bacteria

    DEFF Research Database (Denmark)

    Fredborg, Marlene; Rosenvinge, Flemming Schønning; Spillum, Erik

    2015-01-01

    in systems relying on colorimetry or turbidometry (such as Vitek-2, Phoenix, MicroScan WalkAway). The objective was to examine an automated image analysis algorithm for quantification of filamentous bacteria using the 3D digital microscopy imaging system, oCelloScope. Results Three E. coli strains displaying...

  10. Standardless quantification by parameter optimization in electron probe microanalysis

    International Nuclear Information System (INIS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-01-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: ► A method for standardless quantification in EPMA is presented. ► It gives better results than the commercial software GENESIS Spectrum. ► It gives better results than the software DTSA. ► It allows the determination of the conductive coating thickness. ► It gives an estimation of the concentration uncertainties.
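
    The core idea — minimizing the quadratic difference between an experimental spectrum and a parametric analytical description — can be sketched with a generic least-squares fit of Gaussian characteristic lines on a background. The spectrum below is synthetic and the model far simpler than POEMA's physical description:

```python
import numpy as np
from scipy.optimize import least_squares

energy = np.linspace(1.0, 10.0, 400)               # keV

def model(params, e):
    b0, b1, a1, mu1, a2, mu2, sigma = params
    background = b0 + b1 * e                       # crude bremsstrahlung stand-in
    peak1 = a1 * np.exp(-0.5 * ((e - mu1) / sigma) ** 2)
    peak2 = a2 * np.exp(-0.5 * ((e - mu2) / sigma) ** 2)
    return background + peak1 + peak2

# Synthetic "experimental" spectrum with Poisson counting noise
true_params = [50.0, -2.0, 400.0, 3.69, 150.0, 6.40, 0.08]
rng = np.random.default_rng(1)
spectrum = rng.poisson(np.clip(model(true_params, energy), 0.0, None))

# Minimize the quadratic differences by optimizing the model parameters
fit = least_squares(lambda p: model(p, energy) - spectrum,
                    x0=[40.0, -1.0, 300.0, 3.6, 100.0, 6.3, 0.1])
print("fitted heights/positions:", np.round(fit.x[2:6], 3))
```

    In the actual method, the fitted characteristic intensities are converted into concentrations through a full physical model of spectrum generation.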

  11. Damage Localization and Quantification of Earthquake Excited RC-Frames

    DEFF Research Database (Denmark)

    Skjærbæk, P.S.; Nielsen, Søren R.K.; Kirkegaard, Poul Henning

    In the paper a recently proposed method for damage localization and quantification of RC-structures from response measurements is tested on experimental data. The method investigated requires at least one response measurement along the structure and the ground surface acceleration. Further, the t...

  12. Automatic quantification of subarachnoid hemorrhage on noncontrast CT

    NARCIS (Netherlands)

    Boers, Anna Maria Merel; Zijlstra, I.A.; Gathier, C.S.; van den Berg, R.; Slump, Cornelis H.; Marquering, H.A.; Majoie, C.B.

    2014-01-01

    Quantification of blood after SAH on initial NCCT is an important radiologic measure to predict patient outcome and guide treatment decisions. In current scales, hemorrhage volume and density are not accounted for. The purpose of this study was to develop and validate a fully automatic method for

  13. Enhancement of Electroluminescence (EL) image measurements for failure quantification methods

    DEFF Research Database (Denmark)

    Parikh, Harsh; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    Enhanced-quality images are necessary for EL image analysis and failure quantification. A method is proposed which assesses image quality in terms of more accurate failure detection of solar panels through the electroluminescence (EL) imaging technique. The goal of the paper is to determine the most...

  14. Investigation on feasibility of recurrence quantification analysis for ...

    African Journals Online (AJOL)

    The RQA parameters such as percent recurrence (REC), trapping time (TT), percent laminarity (LAM) and entropy (ENT), and also the recurrence plots color patterns for different flank wear, can be used in detecting insert wear in face milling. Keywords: milling, flank wear, recurrence plot, recurrence quantification analysis.
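
    Percent recurrence (REC), the first RQA parameter listed, is simply the density of points in the recurrence plot: the fraction of embedded state pairs closer than a threshold. A minimal sketch on a toy signal (the embedding dimension, delay and radius are arbitrary choices, not the study's settings):

```python
import numpy as np

def recurrence_rate(x, dim=3, delay=2, radius=0.2):
    """Embed a 1-D signal and return %REC of its recurrence plot."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    recurrent = dist < radius * dist.max()      # thresholded recurrence plot
    return 100.0 * recurrent.sum() / recurrent.size

t = np.linspace(0.0, 20.0 * np.pi, 1000)
regular = np.sin(t)                             # stand-in for a sharp insert
noisy = np.sin(t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

print(f"%REC regular signal: {recurrence_rate(regular):.1f}")
print(f"%REC noisy signal:   {recurrence_rate(noisy):.1f}")
```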

  15. Quantification and presence of human ancient DNA in burial place ...

    African Journals Online (AJOL)

    Quantification and presence of human ancient DNA in burial place remains of Turkey using real time polymerase chain reaction. ... A published real-time PCR assay, which allows for the combined analysis of nuclear or ancient DNA and mitochondrial DNA, was modified. This approach can be used for recovering DNA from ...

  16. Real-Time PCR for Universal Phytoplasma Detection and Quantification

    DEFF Research Database (Denmark)

    Christensen, Nynne Meyn; Nyskjold, Henriette; Nicolaisen, Mogens

    2013-01-01

    Currently, the most efficient detection and precise quantification of phytoplasmas is by real-time PCR. Compared to nested PCR, this method is less sensitive to contamination and is less work intensive. Therefore, a universal real-time PCR method will be valuable in screening programs and in other...

  17. Double-layer Tablets of Lornoxicam: Validation of Quantification ...

    African Journals Online (AJOL)

    Double-layer Tablets of Lornoxicam: Validation of Quantification Method, In vitro Dissolution and Kinetic Modelling. ... Satisfactory results were obtained: all the tablet formulations met compendial requirements. The slowest drug release rate was obtained with tablet cores based on PVP K90 (1.21 mg%.h-1).

  18. Methane emission quantification from landfills using a double tracer approach

    DEFF Research Database (Denmark)

    Scheutz, Charlotte; Samuelsson, J.; Fredenslund, Anders Michael

    2007-01-01

    A tracer method was successfully used for quantification of the whole methane (CH4) emission from Fakse landfill. By using two different tracers, the emission from different sections of the landfill could be quantified. Furthermore, it was possible to determine the emissions from local on site...
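
    The tracer method recovers the source emission from the ratio of downwind plume integrals: with the tracer released at a known rate, the methane emission follows directly, corrected for molar mass. A minimal sketch of the ratio calculation with invented plume measurements (N2O is assumed as the tracer purely for illustration):

```python
# Cross-plume integrated concentrations measured downwind, after
# background subtraction (ppb * m); all values invented.
ch4_integral = 5200.0
tracer_integral = 310.0

tracer_release_kg_h = 2.0                 # controlled release at the source
M_CH4, M_N2O = 16.04, 44.01               # g/mol, N2O assumed as tracer

# Emission scales with the plume ratio, corrected for molar mass
ch4_emission = (tracer_release_kg_h * (ch4_integral / tracer_integral)
                * (M_CH4 / M_N2O))
print(f"estimated CH4 emission: {ch4_emission:.1f} kg/h")
```

    With two tracers released over different sections, the same ratio is evaluated once per tracer to apportion the total emission.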

  19. Protocol for Quantification of Defects in Natural Fibres for Composites

    DEFF Research Database (Denmark)

    Mortensen, Ulrich Andreas; Madsen, Bo

    2014-01-01

    Natural bast-type plant fibres are attracting increasing interest for being used for structural composite applications where high quality fibres with good mechanical properties are required. A protocol for the quantification of defects in natural fibres is presented. The protocol is based...

  20. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Science.gov (United States)

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  1. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian [University of Bern, From the Institute of Forensic Medicine, Bern (Switzerland); Persson, Anders; Warntjes, Marcel J. [University of Linkoeping, The Center for Medical Image Science and Visualization (CMIV), Linkoeping (Sweden)

    2015-08-15

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short-axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short-axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature, and equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by temperature; T1 in particular exhibited strong temperature dependence. Correcting the quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification; equations to correct for the temperature dependence are provided. (orig.)
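
    The correction equations themselves are not reproduced in this record, but their use can be sketched generically: each quantitative value measured at the body-core temperature is mapped to its expected value at 37 °C with a tissue- and parameter-specific slope. The slope below is invented:

```python
def correct_to_37(value, temp_c, slope):
    """Map a quantitative MR value measured at core temperature temp_c
    to its expected value at the 37 deg C reference; slope is the
    tissue- and parameter-specific temperature coefficient."""
    return value + slope * (37.0 - temp_c)

# Hypothetical: myocardial T1 of 920 ms at a core temperature of 18 deg C,
# with an assumed slope of 8 ms per deg C (invented, for illustration)
print(f"T1 at 37 deg C: {correct_to_37(920.0, 18.0, 8.0):.0f} ms")
```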

  2. Quantification of microbial quality and safety in minimally processed foods

    NARCIS (Netherlands)

    Zwietering, M.H.

    2002-01-01

    To find a good equilibrium between the quality and the margin of safety of minimally processed foods, various hurdles are often used. Quantification of the kinetics should be used to approach optimal processing and to select the main aspects. Due to many factors of which the exact quantitative effect is

  3. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.

  4. Machine Learning for Quantification of Small Vessel Disease Imaging Biomarkers

    NARCIS (Netherlands)

    Ghafoorian, M.

    2018-01-01

    This thesis is devoted to developing fully automated methods for quantification of small vessel disease imaging bio-markers, namely WMHs and lacunes, using various machine learning/deep learning and computer vision techniques. The rest of the thesis is organized as follows: Chapter 2 describes

  5. Direct quantification of nickel in stainless steels by spectrophotometry

    International Nuclear Information System (INIS)

    Singh, Ritu; Raut, Vaibhavi V.; Jeyakumar, S.; Ramakumar, K.L.

    2007-01-01

    A spectrophotometric method based on the Ni-DMG complex for the quantification of nickel in steel samples, without any prior separation, is reported in the present study. The interfering ions are masked by suitable complexing agents, and the method was extended to real samples after validation against BCS and Euro steel standards. (author)
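
    Spectrophotometric quantification of this kind reduces to Beer's law, A = ε·l·c, applied in practice through a calibration line of standards. A minimal sketch (the absorbances and dilution factor are invented for illustration):

```python
import numpy as np

# Calibration standards: Ni concentration (mg/L) vs absorbance of the
# Ni-DMG complex; all values invented for illustration.
conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
absorbance = np.array([0.002, 0.061, 0.118, 0.242, 0.479])

slope, intercept = np.polyfit(conc, absorbance, 1)   # Beer's law is linear

def ni_mg_per_l(a_sample, dilution_factor=1.0):
    return (a_sample - intercept) / slope * dilution_factor

# Steel digest measured at A = 0.205 after a 50-fold dilution
print(f"Ni in digest: {ni_mg_per_l(0.205, dilution_factor=50):.0f} mg/L")
```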

  6. Development of a competitive PCR assay for the quantification of ...

    African Journals Online (AJOL)

    ONOS

    2010-01-25

    Jan 25, 2010 ... quantification of total Escherichia coli DNA in water. Omar Kousar Banu, Barnard .... Thereafter the product was ligated into the pGEM®T-easy cloning ... agarose gel using the high pure PCR product purification kit. (Roche® ...

  7. Recognition and quantification of pain in horses: A tutorial review

    DEFF Research Database (Denmark)

    Gleerup, Karina Charlotte Bech; Lindegaard, Casper

    2016-01-01

    Pain management is dependent on the quality of the pain evaluation. Ideally, pain evaluation is objective, pain-specific and easily incorporated into a busy equine clinic. This paper reviews the existing knowledge base regarding the identification and quantification of pain in horses. Behavioural...

  8. Quantification in dynamic and small-animal positron emission tomography

    NARCIS (Netherlands)

    Disselhorst, Johannes Antonius

    2011-01-01

    This thesis covers two aspects of positron emission tomography (PET) quantification. The first section addresses the characterization and optimization of a small-animal PET/CT scanner. The sensitivity and resolution as well as various parameters affecting image quality (reconstruction settings, type

  9. 15 CFR 990.52 - Injury assessment-quantification.

    Science.gov (United States)

    2010-01-01

    ... (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION ACT..., trustees must quantify the degree, and spatial and temporal extent of such injuries relative to baseline. (b) Quantification approaches. Trustees may quantify injuries in terms of: (1) The degree, and...

  10. A posteriori uncertainty quantification of PIV-based pressure data

    NARCIS (Netherlands)

    Azijli, I.; Sciacchitano, A.; Ragni, D.; Palha Da Silva Clérigo, A.; Dwight, R.P.

    2016-01-01

    A methodology for a posteriori uncertainty quantification of pressure data retrieved from particle image velocimetry (PIV) is proposed. It relies upon the Bayesian framework, where the posterior distribution (probability distribution of the true velocity, given the PIV measurements) is obtained from

  11. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    International Nuclear Information System (INIS)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian; Persson, Anders; Warntjes, Marcel J.

    2015-01-01

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short-axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short-axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature, and equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by temperature; T1 in particular exhibited strong temperature dependence. Correcting the quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification; equations to correct for the temperature dependence are provided. (orig.)

  12. A Study of Tongue and Pulse Diagnosis in Traditional Korean Medicine for Stroke Patients Based on Quantification Theory Type II

    Directory of Open Access Journals (Sweden)

    Mi Mi Ko

    2013-01-01

    Full Text Available In traditional Korean medicine (TKM), pattern identification (PI) diagnosis is important for treating diseases. The aim of this study was to comprehensively investigate the relationship between PI type and tongue and pulse diagnosis variables. The study included 1,879 stroke patients who were admitted to 12 oriental medical university hospitals from June 2006 through March 2009. The status of the pulse and tongue was examined in each patient. Additionally, to identify the indicators most relevant to specialist PI, a quantification theory type II analysis was performed on the PI types. In the first-axis quantification of the external criteria, the Qi-deficiency and Yin-deficiency patterns were located in the negative direction, while the dampness-phlegm (DP) and fire-heat patterns were located in the positive direction. The explanatory variable with the greatest impact on the assessment was a fine pulse. In the second-axis quantification, the external criteria were divided into the DP and non-DP patterns, with the slippery pulse exhibiting the greatest effect on the division. This study attempted to build a model using a statistical method to objectively quantify PI and the various indicators that constitute the unique diagnostic system of TKM. These results should assist the development of future diagnostic standards in stroke PI.

  13. Effect of gadolinium on hepatic fat quantification using multi-echo reconstruction technique with T2* correction and estimation

    Energy Technology Data Exchange (ETDEWEB)

    Ge, Mingmei; Wu, Bing; Liu, Zhiqin; Song, Hai; Meng, Xiangfeng; Wu, Xinhuai [The Military General Hospital of Beijing PLA, Department of Radiology, Beijing (China); Zhang, Jing [The 309th Hospital of Chinese People' s Liberation Army, Department of Radiology, Beijing (China)

    2016-06-15

    To determine whether hepatic fat quantification is affected by the administration of gadolinium, using a multi-echo reconstruction technique with T2* correction and estimation. Forty-eight patients underwent the investigational sequence for hepatic fat quantification at 3.0T MRI once before and twice after administration of gadopentetate dimeglumine (0.1 mmol/kg). A one-way repeated-measures analysis of variance with pairwise comparisons was conducted to evaluate the systematic bias of fat fraction (FF) and R2* measurements between the three acquisitions. Bland-Altman plots were used to assess the agreement between pre- and post-contrast FF measurements in the liver. A P value <0.05 indicated a statistically significant difference. FF measurements of liver, spleen and spine revealed no significant systematic bias between the three measurements (P > 0.05 for all). Good agreement (95 % confidence interval) of FF measurements was demonstrated between pre-contrast and post-contrast1 (-0.49 %, 0.52 %) and post-contrast2 (-0.83 %, 0.77 %). R2* increased in liver and spleen (P = 0.039, P = 0.01) after administration of gadolinium. Although R2* increased in liver and spleen post-contrast, the investigational sequence still yielded stable fat quantification. It could therefore be applied post-contrast to substantially increase the efficiency of the MR examination, and it also provides a backup for the occasional failure of FF measurements pre-contrast. (orig.)

  14. MDCT quantification is the dominant parameter in decision–making regarding chest tube drainage for stable patients with traumatic pneumothorax

    Science.gov (United States)

    Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc

    2013-01-01

    It is commonly believed that the size of a pneumothorax is an important determinant of the treatment decision, in particular regarding whether chest tube drainage (CTD) is required. However, volumetric quantification of pneumothoraces has not routinely been performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantification of pneumothorax volume in chest multi-detector CT (MDCT) images. Moreover, we investigated the impact of accurate pneumothorax volume on the performance of decision-making regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in the decision regarding CTD, by computer simulation of decision-making using a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including the pneumothorax volume calculated by our CAV scheme, were collected as parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, statistically significantly better than all tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of a CAV scheme for MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. PMID:22560899

  15. Effect of gadolinium on hepatic fat quantification using multi-echo reconstruction technique with T2* correction and estimation

    International Nuclear Information System (INIS)

    Ge, Mingmei; Wu, Bing; Liu, Zhiqin; Song, Hai; Meng, Xiangfeng; Wu, Xinhuai; Zhang, Jing

    2016-01-01

    To determine whether hepatic fat quantification is affected by the administration of gadolinium, using a multi-echo reconstruction technique with T2* correction and estimation. Forty-eight patients underwent the investigational sequence for hepatic fat quantification at 3.0T MRI once before and twice after administration of gadopentetate dimeglumine (0.1 mmol/kg). A one-way repeated-measures analysis of variance with pairwise comparisons was conducted to evaluate the systematic bias of fat fraction (FF) and R2* measurements between the three acquisitions. Bland-Altman plots were used to assess the agreement between pre- and post-contrast FF measurements in the liver. A P value <0.05 indicated a statistically significant difference. FF measurements of liver, spleen and spine revealed no significant systematic bias between the three measurements (P > 0.05 for all). Good agreement (95 % confidence interval) of FF measurements was demonstrated between pre-contrast and post-contrast1 (-0.49 %, 0.52 %) and post-contrast2 (-0.83 %, 0.77 %). R2* increased in liver and spleen (P = 0.039, P = 0.01) after administration of gadolinium. Although R2* increased in liver and spleen post-contrast, the investigational sequence still yielded stable fat quantification. It could therefore be applied post-contrast to substantially increase the efficiency of the MR examination, and it also provides a backup for the occasional failure of FF measurements pre-contrast. (orig.)

  16. Assessment of DNA degradation induced by thermal and UV radiation processing: implications for quantification of genetically modified organisms.

    Science.gov (United States)

    Ballari, Rajashekhar V; Martin, Asha

    2013-12-01

    DNA quality is an important parameter for the detection and quantification of genetically modified organisms (GMO's) using the polymerase chain reaction (PCR). Food processing leads to degradation of DNA, which may impair GMO detection and quantification. This study evaluated the effect of various processing treatments such as heating, baking, microwaving, autoclaving and ultraviolet (UV) irradiation on the relative transgenic content of MON 810 maize using pRSETMON-02, a dual target plasmid as a model system. Amongst all the processing treatments examined, autoclaving and UV irradiation resulted in the least recovery of the transgenic (CaMV 35S promoter) and taxon-specific (zein) target DNA sequences. Although a profound impact on DNA degradation was seen during the processing, DNA could still be reliably quantified by Real-time PCR. The measured mean DNA copy number ratios of the processed samples were in agreement with the expected values. Our study confirms the premise that the final analytical value assigned to a particular sample is independent of the degree of DNA degradation since the transgenic and the taxon-specific target sequences possessing approximately similar lengths degrade in parallel. The results of our study demonstrate that food processing does not alter the relative quantification of the transgenic content provided the quantitative assays target shorter amplicons and the difference in the amplicon size between the transgenic and taxon-specific genes is minimal. Copyright © 2013 Elsevier Ltd. All rights reserved.
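
    The relative transgenic content discussed here is a ratio of two copy-number measurements — transgene target over taxon-specific target — so parallel degradation of similar-length amplicons cancels out. A minimal sketch with invented copy numbers:

```python
def gmo_percent(transgene_copies, taxon_copies):
    """GMO content as the copy-number ratio of the transgene target
    (here the CaMV 35S promoter) to the taxon-specific target (zein)."""
    return 100.0 * transgene_copies / taxon_copies

# Unprocessed vs autoclaved sample: both targets degrade in parallel, so
# the assigned value is preserved (copy numbers invented for illustration)
print(f"raw material: {gmo_percent(9.8e3, 1.96e5):.1f} %")
print(f"autoclaved:   {gmo_percent(2.1e3, 4.2e4):.1f} %")
```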

  17. Performance of the Real-Q EBV Quantification Kit for Epstein-Barr Virus DNA Quantification in Whole Blood.

    Science.gov (United States)

    Huh, Hee Jae; Park, Jong Eun; Kim, Ji Youn; Yun, Sun Ae; Lee, Myoung Keun; Lee, Nam Yong; Kim, Jong Won; Ki, Chang Seok

    2017-03-01

    There has been increasing interest in standardized and quantitative Epstein-Barr virus (EBV) DNA testing for the management of EBV disease. We evaluated the performance of the Real-Q EBV Quantification Kit (BioSewoom, Korea) in whole blood (WB). Nucleic acid extraction and real-time PCR were performed by using the MagNA Pure 96 (Roche Diagnostics, Germany) and 7500 Fast real-time PCR system (Applied Biosystems, USA), respectively. Assay sensitivity, linearity, and conversion factor were determined by using the World Health Organization international standard diluted in EBV-negative WB. We used 81 WB clinical specimens to compare performance of the Real-Q EBV Quantification Kit and artus EBV RG PCR Kit (Qiagen, Germany). The limit of detection (LOD) and limit of quantification (LOQ) for the Real-Q kit were 453 and 750 IU/mL, respectively. The conversion factor from EBV genomic copies to IU was 0.62. The linear range of the assay was from 750 to 10⁶ IU/mL. Viral load values measured with the Real-Q assay were on average 0.54 log₁₀ copies/mL higher than those measured with the artus assay. The Real-Q assay offered good analytical performance for EBV DNA quantification in WB.

  18. Comparative quantification of alcohol exposure as risk factor for global burden of disease.

    Science.gov (United States)

    Rehm, Jürgen; Klotsche, Jens; Patra, Jayadeep

    2007-01-01

    Alcohol has been identified as one of the most important risk factors contributing to the global burden of disease. The objective of the present contribution is to establish a framework to comparatively quantify alcohol exposure as it is relevant for the burden of disease. Different key indicators are combined to derive this quantification. First, adult per capita consumption, composed of recorded and unrecorded consumption, yields the best overall estimate of alcohol exposure for a country or region. Second, survey information is used to allocate the per capita consumption into sex and age groups. Third, an index for detrimental patterns of drinking is used to determine the additional impact on injury and cardiovascular burden. The methodology is applied to estimate global alcohol exposure for the year 2002. Finally, assumptions and potential problems of the approach are discussed. Copyright (c) 2007 John Wiley & Sons, Ltd.

  19. Quantification of biologically effective environmental UV irradiance

    Science.gov (United States)

    Horneck, G.

    To determine the impact of environmental UV radiation on human health and ecosystems demands monitoring systems that weight the spectral irradiance according to the biological responses under consideration. In general, there are three different approaches to quantify a biologically effective solar irradiance: (i) weighted spectroradiometry, where the biologically weighted radiometric quantities are derived from spectral data by multiplication with an action spectrum of a relevant photobiological reaction, e.g. erythema, DNA damage, skin cancer, or reduced productivity of terrestrial plants and the aquatic food web; (ii) wavelength-integrating chemical-based or physical dosimetric systems with spectral sensitivities similar to a biological response curve; and (iii) biological dosimeters that directly weight the incident UV components of sunlight in relation to the effectiveness of the different wavelengths and to interactions between them. Most biological dosimeters, such as bacteria, bacteriophages, or biomolecules, are based on the UV sensitivity of DNA. If precisely characterized, biological dosimeters are applicable as field and personal dosimeters.
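    Approach (i) is a weighted integral of the measured spectrum. A minimal sketch, assuming a placeholder spectral irradiance and an erythema-like action spectrum (both invented for illustration, not measured data):

```python
import numpy as np

wl = np.linspace(280, 400, 121)                 # wavelength grid (nm)
dl = wl[1] - wl[0]

# Placeholder spectral irradiance E(lambda), W m^-2 nm^-1
E = 1e-3 * np.clip(wl - 290, 0, None)

# Erythema-like action spectrum s(lambda), normalised to 1 below 298 nm
s = np.where(wl <= 298, 1.0, 10 ** (0.094 * (298 - wl)))

# Biologically effective irradiance: integral of E(lambda) * s(lambda) d(lambda)
E_eff = np.sum(E * s) * dl
print(f"effective irradiance: {E_eff:.4f} W m^-2")
```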

  20. Quantification of analytes affected by relevant interfering signals under quality controlled conditions

    International Nuclear Information System (INIS)

    Bettencourt da Silva, Ricardo J.N.; Santos, Julia R.; Camoes, M. Filomena G.F.C.

    2006-01-01

    The analysis of organic contaminants or residues in biological samples is frequently affected by the presence of compounds producing interfering instrumental signals. This feature is responsible for the higher complexity and cost of these analyses and/or for a significant reduction in the number of studied analytes in a multi-analyte method. This work presents a methodology to estimate the impact of interfering compounds on the quality of the analysis of complex samples, based on separative instrumental methods of analysis, aiming at supporting the inclusion of analytes affected by interfering compounds in the list of compounds analysed in the studied samples. The proposed methodology involves the study of the magnitude of the signal produced by the interfering compounds in the analysed matrix, and is applicable to analytical systems affected by interfering compounds of varying concentration in the studied matrix. It is based on the comparison of the signals from a representative number of examples of the studied matrix, in order to estimate the impact of the presence of such compounds on the measurement quality. The treatment of the chromatographic signals necessary to collect these data can easily be performed with the chromatogram-subtraction algorithms available in most analytical instrumentation software. Subtracting the interfering-compound signal from the sample signal compensates for the interfering effect irrespective of the relative magnitude of the interfering and analyte signals, supporting the applicability of the same model of method performance over a broader concentration range. The measurement uncertainty was quantified using the differential approach, which allows estimation of the contribution of the interfering compounds to the quality of the measurement. The proposed methodology was successfully applied to the analysis of
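    The signal-subtraction step can be illustrated with synthetic chromatograms. A minimal sketch (peak positions, widths and heights are invented) subtracts a representative interferent trace from the sample trace before integrating the analyte peak:

```python
import numpy as np

t = np.linspace(0, 10, 1001)                    # retention time (min)

def peak(center, height, width=0.1):
    """Gaussian peak on the retention-time grid."""
    return height * np.exp(-0.5 * ((t - center) / width) ** 2)

interferent = peak(5.0, 30.0)                   # trace seen in blank matrix
sample = peak(5.0, 30.0) + peak(5.3, 12.0)      # interferent + analyte

corrected = sample - interferent                # chromatogram subtraction
analyte_area = np.sum(corrected) * (t[1] - t[0])
print(f"analyte peak area: {analyte_area:.2f}")
```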

  1. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR)

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Baiyu [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 and Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 (United States); Barnhart, Huiman [Department of Biostatistics and Bioinformatics, Duke University, Durham, North Carolina 27705 (United States); Richard, Samuel [Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 and Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Robins, Marthony [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Colsher, James [Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Samei, Ehsan [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 (United States); Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Department of Physics, Department of Biomedical Engineering, and Department of Electronic and Computer Engineering, Duke University, Durham, North Carolina 27705 (United States)

    2013-11-15

    Purpose: Volume quantifications of lung nodules with multidetector computed tomography (CT) images provide useful information for monitoring nodule developments. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification with dose and slice thickness as additional variables.Methods: Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition), and analyzed for accuracy and precision.Results: Precision was found to be generally comparable between FBP and iterative reconstruction with no statistically significant difference noted for different dose levels, slice thickness, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A.Conclusions: The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of accuracy on reconstruction algorithms.

  2. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR).

    Science.gov (United States)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Robins, Marthony; Colsher, James; Samei, Ehsan

    2013-11-01

    Volume quantifications of lung nodules with multidetector computed tomography (CT) images provide useful information for monitoring nodule developments. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification with dose and slice thickness as additional variables. Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition), and analyzed for accuracy and precision. Precision was found to be generally comparable between FBP and iterative reconstruction with no statistically significant difference noted for different dose levels, slice thickness, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A. The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of accuracy on reconstruction algorithms.

  3. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR)

    International Nuclear Information System (INIS)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Robins, Marthony; Colsher, James; Samei, Ehsan

    2013-01-01

    Purpose: Volume quantifications of lung nodules with multidetector computed tomography (CT) images provide useful information for monitoring nodule developments. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification with dose and slice thickness as additional variables.Methods: Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition), and analyzed for accuracy and precision.Results: Precision was found to be generally comparable between FBP and iterative reconstruction with no statistically significant difference noted for different dose levels, slice thickness, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A.Conclusions: The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of accuracy on reconstruction algorithms.

  4. Selective classification and quantification model of C&D waste from material resources consumed in residential building construction.

    Science.gov (United States)

    Mercader-Moyano, Pilar; Ramírez-de-Arellano-Agudo, Antonio

    2013-05-01

    The unfortunate economic situation of Spain and the European Union is, among other factors, the result of intensive construction activity over recent years. The excessive consumption of natural resources, together with the impact caused by the uncontrolled dumping of untreated C&D waste in illegal landfills, has polluted the environment and degraded the landscape. The objective of this research was to generate a selective classification and quantification model of C&D waste based on the material resources consumed in the construction of residential buildings, either new or renovated, namely the Conventional Constructive Model (CCM). A practical example carried out on ten residential buildings in Seville, Spain, enabled the identification and quantification of the C&D waste generated in their construction and its origin, in terms of the building material from which it originated and its impact per m² constructed. This model enables other researchers to compare the various improvements proposed for minimizing the environmental impact of building a CCM, supports the proposal of new corrective measures in future policies regulating the production and management of C&D waste from the design stage to the completion of the construction process, and underpins sustainable C&D waste management and material selection for planned or renovated buildings.

  5. Cross-impact method

    Directory of Open Access Journals (Sweden)

    Suzić Nenad

    2014-01-01

    The paper presents the application of the cross-impact method in pedagogy, a methodological approach that crosses variables in a novel but statistically justified manner. The method is an innovation both in pedagogy and in the research methodology of social and psychological phenomena. Specifically, events and processes are crossed, that is, experts' predictions about the future interaction of events and processes. The methodology is therefore futuristic: it concerns predicting the future, which is of key importance for pedagogic objectives. The paper presents two versions of the cross-impact approach: a longer one, laid out in fourteen steps, and a shorter one, in four. Both are accompanied by mathematical and statistical formulae allowing for quantification, that is, a numerical expression of the probability of a certain event happening in the future. The advantage of this approach is that it facilitates planning in education, which has so far been based solely on lay estimates and assumptions.
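    The abstract does not reproduce the formulae, but the core mechanic of cross-impact analysis can be sketched: resolve events in random order, and each occurrence adjusts the probabilities of the remaining events via an impact matrix. All probabilities and impact factors below are invented, and the multiplier rule is a common textbook variant, not necessarily the paper's:

```python
import random

p0 = [0.5, 0.3, 0.7]                  # initial probabilities of 3 events
impact = [[1.0, 1.8, 0.6],            # impact[i][j]: factor applied to
          [1.2, 1.0, 1.0],            # P(event j) once event i occurs
          [0.5, 2.0, 1.0]]

def run(trials=100_000):
    counts = [0, 0, 0]
    for _ in range(trials):
        p = p0[:]
        for i in random.sample(range(3), 3):   # random resolution order
            if random.random() < p[i]:
                counts[i] += 1
                for j in range(3):             # adjust the other events
                    p[j] = min(1.0, p[j] * impact[i][j])
    return [c / trials for c in counts]

print(run())    # Monte Carlo estimate of final event probabilities
```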

  6. Quantification of social contributions to earthquake mortality

    Science.gov (United States)

    Main, I. G.; NicBhloscaidh, M.; McCloskey, J.; Pelling, M.; Naylor, M.

    2013-12-01

    Death tolls in earthquakes, which continue to grow rapidly, are the result of complex interactions between physical effects, such as strong shaking, and the resilience of exposed populations and supporting critical infrastructures and institutions. While it is clear that the social context in which the earthquake occurs has a strong effect on the outcome, the influence of this context can only be exposed if we first decouple, as far as possible, the physical causes of mortality from our consideration. (Our modelling assumes that building resilience to shaking is a social factor governed by national wealth, legislation and enforcement, and governance that reduces corruption.) Here we attempt to remove these causes by statistically modelling published mortality, shaking intensity and population exposure data; unexplained variance from this physical model illuminates the contribution of socio-economic factors to increasing earthquake mortality. We find that this variance partitions countries in terms of basic socio-economic measures and allows the definition of a national vulnerability index identifying both anomalously resilient and anomalously vulnerable countries. In many cases resilience is well correlated with GDP; people in the richest countries are unsurprisingly safe from even the worst shaking. However, some low-GDP countries rival even the richest in resilience, showing that relatively low-cost interventions can have a positive impact on earthquake resilience and that social learning between these countries might facilitate resilience building in the absence of expensive engineering interventions.
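    The decoupling idea can be sketched with synthetic data: fit mortality on physical predictors only, then read the residual as the socio-economic signal. Everything below is a placeholder, not the authors' data or model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
intensity = rng.uniform(6, 9, n)              # shaking intensity (synthetic)
log_exposed = rng.uniform(3, 7, n)            # log10 exposed population
social = rng.normal(0, 0.4, n)                # hidden social contribution
log_deaths = -6 + 0.8 * intensity + 0.9 * log_exposed + social

# "Physical model": least-squares fit on the physical predictors only
X = np.column_stack([np.ones(n), intensity, log_exposed])
beta, *_ = np.linalg.lstsq(X, log_deaths, rcond=None)

residual = log_deaths - X @ beta              # candidate vulnerability index
print(beta, residual.std())
```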

  7. Quantification of UHMWPE wear in periprosthetic tissues of hip arthroplasty: Description of a new method based on IR and comparison with radiographic appearance

    Czech Academy of Sciences Publication Activity Database

    Šlouf, Miroslav; Pokorný, D.; Entlicher, G.; Dybal, Jiří; Synková, Hana; Lapčíková, Monika; Fejfarková, Z.; Špundová, M.; Veselý, F.; Sosna, A.

    2008-01-01

    Roč. 265, 5-6 (2008), s. 674-684 ISSN 0043-1648 R&D Projects: GA ČR GA106/04/1118; GA MŠk 2B06096 Institutional research plan: CEZ:AV0Z40500505 Keywords : UHMWPE * isolation of wear debris * quantification of wear particles Subject RIV: CD - Macromolecular Chemistry Impact factor: 1.509, year: 2008

  8. Quantification of rat brain SPECT with 123I-ioflupane: evaluation of different reconstruction methods and image degradation compensations using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Roé-Vellvé, N; Pino, F; Cot, A; Ros, D; Falcon, C; Gispert, J D; Pavía, J; Marin, C

    2014-01-01

    SPECT studies with ¹²³I-ioflupane facilitate the diagnosis of Parkinson's disease (PD). The effect of image degradations on quantification has been extensively evaluated in human studies, but their impact on studies of experimental PD models is still unclear. The aim of this work was to assess the effect of compensating for the degrading phenomena on the quantification of small-animal SPECT studies using ¹²³I-ioflupane. This assessment enabled us to evaluate the feasibility of quantitatively detecting small pathological changes using different reconstruction methods and levels of compensation for the image-degrading phenomena. Monte Carlo simulated studies of a rat phantom were reconstructed and quantified. Compensations for point spread function (PSF), scattering, attenuation and partial volume effect were progressively included in the quantification protocol. A linear relationship was found between calculated and simulated specific uptake ratio (SUR) in all cases. In order to significantly distinguish disease stages, noise reduction during the reconstruction process was the most relevant factor, followed by PSF compensation. The smallest detectable SUR interval was determined by biological variability rather than by image degradations or coregistration errors. The quantification methods that gave the best results allowed us to distinguish PD stages with SUR values as close as 0.5 using groups of six rats to represent each stage. (paper)
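    For reference, the specific uptake ratio is commonly computed as the target-to-reference contrast; this definition is assumed here rather than quoted from the paper, and the ROI counts are hypothetical:

```python
def specific_uptake_ratio(target_counts, reference_counts):
    """SUR = (C_target - C_reference) / C_reference."""
    return (target_counts - reference_counts) / reference_counts

# Hypothetical mean ROI counts (striatum vs. reference region)
print(specific_uptake_ratio(8.4, 2.1))   # -> 3.0
```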

  9. Compositional Solution Space Quantification for Probabilistic Software Analysis

    Science.gov (United States)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions under which the target event is reached and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement over previous approaches in both accuracy and analysis time.
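    A toy version of the statistical quantification shows the idea: sample only inside a box already narrowed by interval constraint propagation, then rescale the hit rate to the full input domain. The constraints and bounds are invented for illustration:

```python
import random

def estimate(constraint, box, samples=100_000):
    """Monte Carlo estimate of the fraction of the box satisfying the constraint."""
    hits = sum(
        constraint([random.uniform(lo, hi) for lo, hi in box])
        for _ in range(samples)
    )
    return hits / samples

# Constraints: x >= 0, y >= 0, x^2 + y^2 <= 1. Interval propagation narrows
# the full domain [-1,1] x [-1,1] to the box [0,1] x [0,1] before sampling.
frac_in_box = estimate(lambda v: v[0]**2 + v[1]**2 <= 1.0, [(0, 1), (0, 1)])
print(frac_in_box * (1 * 1) / (2 * 2))   # ~ pi/16 of the full domain
```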

  10. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.

  11. Improved perfusion quantification in FAIR imaging by offset correction

    DEFF Research Database (Denmark)

    Sidaros, Karam; Andersen, Irene Klærke; Gesmar, Henrik

    2001-01-01

    Perfusion quantification using pulsed arterial spin labeling has been shown to be sensitive to the RF pulse slice profiles. Therefore, in Flow-sensitive Alternating-Inversion Recovery (FAIR) imaging the slice selective (ss) inversion slab is usually three to four times thicker than the imaging slice. However, this reduces perfusion sensitivity due to the increased transit delay of the incoming blood with unperturbed spins. In the present article, the dependence of the magnetization on the RF pulse slice profiles is inspected both theoretically and experimentally. A perfusion quantification model is presented that allows the use of thinner ss inversion slabs by taking into account the offset of RF slice profiles between ss and nonselective inversion slabs. This model was tested in both phantom and human studies. Magn Reson Med 46:193-197, 2001.

  12. Good quantification practices of flavours and fragrances by mass spectrometry.

    Science.gov (United States)

    Begnaud, Frédéric; Chaintreau, Alain

    2016-10-28

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanding list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analyte methods. In this article, we present the experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  13. Techniques of biomolecular quantification through AMS detection of radiocarbon

    International Nuclear Information System (INIS)

    Vogel, S.J.; Turteltaub, K.W.; Frantz, C.; Felton, J.S.; Gledhill, B.L.

    1992-01-01

    Accelerator mass spectrometry offers a large gain in sensitivity over scintillation counting for detecting radiocarbon in biomolecular tracing. Application of this sensitivity requires new considerations of procedures to extract or isolate the carbon fraction to be quantified, to inventory all carbon in the sample, to prepare graphite from the sample for use in the spectrometer, and to derive a meaningful quantification from the measured isotope ratio. These procedures need to be accomplished without contaminating the sample with radiocarbon, which may be ubiquitous in laboratories and on equipment previously used for higher-dose scintillation experiments. Disposable equipment, materials and surfaces are used to control these contaminations. Quantification of attomole amounts of labeled substances is possible with these techniques.

  14. Quantification of the effects of dependence on human error probabilities

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1980-01-01

    In estimating the probabilities of human error in the performance of a series of tasks in a nuclear power plant, the situation-specific characteristics of the series must be considered. A critical factor not to be overlooked in this estimation is the dependence or independence that pertains to any of the several pairs of task performances. In discussing the quantification of the effects of dependence, the event tree symbology described earlier is used. In any series of tasks, the only dependence considered for quantification in this document is that between the task of interest and the immediately preceding task. Tasks performed earlier in the series may have some effect on the end task, but this effect is considered negligible.
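    For context, the dependence adjustment associated with this line of work (Swain's THERP handbook, NUREG/CR-1278) maps a basic human error probability to a conditional one through five discrete dependence levels. The equations below are the handbook's standard ones, quoted from that source rather than from this abstract:

```python
def conditional_hep(basic_hep, level):
    """Conditional HEP for task B given failure on the preceding task A."""
    formulas = {
        "zero":     lambda p: p,                    # no dependence
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,                  # failure is certain
    }
    return formulas[level](basic_hep)

for lvl in ("zero", "low", "moderate", "high", "complete"):
    print(f"{lvl:9s} {conditional_hep(0.01, lvl):.3f}")
```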

  15. Metering error quantification under voltage and current waveform distortion

    Science.gov (United States)

    Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran

    2017-09-01

    With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion causes metering errors in smart meters. Because of the negative effects on metering accuracy and fairness, the combined error of energy metering is an important subject of study. In this paper, after comparing theoretical metering values with recorded values under different meter modes for linear and nonlinear loads, a method for quantifying the metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a method for quantifying the metering accuracy error is also proposed. From the analysis of the mode error and accuracy error, a comprehensive error analysis method is presented that is suitable for new energy sources and nonlinear loads. The proposed method has been validated by simulation.
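    The mode-error notion can be illustrated numerically: under waveform distortion, a meter that registers only the fundamental component diverges from the true average power of the sampled waveforms. The harmonic content below is invented for illustration:

```python
import numpy as np

f0, fs, T = 50.0, 10_000.0, 0.2                  # fundamental, sample rate, window
t = np.arange(0, T, 1 / fs)                      # exactly 10 cycles of 50 Hz

# Distorted voltage and current: fundamental plus a 3rd harmonic
v = 325 * np.sin(2 * np.pi * f0 * t) + 30 * np.sin(2 * np.pi * 3 * f0 * t)
i = 10 * np.sin(2 * np.pi * f0 * t - 0.2) + 2 * np.sin(2 * np.pi * 3 * f0 * t - 0.8)

p_true = np.mean(v * i)                          # time-division multiplier view
p_fund = 0.5 * 325 * 10 * np.cos(0.2)            # fundamental-only metering
print(p_true, p_fund, (p_fund - p_true) / p_true)
```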

  16. Quantification of uranyl in presence of citric acid

    International Nuclear Information System (INIS)

    Garcia G, N.; Barrera D, C.E.; Ordonez R, E.

    2007-01-01

    To determine the influence of soil organic matter on uranyl sorption onto solids, a reliable and sufficiently fast technique for uranyl detection and quantification is needed. This work therefore proposes to quantify uranyl in the presence of citric acid by modifying the UV-Vis radiation-induced fluorescence technique. Since the uranyl ion is very sensitive to the medium that contains it (speciation, pH, ionic strength, etc.), it was necessary to develop an analysis technique that enhances the fluorescence of the uranyl ion while suppressing the signal produced by the organic acids. (Author)

  17. Development of computational algorithms for quantification of pulmonary structures

    International Nuclear Information System (INIS)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A.; Pina, Diana R.

    2012-01-01

    High-resolution computed tomography (HRCT) has become the imaging diagnostic exam most commonly used for the evaluation of the sequelae of paracoccidioidomycosis. Subjective evaluation of the radiological abnormalities found on HRCT images does not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. Thus, this research proposes the development of algorithms in the MATLAB® computing environment that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists in selecting a region of interest (ROI) and, through the use of masks, density filters and morphological operators, obtaining the ratio of the injured area to the area of healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, reaching agreement of 80% for emphysema and 58% for fibrosis. (author)
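    A minimal sketch of the ROI/threshold/morphology pipeline on a synthetic image (the HU thresholds are conventional density-mask values, not necessarily the authors'):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
roi_hu = rng.normal(-800, 60, size=(256, 256))   # synthetic lung ROI (HU)

lung_mask = roi_hu < -500                        # voxels within lung tissue
emphysema_mask = roi_hu < -950                   # density mask for emphysema
emphysema_mask = ndimage.binary_opening(emphysema_mask)   # morphological cleanup

ratio = emphysema_mask.sum() / lung_mask.sum()   # injured-to-lung area ratio
print(f"emphysema fraction: {100 * ratio:.1f} %")
```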

  18. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    International Nuclear Information System (INIS)

    Sciacchitano, Andrea; Scarano, Fulvio; Neal, Douglas R; Smith, Barton L; Warner, Scott O; Vlachos, Pavlos P; Wieneke, Bernhard

    2015-01-01

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)

  19. Clinical applications of MS-based protein quantification.

    Science.gov (United States)

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and are now commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use remains rare, despite excellent analytical specificity and good sensitivity. Here, we give an overview of current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach are discussed with a focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Visualization and Quantification of Rotor Tip Vortices in Helicopter Flows

    Science.gov (United States)

    Kao, David L.; Ahmad, Jasim U.; Holst, Terry L.

    2015-01-01

    This paper presents an automated approach for the effective extraction, visualization, and quantification of vortex core radii from Navier-Stokes simulations of a UH-60A rotor in forward flight. We adopt a scaled Q-criterion to determine vortex regions and then perform vortex core profiling in these regions to calculate vortex core radii. This method provides an efficient way of visualizing and quantifying the blade tip vortices. Moreover, the vortex radii are displayed graphically in a plane.
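    The Q-criterion step is standard: split the velocity-gradient tensor into strain-rate and rotation parts, then Q = ½(‖Ω‖² − ‖S‖²), with vortex regions where Q is positive (the paper applies a scaled threshold on top of this). A minimal sketch on a uniform grid, with an invented test field:

```python
import numpy as np

def q_criterion(u, v, w, dx):
    """Q = 0.5 * (||Omega||^2 - ||S||^2) from velocity components on a uniform grid."""
    # grads[i, j] = d(component i) / d(axis j), shape (3, 3, nx, ny, nz)
    grads = np.stack([np.stack(np.gradient(c, dx), axis=0) for c in (u, v, w)])
    S = 0.5 * (grads + grads.transpose(1, 0, 2, 3, 4))      # strain-rate tensor
    W = 0.5 * (grads - grads.transpose(1, 0, 2, 3, 4))      # rotation tensor
    return 0.5 * (np.sum(W**2, axis=(0, 1)) - np.sum(S**2, axis=(0, 1)))

# Toy usage on a synthetic swirling field
n = 32
x = np.linspace(0.0, 2.0 * np.pi, n)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
Q = q_criterion(np.sin(Y), -np.sin(X), np.zeros_like(X), x[1] - x[0])
print(f"vortex-region fraction: {(Q > 0).mean():.2f}")
```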

  1. Digital PCR for direct quantification of viruses without DNA extraction

    OpenAIRE

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2015-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material.

  2. Detection and quantification of Leveillula taurica growth in pepper leaves.

    Science.gov (United States)

    Zheng, Zheng; Nonomura, Teruo; Bóka, Károly; Matsuda, Yoshinori; Visser, Richard G F; Toyoda, Hideyoshi; Kiss, Levente; Bai, Yuling

    2013-06-01

    Leveillula taurica is an obligate fungal pathogen that causes powdery mildew disease on a broad range of plants, including important crops such as pepper, tomato, eggplant, onion, and cotton. The early stage of this disease is difficult to diagnose and the disease can easily spread unobserved, for example in pepper and tomato production fields and greenhouses. The objective of this study was to develop a method for the detection and quantification of L. taurica biomass in pepper leaves, with special regard to the early stages of infection. We monitored the development of the disease to time the infection process on the leaf surface as well as inside the pepper leaves. The initial and final steps of the infection taking place on the leaf surface were observed consecutively using a dissecting microscope and a scanning electron microscope. The development of the intercellular mycelium in the mesophyll was followed by light and transmission electron microscopy. A pair of L. taurica-specific primers was designed based on the internal transcribed spacer sequence of L. taurica and used in a real-time polymerase chain reaction (PCR) assay to quantify the fungal DNA during infection. The specificity of this assay was confirmed by testing the primer pair with DNA from the host plants and also from another powdery mildew species, Oidium neolycopersici, infecting tomato. A standard curve was obtained for absolute quantification of L. taurica biomass. In addition, we tested a relative quantification method using a plant gene as reference, and the results obtained were compared with visual disease index scoring. The real-time PCR assay for L. taurica provides a valuable tool for detection and quantification of this pathogen in breeding activities as well as in plant-microbe interaction studies.
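    Absolute quantification from a standard curve follows the usual linear fit of Ct against log quantity; the dilution series below is invented for illustration:

```python
import numpy as np

# Dilution series: log10 of template amount vs. measured Ct (hypothetical)
log_qty = np.array([0, 1, 2, 3, 4])              # log10 pg of fungal DNA
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.7])

slope, intercept = np.polyfit(log_qty, ct, 1)    # Ct = slope*log10(q) + intercept
efficiency = 10 ** (-1 / slope) - 1              # amplification efficiency

ct_unknown = 27.9                                # Ct of an infected-leaf sample
log_q = (ct_unknown - intercept) / slope
print(f"E = {efficiency:.2f}, estimated quantity = {10 ** log_q:.2f} pg")
```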

  3. Standardless quantification by parameter optimization in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Limandri, Silvina P. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Bonetto, Rita D. [Centro de Investigacion y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco (CINDECA), CONICET, 47 Street 257, (1900) La Plata (Argentina); Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1 and 47 Streets (1900) La Plata (Argentina); Josa, Victor Galvan; Carreras, Alejo C. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Trincavelli, Jorge C., E-mail: trincavelli@famaf.unc.edu.ar [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina)

    2012-11-15

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: ► A method for standardless quantification in EPMA is presented. ► It gives better results than the commercial software GENESIS Spectrum. ► It gives better results than the software DTSA. ► It allows the determination of the conductive coating thickness. ► It gives an estimation of the concentration uncertainties.

  4. Rapid quantification of biomarkers during kerogen microscale pyrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Stott, A.W.; Abbott, G.D. [Fossil Fuels and Environmental Geochemistry NRG, The University, Newcastle-upon-Tyne (United Kingdom)

    1995-02-01

    A rapid, reproducible method incorporating closed system microscale pyrolysis and thermal desorption-gas chromatography/mass spectrometry has been developed and applied to the quantification of sterane biomarkers released during pyrolysis of the Messel oil shale kerogen under confined conditions. This method allows a substantial experimental concentration-time data set to be collected at accurately controlled temperatures, due to the low thermal inertia of the microscale borosilicate glass reaction vessels, which facilitates kinetic studies of biomarker reactions during kerogen microscale pyrolysis

  5. GHG emission quantification for pavement construction projects using a process-based approach

    Directory of Open Access Journals (Sweden)

    Charinee Limsawasd

    2017-03-01

    Climate change and greenhouse gas (GHG) emissions have attracted much attention for their impacts on the global environment. New legislation and regulations controlling GHG emissions from industrial sectors have been introduced to address this problem. The transportation industries, which include operation of road pavement and pavement construction equipment, are the highest GHG-emitting sectors. This study presents a novel quantification model of GHG emissions from pavement construction using process-based analysis. The model is composed of five modules that evaluate GHG emissions: (1) material production and acquisition, (2) material transport to a project site, (3) heavy equipment use, (4) on-site machinery use, and (5) on-site electricity use. The model was applied to a hypothetical pavement project to compare the environmental impacts of flexible and rigid pavement types during construction. The resulting model can be used for evaluation of environmental impacts, as well as for designing and planning highway pavement construction.
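    In process-based form, each module reduces to a sum of quantity × emission factor terms, as sketched below. Module names mirror the list above; all quantities and factors are hypothetical placeholders, not values from the study:

```python
# (item, quantity, emission factor in tCO2e per unit) -- all invented
modules = {
    "material_production": [("asphalt_t", 1200.0, 0.052),
                            ("aggregate_t", 9000.0, 0.004)],
    "material_transport":  [("t_km", 350_000.0, 0.0001)],
    "heavy_equipment":     [("diesel_L", 15_000.0, 0.00268)],
    "onsite_machinery":    [("diesel_L", 4_000.0, 0.00268)],
    "onsite_electricity":  [("kWh", 20_000.0, 0.0005)],
}

per_module = {
    name: sum(qty * ef for _, qty, ef in items)
    for name, items in modules.items()
}
print(per_module, f"total = {sum(per_module.values()):.1f} tCO2e")
```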

  6. Outcome quantification using SPHARM-PDM toolbox in orthognathic surgery

    Science.gov (United States)

    Cevidanes, Lucia; Zhu, HongTu; Styner, Martin

    2011-01-01

    Purpose: Quantification of surgical outcomes in longitudinal studies has led to significant progress in the treatment of dentofacial deformity, both by offering options to patients who might not otherwise have been recommended for treatment and by clarifying the selection of appropriate treatment methods. Most existing surgical treatments have not been assessed in a systematic way. This paper presents the quantification of surgical outcomes in orthognathic surgery via our localized shape analysis framework. Methods: In our setting, planning and surgical simulation is performed using the surgery planning software CMFapp. We then employ SPHARM-PDM to measure the difference between pre-surgery and virtually simulated post-surgery models. This SPHARM-PDM shape framework is validated for use with craniofacial structures by simulating known 3D surgical changes within CMFapp. Results: Our results show that SPHARM-PDM analysis accurately measures surgical displacements, compared with known displacement values. Color maps of virtually simulated surgical displacements visualize surface distances that precisely locate the changes, and difference vectors indicate the directionality and magnitude of changes. Conclusions: SPHARM-PDM-based quantification of surgical outcome is feasible. When compared to prior solutions, our method has the potential to make the surgical planning process more flexible, increase the level of detail and accuracy of the plan, yield higher operative precision and control, and enhance follow-up and documentation of clinical cases. PMID:21161693

  7. Initial water quantification results using neutron computed tomography

    Science.gov (United States)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  8. Quantification of rice bran oil in oil blends

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, R.; Sharma, H. K.; Sengar, G.

    2012-11-01

    Blends consisting of physically refined rice bran oil (PRBO) with sunflower oil (SnF) and with safflower oil (SAF) in different proportions were analyzed for various physicochemical parameters. The quantification of pure rice bran oil in the blended oils was carried out using different methods, including gas chromatography, HPLC, ultrasonic velocity and methods based on physicochemical parameters. Parameters such as ultrasonic velocity, relative association and acoustic impedance at 2 MHz, iodine value, palmitic acid content and oryzanol content changed significantly with increasing proportions of PRBO in the blended oils. These parameters were selected as dependent variables, with the PRBO proportion (%) as the independent variable. The study revealed that regression equations based on the oryzanol content, palmitic acid composition, ultrasonic velocity, relative association, acoustic impedance and iodine value can be used for the quantification of rice bran oil in blended oils. Rice bran oil can easily be quantified in blended oils from the oryzanol content by HPLC, even at the 1% level. The palmitic acid content can also be used as an indicator to quantify rice bran oil at or above the 20% level in blended oils, whereas the methods based on ultrasonic velocity, acoustic impedance and relative association showed initial promise for the quantification of rice bran oil. (Author) 23 refs.

  9. Development of Accident Scenarios and Quantification Methodology for RAON Accelerator

    International Nuclear Information System (INIS)

    Lee, Yongjin; Jae, Moosung

    2014-01-01

    The RISP (Rare Isotope Science Project) plans to provide neutron-rich isotopes (RIs) and stable heavy-ion beams. The accelerator is defined as a radiation production system under the Nuclear Safety Law and therefore requires strict operating procedures and safety assurance to prevent radiation exposure. To satisfy this condition, the potential risk of the accelerator needs to be evaluated from the design stage itself. Although some PSA studies have been conducted for accelerators, most focus not on general accident sequences but on simple descriptions of accidents. In this paper, general accident scenarios are developed with event trees and a new quantification methodology for the event tree is derived. Initiating events that may occur in the accelerator are selected, and from them the accident scenarios of the accelerator facility are developed with event trees. These results can be used as basic data for future risk assessments of the accelerator. After analyzing the probability of each heading, quantification can be conducted and the significance of the accident result evaluated. Once accident scenarios for external events are developed, the risk assessment of the entire accelerator facility will be complete. The presented quantification techniques can produce reliable data and reduce the uncertainty of the event tree.

  10. A fast and robust hepatocyte quantification algorithm including vein processing

    Directory of Open Access Journals (Sweden)

    Homeyer André

    2010-03-01

    Background: Quantification of different types of cells is often needed for the analysis of histological images. In our project, we compute the relative number of proliferating hepatocytes for the evaluation of the regeneration process after partial hepatectomy in normal rat livers. Results: Our automatic approach for hepatocyte (HC) quantification is suitable for the analysis of an entire digitized histological section given as a series of images. It is the main part of an automatic hepatocyte quantification tool that allows the computation of the ratio between the number of proliferating HC nuclei and the total number of HC nuclei for a series of images in one processing run. The processing pipeline yields valuable results for a wide range of images with different properties without additional parameter adjustment. Comparing the obtained segmentation results with a manually retrieved segmentation mask considered to be the ground truth, we achieve sensitivity above 90% and a false positive fraction below 15%. Conclusions: The proposed automatic procedure gives results with high sensitivity and low false positive fraction and can be applied to process entire stained sections.

  11. Initial water quantification results using neutron computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.K. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)], E-mail: axh174@psu.edu; Shi, L.; Brenizer, J.S.; Mench, M.M. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)

    2009-06-21

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  12. Toponomics method for the automated quantification of membrane protein translocation.

    Science.gov (United States)

    Domanova, Olga; Borbe, Stefan; Mühlfeld, Stefanie; Becker, Martin; Kubitz, Ralf; Häussinger, Dieter; Berlage, Thomas

    2011-09-19

    Intra-cellular and inter-cellular protein translocation can be observed by microscopic imaging of tissue sections prepared immunohistochemically. A manual densitometric analysis is time-consuming, subjective and error-prone. An automated quantification is faster, more reproducible, and should yield results comparable to manual evaluation. The automated method presented here was developed on rat liver tissue sections to study the translocation of bile salt transport proteins in hepatocytes. For validation, the cholestatic liver state was compared to the normal biological state. An automated quantification method was developed to analyze the translocation of membrane proteins and evaluated in comparison to an established manual method. Firstly, regions of interest (membrane fragments) are identified in confocal microscopy images. Further, densitometric intensity profiles are extracted orthogonally to membrane fragments, following the direction from the plasma membrane to cytoplasm. Finally, several different quantitative descriptors were derived from the densitometric profiles and were compared regarding their statistical significance with respect to the transport protein distribution. Stable performance, robustness and reproducibility were tested using several independent experimental datasets. A fully automated workflow for the information extraction and statistical evaluation has been developed and produces robust results. New descriptors for the intensity distribution profiles were found to be more discriminative, i.e. more significant, than those used in previous research publications for the translocation quantification. The slow manual calculation can be substituted by the fast and unbiased automated method.

  13. [DNA quantification of blood samples pre-treated with pyramidon].

    Science.gov (United States)

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

    To study DNA quantification and STR typing of samples pre-treated with pyramidon. Blood samples of ten unrelated individuals were anticoagulated in EDTA and blood stains were made on filter paper. The samples were divided into six groups according to storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed by PCR-STR fluorescent technology. For a given extraction method, DNA yield decreased gradually with storage time after pre-treatment with pyramidon. For a given storage time, DNA quantification differed significantly between extraction methods. Sixteen-locus DNA typing was obtained in 90.56% of samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can still be achieved within 24 h. Magnetic bead-based extraction is the best of the three methods for DNA extraction and STR profiling.

  14. HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.

    Science.gov (United States)

    Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil

    2017-04-01

    Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the perfect combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Preliminary study on computer automatic quantification of brain atrophy

    International Nuclear Information System (INIS)

    Li Chuanfu; Zhou Kangyuan

    2006-01-01

    Objective: To study the variability of normal brain volume with sex and age, and to put forward an objective standard for computer automatic quantification of brain atrophy. Methods: The cranial volume, brain volume and brain parenchymal fraction (BPF) of 487 cases of brain atrophy (310 males, 177 females) and 1901 normal subjects (993 males, 908 females) were calculated with the newly developed algorithm of automatic quantification for brain atrophy. With the technique of polynomial curve fitting, the mathematical relationship of BPF with age in normal subjects was analyzed. Results: The cranial volume, brain volume and BPF of normal subjects were (1 271 322 ± 128 699) mm³, (1 211 725 ± 122 077) mm³ and (95.3471 ± 2.3453)%, respectively, and those of atrophy subjects were (1 276 900 ± 125 180) mm³, (1 203 400 ± 117 760) mm³ and (91.8115 ± 2.3035)%, respectively. The difference in BPF between the two groups was significant (P < 0.05). The expression P(x) = -0.0008x² + 0.0193x + 96.9999 accurately described the mathematical relationship between BPF and age in normal subjects (lower limit of the 95% CI: y = -0.0008x² + 0.0184x + 95.1090). Conclusion: The lower limit of the 95% confidence interval of the mathematical relationship between BPF and age could be used as an objective criterion for automatic quantification of brain atrophy by computer. (authors)
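    The proposed criterion is directly computable from the two fitted expressions quoted above; the subject values below are hypothetical:

```python
def bpf_expected(age):
    """Fitted mean brain parenchymal fraction (%) vs. age, from the study."""
    return -0.0008 * age**2 + 0.0193 * age + 96.9999

def bpf_lower_limit(age):
    """Lower limit of the 95% CI, proposed as the atrophy criterion."""
    return -0.0008 * age**2 + 0.0184 * age + 95.1090

age, measured_bpf = 65, 92.0          # hypothetical subject
print(f"expected {bpf_expected(age):.2f} %, threshold {bpf_lower_limit(age):.2f} %")
print("atrophy" if measured_bpf < bpf_lower_limit(age) else "normal")
```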

  16. Attitudes and environmental risk related to the use of plastic bags ...

    African Journals Online (AJOL)

    Environmental risk-taking is a topical issue in Benin. The aim of this study was to investigate the relationship between environmental attitudes and risk taking through the use of plastic bags by students. Therefore, we tested the hypothesis that there is a link between environmental attitudes and risk taking through the use of ...

  17. 43 CFR 11.71 - Quantification phase-service reduction quantification.

    Science.gov (United States)

    2010-10-01

    ... of the economic methodology to be used; (4) Technical feasibility, as that phrase is used in this... identified in an environmental impact statement or other comparable environmental analysis, and the decision... the enactment of CERCLA; or (3) The application of a pesticide product registered under the Federal...

  18. Quantification and Multi-purpose Allocation of Water Resources in a Dual-reservoir System

    Science.gov (United States)

    Salami, Y. D.

    2017-12-01

    Transboundary rivers that run through separate water management jurisdictions sometimes experience competitive water usage. Where the river has multiple existing or planned dams along its course, quantification and efficient allocation of water for such purposes as hydropower generation, irrigation for agriculture, and water supply can be a challenge. This problem is even more pronounced when large parts of the river basin are located in semi-arid regions known for water insecurity, poor crop yields from irrigation scheme failures, and human population displacement arising from water-related conflict. This study seeks to mitigate the impacts of such factors on the Kainji-Jebba dual-reservoir system located along the Niger River in Africa by seasonally quantifying and efficiently apportioning water to all stipulated uses of both dams, thereby improving operational policy and long-term water security. Historical storage fluctuations (18 km³ to 5 km³) and flows into and out of both reservoirs were analyzed for relationships to such factors as surrounding catchment contribution, dam operational policies, and irrigation and hydropower requirements. Optimum values of the aforementioned parameters were then determined by simulations based upon hydrological contributions and withdrawals and worst-case scenarios of natural and anthropogenic conditions (like the annual probability of reservoir depletion) affecting water availability and allocation. Finally, quantification and optimized allocation of water was done based on needs for hydropower, irrigation for agriculture, water supply, and storage evacuation for flood control. Results revealed that water supply potential increased by 69%, average agricultural yield improved by 36%, and hydropower generation increased by 54% and 66% at the upstream and downstream dams respectively. Lessons learned from this study may help provide a robust and practical means of water resources management in similar river basins and multi
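
    Allocation problems of this kind are often posed as constrained optimization. The sketch below is a deliberately simplified, hypothetical linear program, not the study's model: the available volume, per-use benefit weights, and bounds are invented placeholders standing in for one season's water budget of a single reservoir.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical seasonal water budget for one reservoir (km^3);
# weights and bounds are illustrative, not the study's values.
available = 10.0                      # inflow + usable storage drawdown
benefit = np.array([3.0, 2.0, 2.5])   # per-km^3 value: hydropower, irrigation, supply

c = -benefit                          # linprog minimizes, so negate benefits
A_ub = [[1.0, 1.0, 1.0]]              # total allocation cannot exceed availability
b_ub = [available]
bounds = [(2.0, 8.0),                 # hydropower release (min commitment, cap)
          (1.0, 5.0),                 # irrigation withdrawal
          (1.5, 4.0)]                 # municipal water supply

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
hydro, irrig, supply = res.x
print(f"hydropower {hydro:.2f}, irrigation {irrig:.2f}, supply {supply:.2f} km^3")
```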

  19. Improved Diagnoses and Quantification of Fusarium virguliforme, Causal Agent of Soybean Sudden Death Syndrome.

    Science.gov (United States)

    Wang, Jie; Jacobs, Janette L; Byrne, Jan M; Chilvers, Martin I

    2015-03-01

    Fusarium virguliforme (syn. F. solani f. sp. glycines) is the primary causal pathogen responsible for soybean sudden death syndrome (SDS) in North America. Diagnosis of SDS is difficult because symptoms can be inconsistent or similar to several soybean diseases and disorders. Additionally, quantification and identification of F. virguliforme by traditional dilution plating of soil or ground plant tissue is problematic due to the slow growth rate and plastic morphology of F. virguliforme. Although several real-time quantitative polymerase chain reaction (qPCR)-based assays have been developed for F. virguliforme, the performance of those assays does not allow for accurate quantification of F. virguliforme due to the reclassification of the F. solani species complex. In this study, we developed a TaqMan qPCR assay based on the ribosomal DNA (rDNA) intergenic spacer (IGS) region of F. virguliforme. Specificity of the assay was demonstrated by challenging it with genomic DNA of closely related Fusarium spp. and commonly encountered soilborne fungal pathogens. The detection limit of this assay was determined to be 100 fg of pure F. virguliforme genomic DNA or 100 macroconidia in 0.5 g of soil. An exogenous control was multiplexed with the assay to evaluate for PCR inhibition. Target locus copy number variation had minimal impact, with a range of rDNA copy number from 138 to 233 copies per haploid genome, resulting in a minor variation of up to 0.76 cycle threshold values between strains. The qPCR assay is transferable across platforms, as validated on the primary real-time PCR platform used in the Northcentral region of the National Plant Diagnostic Network. A conventional PCR assay for F. virguliforme detection was also developed and validated for use in situations where qPCR is not possible.
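
    Quantification with a TaqMan-style assay ultimately rests on a standard curve relating threshold cycle to starting quantity. A minimal sketch with invented dilution-series numbers (not the assay's published calibration data):

```python
import numpy as np

# Hypothetical standard curve: Ct values measured on serial dilutions
# of pure F. virguliforme genomic DNA (quantities in fg per reaction).
std_qty = np.array([1e5, 1e4, 1e3, 1e2])
std_ct = np.array([21.1, 24.5, 27.9, 31.3])

# Linear regression of Ct against log10(quantity): Ct = m*log10(q) + b.
m, b = np.polyfit(np.log10(std_qty), std_ct, 1)
efficiency = 10 ** (-1.0 / m) - 1          # amplification efficiency

def quantify(ct: float) -> float:
    """Interpolate an unknown's DNA quantity (fg) from its Ct."""
    return 10 ** ((ct - b) / m)

print(f"slope {m:.2f}, efficiency {efficiency:.1%}")
print(f"unknown at Ct 29.0 ~ {quantify(29.0):.0f} fg")
```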

  20. Efficient Quantification of Uncertainties in Complex Computer Code Results, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  1. Cutset Quantification Error Evaluation for Shin-Kori 1 and 2 PSA model

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2009-01-01

    Probabilistic safety assessments (PSA) for nuclear power plants (NPPs) are based on the minimal cut set (MCS) quantification method. In PSAs, the risk and importance measures are computed from a cutset equation, mainly by using approximations. The conservatism of the approximations is also a source of quantification uncertainty. In this paper, exact MCS quantification methods based on 'sum of disjoint products' (SDP) logic and the inclusion-exclusion formula are applied, and the conservatism of the MCS quantification results in the Shin-Kori 1 and 2 PSA is evaluated
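
    For a small set of minimal cut sets with independent basic events, the exact top-event probability from the inclusion-exclusion formula can be compared directly against the common rare-event approximation. A minimal sketch, with hypothetical event probabilities and cut sets:

```python
from itertools import combinations
from math import prod

# Hypothetical basic-event probabilities and minimal cut sets.
p = {"A": 1e-2, "B": 3e-2, "C": 5e-3, "D": 2e-2}
mcs = [{"A", "B"}, {"B", "C"}, {"A", "D"}]

def cutset_prob(events):
    """Probability that all given basic events occur (independence assumed)."""
    return prod(p[e] for e in events)

# Rare-event approximation: simple sum over cut sets (conservative).
rare_event = sum(cutset_prob(c) for c in mcs)

# Exact top-event probability via inclusion-exclusion:
# P(U Ci) = sum P(Ci) - sum P(Ci & Cj) + ...
# The intersection of cut-set events is the union of their basic events.
exact = 0.0
for k in range(1, len(mcs) + 1):
    for combo in combinations(mcs, k):
        exact += (-1) ** (k + 1) * cutset_prob(set().union(*combo))

print(f"rare-event approx: {rare_event:.3e}")
print(f"inclusion-exclusion (exact): {exact:.3e}")
```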

  2. Quantification, improvement, and harmonization of small lesion detection with state-of-the-art PET

    Energy Technology Data Exchange (ETDEWEB)

    Vos, Charlotte S. van der [Radboud University Medical Centre, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); University of Twente, MIRA Institute for Biomedical Technology and Technical Medicine, Enschede (Netherlands); Koopman, Danielle [University of Twente, MIRA Institute for Biomedical Technology and Technical Medicine, Enschede (Netherlands); Isala Hospital, Department of Nuclear Medicine, Zwolle (Netherlands); Rijnsdorp, Sjoerd; Arends, Albert J. [Catharina Hospital, Department of Medical Physics, Eindhoven (Netherlands); Boellaard, Ronald [University of Groningen, University Medical Centre Groningen, Department of Nuclear Medicine and Molecular Imaging, Groningen (Netherlands); VU University Medical Center, Department of Radiology and Nuclear Medicine, Amsterdam (Netherlands); Dalen, Jorn A. van [Isala Hospital, Department of Nuclear Medicine, Zwolle (Netherlands); Isala, Department of Medical Physics, Zwolle (Netherlands); Lubberink, Mark [Uppsala University, Department of Surgical Sciences, Uppsala (Sweden); Uppsala University Hospital, Department of Medical Physics, Uppsala (Sweden); Willemsen, Antoon T.M. [University of Groningen, University Medical Centre Groningen, Department of Nuclear Medicine and Molecular Imaging, Groningen (Netherlands); Visser, Eric P. [Radboud University Medical Centre, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands)

    2017-08-15

    In recent years, there have been multiple advances in positron emission tomography/computed tomography (PET/CT) that improve cancer imaging. The present generation of PET/CT scanners introduces new hardware, software, and acquisition methods. This review describes these new developments, which include time-of-flight (TOF), point-spread-function (PSF), maximum-a-posteriori (MAP) based reconstruction, smaller voxels, respiratory gating, metal artefact reduction, and administration of quadratic weight-dependent ¹⁸F-fluorodeoxyglucose (FDG) activity. Also, hardware developments such as continuous bed motion (CBM), (digital) solid-state photodetectors and combined PET and magnetic resonance (MR) systems are explained. These novel techniques have a significant impact on cancer imaging, as they result in better image quality, improved small lesion detectability, and more accurate quantification of radiopharmaceutical uptake. This influences cancer diagnosis and staging, as well as therapy response monitoring and radiotherapy planning. Finally, the possible impact of these developments on the European Association of Nuclear Medicine (EANM) guidelines and EANM Research Ltd. (EARL) accreditation for FDG-PET/CT tumor imaging is discussed. (orig.)

  3. Quantification, improvement, and harmonization of small lesion detection with state-of-the-art PET

    International Nuclear Information System (INIS)

    Vos, Charlotte S. van der; Koopman, Danielle; Rijnsdorp, Sjoerd; Arends, Albert J.; Boellaard, Ronald; Dalen, Jorn A. van; Lubberink, Mark; Willemsen, Antoon T.M.; Visser, Eric P.

    2017-01-01

    In recent years, there have been multiple advances in positron emission tomography/computed tomography (PET/CT) that improve cancer imaging. The present generation of PET/CT scanners introduces new hardware, software, and acquisition methods. This review describes these new developments, which include time-of-flight (TOF), point-spread-function (PSF), maximum-a-posteriori (MAP) based reconstruction, smaller voxels, respiratory gating, metal artefact reduction, and administration of quadratic weight-dependent ¹⁸F-fluorodeoxyglucose (FDG) activity. Also, hardware developments such as continuous bed motion (CBM), (digital) solid-state photodetectors and combined PET and magnetic resonance (MR) systems are explained. These novel techniques have a significant impact on cancer imaging, as they result in better image quality, improved small lesion detectability, and more accurate quantification of radiopharmaceutical uptake. This influences cancer diagnosis and staging, as well as therapy response monitoring and radiotherapy planning. Finally, the possible impact of these developments on the European Association of Nuclear Medicine (EANM) guidelines and EANM Research Ltd. (EARL) accreditation for FDG-PET/CT tumor imaging is discussed. (orig.)

  4. Effects of humic acid on DNA quantification with Quantifiler® Human DNA Quantification kit and short tandem repeat amplification efficiency.

    Science.gov (United States)

    Seo, Seung Bum; Lee, Hye Young; Zhang, Ai Hua; Kim, Hye Yeon; Shin, Dong Hoon; Lee, Soong Deok

    2012-11-01

    Correct DNA quantification is an essential part of obtaining reliable STR typing results. Forensic DNA analysts often use commercial kits for DNA quantification; among them, real-time-based DNA quantification kits are most frequently used. Incorrect DNA quantification due to the presence of PCR inhibitors may affect experiment results. In this study, we examined the degree to which DNA quantification results are altered in DNA samples containing a PCR inhibitor, using a Quantifiler® Human DNA Quantification kit. For the experiments, we prepared approximately 0.25 ng/μl DNA samples containing various concentrations of humic acid (HA). The quantification results were 0.194-0.303 ng/μl at 0-1.6 ng/μl HA (final concentration in the Quantifiler reaction) and 0.003-0.168 ng/μl at 2.4-4.0 ng/μl HA. Most DNA quantities were undetermined when the HA concentration was higher than 4.8 ng/μl. The Ct values of the internal PCR control (IPC) were 28.0-31.0, 36.5-37.1, and undetermined at 0-1.6, 2.4, and 3.2 ng/μl HA, respectively. These results indicate that underestimated DNA quantification results may be obtained for DNA samples with high IPC Ct values. Thus, researchers should carefully interpret DNA quantification results. We additionally examined the effects of HA on STR amplification by using an Identifiler® kit and a MiniFiler™ kit. Based on the results of this study, it is thought that a better understanding of the various effects of HA will help researchers recognize and manipulate samples containing HA.
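
    In practice, results like these translate into a simple triage rule on the internal PCR control. The sketch below is a hypothetical helper, with threshold values loosely based on the Ct ranges reported above rather than any kit's official interpretation guidance:

```python
# Hypothetical thresholds based on the pattern reported above: IPC Ct
# values well above the uninhibited range suggest PCR inhibition and
# an unreliable (likely underestimated) DNA quantification result.
IPC_CT_NORMAL_MAX = 31.0   # upper end of IPC Ct without inhibitor
IPC_CT_FAIL = 37.5         # beyond this, treat the quantification as failed

def interpret_quant(dna_ng_per_ul: float, ipc_ct: float) -> str:
    """Triage a Quantifiler-style result using its internal PCR control."""
    if ipc_ct >= IPC_CT_FAIL:
        return "inhibited: repeat after sample clean-up or dilution"
    if ipc_ct > IPC_CT_NORMAL_MAX:
        return (f"possible inhibition (IPC Ct {ipc_ct:.1f}): "
                f"{dna_ng_per_ul:.3f} ng/uL may be underestimated")
    return f"{dna_ng_per_ul:.3f} ng/uL (IPC within normal range)"

print(interpret_quant(0.021, 36.8))
```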

  5. Two-stream Convolutional Neural Network for Methane Emissions Quantification

    Science.gov (United States)

    Wang, J.; Ravikumar, A. P.; McGuire, M.; Bell, C.; Tchapmi, L. P.; Brandt, A. R.

    2017-12-01

    Methane, a key component of natural gas, has a 25x higher global warming potential than carbon dioxide on a 100-year basis. Accurate monitoring and mitigation of methane emissions require cost-effective detection and quantification technologies. Optical gas imaging, one of the most commonly used leak detection technologies, adopted by the Environmental Protection Agency, cannot estimate leak sizes. In this work, we harness advances in computer science to allow for rapid and automatic leak quantification. In particular, we utilize two-stream deep Convolutional Networks (ConvNets) to estimate leak size by capturing complementary spatial information from still plume frames and temporal information from plume motion between frames. We built large leak datasets for training and evaluation purposes by collecting about 20 videos (i.e., 397,400 frames) of leaks. The videos were recorded at six distances from the source, covering 10-60 ft. Leak sources included natural gas well-heads, separators, and tanks. All frames were labeled with a true leak size, which has eight levels ranging from 0 to 140 MCFH. Preliminary analysis shows that the two-stream ConvNet provides a significant accuracy advantage over single-stream ConvNets. The spatial-stream ConvNet achieves an accuracy of 65.2% by extracting important features, including texture, plume area, and pattern. The temporal stream, fed by the results of optical flow analysis, results in an accuracy of 58.3%. The integration of the two streams gives a combined accuracy of 77.6%. For future work, we will split the training and testing datasets in distinct ways in order to test the generalization of the algorithm to different leak sources. Several analytic metrics, including the confusion matrix and visualization of key features, will be used to understand accuracy rates and occurrences of false positives. The quantification algorithm can help to find and fix super-emitters, and improve the cost-effectiveness of leak detection and repair
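
    The two-stream architecture itself is straightforward to sketch. The PyTorch model below is a minimal stand-in, not the authors' network: the layer sizes, the 10-frame optical-flow stack, and late fusion by averaging logits over the eight leak-size bins are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class StreamCNN(nn.Module):
    """Small CNN shared by both streams; only the input channels differ."""
    def __init__(self, in_ch: int, n_classes: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

class TwoStreamNet(nn.Module):
    """Late fusion: average the logits of the spatial and temporal streams."""
    def __init__(self, n_classes: int = 8, flow_stack: int = 10):
        super().__init__()
        self.spatial = StreamCNN(in_ch=3, n_classes=n_classes)
        self.temporal = StreamCNN(in_ch=2 * flow_stack, n_classes=n_classes)

    def forward(self, frame, flow):
        return 0.5 * (self.spatial(frame) + self.temporal(flow))

net = TwoStreamNet()
frame = torch.randn(4, 3, 112, 112)   # still plume frames (RGB)
flow = torch.randn(4, 20, 112, 112)   # 10 stacked (u, v) optical-flow fields
logits = net(frame, flow)             # scores over 8 leak-size bins
print(logits.shape)                   # torch.Size([4, 8])
```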

  6. Quantification of breast arterial calcification using full field digital mammography

    International Nuclear Information System (INIS)

    Molloi, Sabee; Xu Tong; Ducote, Justin; Iribarren, Carlos

    2008-01-01

    Breast arterial calcification is commonly detected on some mammograms. Previous studies indicate that breast arterial calcification is evidence of general atherosclerotic vascular disease and it may be a useful marker of coronary artery disease. It can potentially be a useful tool for assessment of coronary artery disease in women since mammography is widely used as a screening tool for early detection of breast cancer. However, there are currently no available techniques for quantification of calcium mass using mammography. The purpose of this study was to determine whether it is possible to quantify breast arterial calcium mass using standard digital mammography. An anthropomorphic breast phantom along with a vessel calcification phantom was imaged using a full field digital mammography system. Densitometry was used to quantify calcium mass. A calcium calibration measurement was performed at each phantom thickness and beam energy. The known (K) and measured (M) calcium mass on 5 and 9 cm thickness phantoms were related by M=0.964K-0.288 mg (r=0.997 and SEE=0.878 mg) and M=1.004K+0.324 mg (r=0.994 and SEE=1.32 mg), respectively. The results indicate that accurate calcium mass measurements can be made without correction for scatter glare as long as careful calcium calibration is made for each breast thickness. The results also indicate that composition variations and differences of approximately 1 cm between calibration phantom and breast thickness introduce only minimal error in calcium measurement. The uncertainty in magnification is expected to cause up to 5% and 15% error in calcium mass for 5 and 9 cm breast thicknesses, respectively. In conclusion, a densitometry technique for quantification of breast arterial calcium mass was validated using standard full field digital mammography. The results demonstrated the feasibility and potential utility of the densitometry technique for accurate quantification of breast arterial calcium mass using standard digital

  7. Development of the quantification procedures for in situ XRF analysis

    International Nuclear Information System (INIS)

    Kump, P.; Necemer, M.; Rupnik, P.

    2005-01-01

    For in situ XRF applications, two excitation systems (radioisotope- and tube-excited) and an X-ray spectrometer based on an Si-PIN detector were assembled and used. The radioisotope excitation system with an Am-241 source was assembled into a prototype of a compact XRF analyser, PEDUZO-01, which is also applicable to field work. The existing quantification software QAES (quantitative analysis of environmental samples) was assessed to be adequate for field work as well. The QAES software was also integrated into new software attached to the developed XRF analyser PEDUZO-01, which includes spectrum acquisition, spectrum analysis and quantification, and runs in the LABVIEW environment. In assessing the Si-PIN-based X-ray spectrometers and the QAES quantification software for field work, a comparison was made with results obtained by a standard Si(Li)-based spectrometer. The results of this study prove that the use of this spectrometer is adequate for field work. This work was accepted for publication in X-Ray Spectrometry. The application of a simple preparation of solid samples was studied in view of the analytical results obtained. It was established that, under definite conditions, the results are not very different from those obtained with a homogenized sample pressed into a pellet. The influence of particle size and mineralogical effects on quantitative results was studied, and a simple sample preparation kit was proposed. Sample preparation for the analysis of water samples by precipitation with APDC, and aerosol analysis using a dichotomous sampler, were also adapted and used in field work. An adequate sample preparation kit was proposed. (author)

  8. Simultaneous quantification of flavonoids and triterpenoids in licorice using HPLC.

    Science.gov (United States)

    Wang, Yuan-Chuen; Yang, Yi-Shan

    2007-05-01

    Numerous bioactive compounds are present in licorice (Glycyrrhizae Radix), including flavonoids and triterpenoids. In this study, a reversed-phase high-performance liquid chromatography (HPLC) method was developed for the simultaneous quantification of three flavonoids (liquiritin, liquiritigenin and isoliquiritigenin) and four triterpenoids (glycyrrhizin, 18alpha-glycyrrhetinic acid, 18beta-glycyrrhetinic acid and 18beta-glycyrrhetinic acid methyl ester) from licorice, and was further used to quantify these seven compounds in 20 different licorice samples. Specifically, the reversed-phase HPLC was performed with a gradient mobile phase composed of 25 mM phosphate buffer (pH 2.5)-acetonitrile, with gradient elution steps as follows: 0 min, 100:0; 10 min, 80:20; 50 min, 70:30; 73 min, 50:50; 110 min, 50:50; 125 min, 20:80; 140 min, 20:80. Peaks were detected at 254 nm. With this technique, rather good specificity was obtained for the separation of these seven compounds. The regression coefficients of the linear equations for the seven compounds lay between 0.9978 and 0.9992. The limits of detection and quantification lay in the ranges of 0.044-0.084 and 0.13-0.25 microg/ml, respectively. The relative recovery rates for the seven compounds lay between 96.63+/-2.43% and 103.55+/-2.77%. Coefficients of variation for intra-day and inter-day precision lay in the ranges of 0.20-1.84% and 0.28-1.86%, respectively. Based upon our validation results, this analytical technique is a convenient method for the simultaneous quantification of numerous bioactive compounds derived from licorice, featuring good quantification parameters, accuracy and precision.
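
    Validation figures such as the detection and quantification limits above are typically derived from the calibration curve. A minimal sketch, using invented calibration points for a single analyte and the common ICH-style estimates LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope:

```python
import numpy as np

# Hypothetical calibration data for one analyte (e.g., liquiritin):
# concentrations in ug/mL against HPLC peak areas.
conc = np.array([0.5, 1.0, 5.0, 10.0, 25.0, 50.0])
area = np.array([12.1, 24.6, 121.9, 244.0, 610.8, 1223.5])

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
sigma = np.std(area - pred, ddof=2)    # residual SD of the regression

lod = 3.3 * sigma / slope              # limit of detection
loq = 10.0 * sigma / slope             # limit of quantification
r = np.corrcoef(conc, area)[0, 1]      # calibration linearity

print(f"r = {r:.4f}, LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```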

  9. The role of PET quantification in cardiovascular imaging.

    Science.gov (United States)

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. The myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (¹⁸FDG) and rest perfusion imaging. The myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of the coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in automated and rapid fashion have been developed for ¹³N-ammonia, ¹⁵O-water and ⁸²Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of ⁸²Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as ¹⁸F-flurpiridaz may allow further improvements in the disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of ¹⁸FDG and ¹⁸F-sodium fluoride tracers in carotids, aorta and coronary arteries

  10. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Conrad, Patrick [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Bigoni, Daniele [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Parno, Matthew [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-06-09

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT

  11. Dermatologic radiotherapy and thyroid cancer. Dose measurements and risk quantification

    International Nuclear Information System (INIS)

    Goldschmidt, H.; Gorson, R.O.; Lassen, M.

    1983-01-01

    Thyroid doses for various dermatologic radiation techniques were measured with thermoluminescent dosimeters and ionization rate meters in an Alderson-Rando anthropomorphic phantom. The effects of changes in radiation quality and of the use or nonuse of treatment cones and thyroid shields were evaluated in detail. The results indicate that the potential risk of radiogenic thyroid cancer is very small when proper radiation protection measures are used. The probability of radiogenic thyroid cancer developing and the potential mortality risk were assessed quantitatively for each measurement. The quantification of radiation risks allows comparisons with risks of other therapeutic modalities and the common hazards of daily life

  12. HPLC Quantification of Cytotoxic Compounds from Aspergillus niger

    Directory of Open Access Journals (Sweden)

    Paula Karina S. Uchoa

    2017-01-01

    A high-performance liquid chromatography method was developed and validated for the quantification of the cytotoxic compounds produced by a marine strain of Aspergillus niger. The fungus was grown in malt peptone dextrose (MPD), potato dextrose yeast (PDY), and mannitol peptone yeast (MnPY) media during 7, 14, 21, and 28 days, and the natural products were identified against standard compounds. The validation parameters obtained were selectivity, linearity (coefficient of correlation > 0.99), precision (relative standard deviation below 5%), and accuracy (recovery > 96.

  13. Improved Method for PD-Quantification in Power Cables

    DEFF Research Database (Denmark)

    Holbøll, Joachim T.; Villefrance, Rasmus; Henriksen, Mogens

    1999-01-01

    In this paper, a method is described for improved quantification of partial discharges (PD) in power cables. The method is suitable for PD detection and location systems in the MHz range, where pulse attenuation and distortion along the cable cannot be neglected. The system transfer function was calculated and measured in order to form the basis for magnitude calculation after each measurement. Limitations and capabilities of the method are discussed and related to relevant field applications of high-frequency PD measurements. Methods for increased signal-to-noise ratio are easily implemented.

  14. 'Motion frozen' quantification and display of myocardial perfusion gated SPECT

    International Nuclear Information System (INIS)

    Slomka, P.J.; Hurwitz, G.A.; Baddredine, M.; Baranowski, J.; Aladl, U.E.

    2002-01-01

    Aim: Gated SPECT imaging incorporates both functional and perfusion information of the left ventricle (LV). However, perfusion data are confounded by the effect of ventricular motion. Most existing quantification paradigms simply add all gated frames and then extract the perfusion information from the static images, discarding the effects of cardiac motion. In an attempt to improve the reliability and accuracy of cardiac SPECT quantification, we propose to eliminate the LV motion prior to perfusion quantification via an automated image-warping algorithm. Methods: A pilot series of 14 male and 11 female gated stress SPECT images acquired with 8 time bins were co-registered to the coordinates of the 3D normal templates. Subsequently, the LV endo- and epicardial 3D points (300-500) were identified on the end-systolic (ES) and end-diastolic (ED) frames, defining the ES-ED motion vectors. A nonlinear image-warping algorithm (thin-plate spline) was then applied to warp the end-systolic frame onto the end-diastolic frame using the corresponding ES-ED motion vectors. The remaining 6 intermediate frames were also transformed to the ED coordinates using fractions of the motion vectors. The warped images were then summed to provide the LV perfusion image in the ED phase but with counts from the full cycle. Results: The identification of the ED/ES corresponding points was successful in all cases. The corrected displacement between ED and ES images was up to 25 mm. The summed images had the appearance of the ED frames but were much less noisy, since all the counts were used. The spatial resolution of these images appeared higher than that of summed gated images, especially in the female scans. These 'motion frozen' images could be displayed and quantified as regular non-gated tomograms, including the polar map paradigm. Conclusions: This image processing technique may improve the effective image resolution of summed gated myocardial perfusion images used for
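
    The core of the approach, interpolating a dense displacement field from sparse point correspondences and warping each frame before summation, can be sketched with off-the-shelf tools. The code below is an illustrative approximation, not the authors' implementation: it assumes frames ordered from ED to ES, uses SciPy's thin-plate-spline RBF interpolator for the field, and ignores practical concerns such as evaluation cost on fine voxel grids.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.ndimage import map_coordinates

def motion_frozen_sum(frames, ctrl_pts, ed_to_es_vec):
    """Warp gated frames to end-diastolic coordinates and sum the counts.

    frames:        3D arrays ordered from ED (no motion) to ES (full motion)
    ctrl_pts:      (n, 3) LV surface points identified on the ED frame
    ed_to_es_vec:  (n, 3) displacement of each point from ED to ES
    """
    shape = frames[0].shape
    grid = np.indices(shape).reshape(3, -1).T        # every voxel coordinate
    summed = np.zeros(shape, dtype=float)
    n = len(frames)
    for i, frame in enumerate(frames):
        f = i / (n - 1) if n > 1 else 0.0            # fraction of full motion
        tps = RBFInterpolator(ctrl_pts, f * ed_to_es_vec,
                              kernel="thin_plate_spline")
        disp = tps(grid)                             # dense displacement field
        # Pull each ED voxel back from its displaced position in frame i.
        src = (grid + disp).T.reshape((3,) + shape)
        summed += map_coordinates(frame, src, order=1, mode="nearest")
    return summed
```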

  15. Molecular nonlinear dynamics and protein thermal uncertainty quantification

    Science.gov (United States)

    Xia, Kelin; Wei, Guo-Wei

    2014-01-01

    This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction. PMID:24697365

  16. Temporal and spatial quantification of farm and landscape functions

    DEFF Research Database (Denmark)

    Andersen, Peter Stubkjær

    ..., residence, habitat, and recreation; development of a method for quantifying farm functionality and assessing multifunctionality; and definition of a farm typology based on multifunctionality strategies. Empirical data from farm interviews were used in the study to test the developed methods. The results indicate that functionality generally decreases and that a tendency toward increased segregation of the rural landscape is observed. In perspective, further studies are needed on quantification in tangible units, on synergies and trade-offs between functions at different scales, and on correlations between structures and functions.

  17. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens

    2015-11-26

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets, such as production, seismic, electromagnetic, gravimetric and surface deformation data, for improving the history matching process. The framework can consist of a geological model that is interfaced with a reservoir simulator. The reservoir simulator can interface with seismic, electromagnetic, gravimetric and surface deformation modules to predict the corresponding observations. The observations can then be incorporated into a recursive filter that subsequently updates the model state and parameter distributions, providing a general framework to quantify, and eventually reduce with the data, the uncertainty in the estimated reservoir state and parameters.

  18. Review of some aspects of human reliability quantification

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.; Spurgin, A.J.; Hannaman, G.W.; Lukic, Y.D.

    1986-01-01

    An area in systems reliability considered to be weak is the characterization and quantification of the role of the operations and maintenance staff in combatting accidents. Several R and D programs are underway to improve the modeling of human interactions, and some progress has been made. This paper describes a specific aspect of human reliability analysis referred to as the modeling of cognitive processes. In particular, the basis for the so-called Human Cognitive Reliability (HCR) model is described, with the focus on its validation and on its benefits and limitations

  19. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Stracuzzi, David John [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Brost, Randolph [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Chen, Maximillian Gene [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Malinas, Rebecca [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Peterson, Matthew Gregor [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Phillips, Cynthia A. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Robinson, David G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Woodbridge, Diane [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  20. NEW MODEL FOR QUANTIFICATION OF ICT DEPENDABLE ORGANIZATIONS RESILIENCE

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2011-03-01

    Today's business environment demands highly reliable organizations in every segment in order to be competitive in the global market. Besides that, the ICT sector is becoming irreplaceable in many fields of business, from communication to complex systems for process control and production. To fulfill those requirements and to develop further, many organizations worldwide are implementing a business paradigm called organizational resilience. Although resilience is a well-known term in many fields of science, it is not well studied due to its complex nature. This paper deals with developing a new model for the assessment and quantification of the resilience of ICT-dependable organizations.

  1. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    Science.gov (United States)

    Cozmuta, Ioana

    2011-01-01

    NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management: a) tailorable for low/high reliability missions; b) tailorable for ablative/reusable TPS; c) uncertainty quantification and statistical engineering are valuable tools not exploited enough; and d) strategies combining both theoretical tools and experimental methods need to be defined. The main reason for this lecture is to give a flavor of where UQ and SE could contribute, in the hope that the broader community will work with us to improve in these areas.

  2. DNA imaging and quantification using chemi-luminescent probes

    International Nuclear Information System (INIS)

    Dorner, G.; Redjdal, N.; Laniece, P.; Siebert, R.; Tricoire, H.; Valentin, L.

    1999-01-01

    During this interdisciplinary study we have developed an ultra-sensitive and reliable imaging system for DNA labelled by chemiluminescence. Based on a liquid-nitrogen-cooled CCD, the system achieves sensitivities down to 10 fg/mm² of labelled DNA over a surface area of 25 x 25 cm² with a sub-millimeter resolution. Commercially available chemiluminescent and enhancer molecules are compared and their reaction conditions optimized for the best signal-to-noise ratios. Double labelling was performed to verify quantification with radioactive probes. (authors)

  3. Nuclear Data Uncertainty Quantification: Past, Present and Future

    International Nuclear Information System (INIS)

    Smith, D.L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested

  4. Nuclear Data Uncertainty Quantification: Past, Present and Future

    Science.gov (United States)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  5. Quantification in histopathology-Can magnetic particles help?

    International Nuclear Information System (INIS)

    Mitchels, John; Hawkins, Peter; Luxton, Richard; Rhodes, Anthony

    2007-01-01

    Every year, more than 270,000 people are diagnosed with cancer in the UK alone; this means that one in three people worldwide contract cancer within their lifetime. Histopathology is the principal method for confirming cancer and directing treatment. In this paper, a novel application of magnetic particles is proposed to help address the problem of subjectivity in histopathology. Preliminary results indicate that magnetic nanoparticles can not only be used to assist diagnosis by improving quantification but can also potentially increase throughput, hence offering a way of dramatically reducing costs within the routine histopathology laboratory

  6. Quantification of protein concentration using UV absorbance and Coomassie dyes.

    Science.gov (United States)

    Noble, James E

    2014-01-01

    The measurement of solubilized protein concentration in solution is an important assay in biochemistry research and development labs, with applications ranging from enzymatic studies to providing data for biopharmaceutical lot release. Spectrophotometric protein quantification assays use UV and visible spectroscopy to rapidly determine the concentration of protein, relative to a standard or using an assigned extinction coefficient. Where multiple samples need measurement, and/or the sample volume and concentration are limited, preparations of the Coomassie dye commonly known as the Bradford assay can be used. © 2014 Elsevier Inc. All rights reserved.
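
    The extinction-coefficient route is a direct application of the Beer-Lambert law, A = εcl. A minimal sketch; the BSA-like values for ε and molecular weight in the example are approximate, commonly cited figures, not values from this chapter:

```python
def protein_conc_mg_ml(a280: float, ext_coeff_M: float, mw_da: float,
                       path_cm: float = 1.0, dilution: float = 1.0) -> float:
    """Concentration from UV absorbance via the Beer-Lambert law.

    A = epsilon * c * l  =>  c = A / (epsilon * l), converted to mg/mL
    using the molar mass. Values in the example below are illustrative.
    """
    molar = a280 / (ext_coeff_M * path_cm)   # mol/L
    return molar * mw_da * dilution          # g/L == mg/mL

# Example: BSA-like protein, epsilon ~ 43824 M^-1 cm^-1, MW ~ 66430 Da.
print(f"{protein_conc_mg_ml(0.66, 43824, 66430):.2f} mg/mL")
```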

  7. Development of magnetic resonance technology for noninvasive boron quantification

    International Nuclear Information System (INIS)

    Bradshaw, K.M.

    1990-11-01

    Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa™ MRI system, release 3.X, and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data of boron compounds (drug time response) and the in-vivo localization of boron in animal tissue noninvasively. 9 refs., 21 figs

  8. QUANTIFICATION OF ANGIOGENESIS IN THE CHICKEN CHORIOALLANTOIC MEMBRANE (CAM)

    Directory of Open Access Journals (Sweden)

    Silvia Blacher

    2011-05-01

    The chick chorioallantoic membrane (CAM) provides a suitable in vivo model to study angiogenesis and evaluate several pro- and anti-angiogenic factors and compounds. In the present work, new developments in image analysis are used to quantify the CAM angiogenic response from optical microscopic observations, covering all vascular components, from the large supplying and feeding vessels down to the capillary plexus. To validate our methodology, angiogenesis was quantified during two phases of CAM development (days 7 and 13) and after treatment with an anti-angiogenic modulator. Our morphometric analysis emphasizes that accurate quantification of the CAM vasculature needs to be performed at various scales.

  9. Quantification of transformation products of rocket fuel unsymmetrical dimethylhydrazine in soils using SPME and GC-MS.

    Science.gov (United States)

    Bakaikina, Nadezhda V; Kenessov, Bulat; Ul'yanovskii, Nikolay V; Kosyakov, Dmitry S

    2018-07-01

    Determination of transformation products (TPs) of the rocket fuel unsymmetrical dimethylhydrazine (UDMH) in soil is highly important for environmental impact assessment of the launches of heavy space rockets from Kazakhstan, Russia, China and India. The method based on headspace solid-phase microextraction (HS SPME) and gas chromatography-mass spectrometry is advantageous over other known methods due to its greater simplicity and cost efficiency. However, accurate quantification of these analytes using HS SPME is limited by the matrix effect. In this research, we proposed using internal standard and standard addition calibrations to achieve a proper combination of accuracy in the quantification of key TPs of UDMH and cost efficiency. 1-Trideuteromethyl-1H-1,2,4-triazole (MTA-d3) was used as the internal standard. Internal standard calibration allowed controlling matrix effects during quantification of 1-methyl-1H-1,2,4-triazole (MTA), N,N-dimethylformamide (DMF), and N-nitrosodimethylamine (NDMA) in soils with humus content < 1%. Using SPME at 60 °C for 15 min with a 65 µm Carboxen/polydimethylsiloxane fiber, recoveries of MTA, DMF and NDMA for sandy and loamy soil samples were 91-117, 85-123 and 64-132%, respectively. For improving the method accuracy and widening the range of analytes, standard addition and its combination with internal standard calibration were tested and compared on real soil samples. The combined calibration approach provided the greatest accuracies for NDMA, DMF, N-methylformamide, formamide, 1H-pyrazole, 3-methyl-1H-pyrazole and 1H-pyrazole. For determination of 1-formyl-2,2-dimethylhydrazine, 3,5-dimethylpyrazole, 2-ethyl-1H-imidazole, 1H-imidazole, 1H-1,2,4-triazole, pyrazines and pyridines, standard addition calibration is more suitable. However, the proposed approach and collected data allow using both approaches simultaneously. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. MDCT quantification is the dominant parameter in decision-making regarding chest tube drainage for stable patients with traumatic pneumothorax.

    Science.gov (United States)

    Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc

    2012-07-01

    It is commonly believed that the size of a pneumothorax is an important determinant of treatment decisions, in particular regarding whether chest tube drainage (CTD) is required. However, volumetric quantification of pneumothoraces has not routinely been performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantification of pneumothorax volume in chest multi-detector CT (MDCT) images. Moreover, we investigated the impact of accurate pneumothorax volume on the performance of decision-making regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in decision-making regarding CTD, via a computer simulation of decision-making using a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including the volume of pneumothorax calculated by our CAV scheme, was collected as parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, statistically significantly better than the tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of a CAV scheme in MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. Copyright © 2012 Elsevier Ltd. All rights reserved.
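
    The occurrence-frequency idea can be illustrated with a crude stand-in for the GA: score many random parameter subsets with a cross-validated SVM, keep the best-scoring subsets, and count how often each parameter appears among them. Everything below is synthetic and illustrative (random data, random search instead of a true genetic algorithm) and is not the authors' pipeline.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 200 patients, 14 clinical parameters,
# binary outcome = chest tube drainage performed.
X = rng.normal(size=(200, 14))
y = (X[:, 0] + 0.3 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# Score random parameter subsets with a cross-validated SVM.
results = []
for _ in range(300):
    mask = rng.random(14) < 0.5
    if not mask.any():
        continue
    score = cross_val_score(SVC(), X[:, mask], y, cv=3).mean()
    results.append((score, mask))

# Occurrence frequency of each parameter among the best subsets.
results.sort(key=lambda t: t[0], reverse=True)
top = np.array([mask for _, mask in results[:30]])
freq = top.mean(axis=0)
print({f"param{i}": round(f, 2) for i, f in enumerate(freq)})
```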

  11. Accurate quantification of 5 German cockroach (GCr) allergens in complex extracts using multiple reaction monitoring mass spectrometry (MRM MS).

    Science.gov (United States)

    Mindaye, S T; Spiric, J; David, N A; Rabin, R L; Slater, J E

    2017-12-01

    German cockroach (GCr) allergen extracts are complex and heterogeneous products, and methods to better assess their potency and composition are needed for adequate studies of their safety and efficacy. The objective of this study was to develop an assay based on liquid chromatography and multiple reaction monitoring mass spectrometry (LC-MRM MS) for rapid, accurate, and reproducible quantification of 5 allergens (Bla g 1, Bla g 2, Bla g 3, Bla g 4, and Bla g 5) in crude GCr allergen extracts. We first established a comprehensive peptide library of allergens from various commercial extracts as well as recombinant allergens. Peptide mapping was performed using high-resolution MS, and the peptide library was then used to identify prototypic and quantotypic peptides to proceed with MRM method development. Assay development included a systematic optimization of digestion conditions (buffer, digestion time, and trypsin concentration), chromatographic separation, and MS parameters. Robustness and suitability were assessed following ICH (Q2 [R1]) guidelines. The method is precise, linear (r > 0.99 over the range 0.01-1384 fmol/μL), and sensitive. Using LC-MRM MS, we quantified allergens from various commercial GCr extracts and showed considerable variability that may impact clinical efficacy. Our data demonstrate that the LC-MRM MS method is valuable for absolute quantification of allergens in GCr extracts and likely has broader applicability to other complex allergen extracts. Definitive quantification provides a new standard for labelling of allergen extracts, which will inform patient care, enable personalized therapy, and enhance the efficacy of immunotherapy for environmental and food allergies. © 2017 The Authors. Clinical & Experimental Allergy published by John Wiley & Sons Ltd. This article has been contributed to by US Government employees and their work is in the public domain in the USA.

  12. Prospective comparison of liver stiffness measurements between two point shear wave elastography methods: Virtual touch quantification and elastography point quantification

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Hyun Suk; Lee, Jeong Min; Yoon, Jeong Hee; Lee, Dong Ho; Chang, Won; Han, Joon Koo [Seoul National University Hospital, Seoul (Korea, Republic of)

    2016-09-15

    To prospectively compare the technical success rate and reliability of measurements of virtual touch quantification (VTQ) elastography and elastography point quantification (ElastPQ), and to correlate liver stiffness (LS) measurements obtained by the two elastography techniques. Our study included 85 patients, 80 of whom were previously diagnosed with chronic liver disease. The technical success rates and reliable measurements of the two kinds of point shear wave elastography (pSWE) techniques were compared by χ² analysis. LS values measured using the two techniques were compared and correlated via the Wilcoxon signed-rank test, Spearman correlation coefficient, and 95% Bland-Altman limits of agreement. The intraobserver reproducibility of ElastPQ was determined by the 95% Bland-Altman limits of agreement and the intraclass correlation coefficient (ICC). The two pSWE techniques showed similar technical success rates (98.8% for VTQ vs. 95.3% for ElastPQ, p = 0.823) and reliable LS measurements (95.3% for VTQ vs. 90.6% for ElastPQ, p = 0.509). The mean LS measurements obtained by VTQ (1.71 ± 0.47 m/s) and ElastPQ (1.66 ± 0.41 m/s) were not significantly different (p = 0.209). The LS measurements obtained by the two techniques showed strong correlation (r = 0.820); in addition, the 95% limit of agreement of the two methods was 27.5% of the mean. Finally, the ICC of repeat ElastPQ measurements was 0.991. Virtual touch quantification and ElastPQ showed similar technical success rates and reliable measurements, with strongly correlated LS measurements. However, the two methods are not interchangeable due to the large limit of agreement.

  13. DNA imaging and quantification using chemi-luminescent probes; Imagerie et quantification d'ADN par chimiluminescence

    Energy Technology Data Exchange (ETDEWEB)

    Dorner, G; Redjdal, N; Laniece, P; Siebert, R; Tricoire, H; Valentin, L [Groupe I.P.B., Experimental Research Division, Inst. de Physique Nucleaire, Paris-11 Univ., 91 - Orsay (France)

    1999-11-01

    During this interdisciplinary study we have developed an ultra-sensitive and reliable imaging system for DNA labelled by chemiluminescence. Based on a liquid-nitrogen-cooled CCD, the system achieves sensitivities down to 10 fg/mm² of labelled DNA over a surface area of 25 x 25 cm² with a sub-millimeter resolution. Commercially available chemiluminescent and enhancer molecules are compared and their reaction conditions optimized for the best signal-to-noise ratios. Double labelling was performed to verify quantification with radioactive probes. (authors) 1 fig.

  14. A Spanish model for quantification and management of construction waste

    International Nuclear Information System (INIS)

    Solis-Guzman, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramirez-de-Arellano, Antonio

    2009-01-01

    Currently, construction and demolition waste (C and D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C and D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C and D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C and D waste volume in both new construction and demolition projects.
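
    The three-coefficient structure of the model lends itself to a very small calculator. The sketch below is a schematic reading of that structure, with invented placeholder coefficients; the actual CT, CR and CE values come from the study of the 100 dwelling projects and depend on project type.

```python
def estimate_cd_waste(built_area_m2: float,
                      ct: float, cr: float, ce: float) -> dict:
    """Apparent C&D waste volumes (m^3) from the built area, following
    the three-coefficient structure of the Alcores model. Coefficient
    values are project-specific; the ones used here are placeholders.
    """
    return {
        "demolished": ct * built_area_m2,   # CT: demolished volume
        "wreckage":   cr * built_area_m2,   # CR: wreckage volume
        "packaging":  ce * built_area_m2,   # CE: packaging volume
    }

# Illustrative coefficients for a new-construction dwelling project.
print(estimate_cd_waste(1500.0, ct=0.02, cr=0.08, ce=0.01))
```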

  15. Overview of hybrid subspace methods for uncertainty quantification, sensitivity analysis

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Bang, Youngsuk; Wang, Congjian

    2013-01-01

    Highlights: ► We overview the state-of-the-art in uncertainty quantification and sensitivity analysis. ► We overview new developments in above areas using hybrid methods. ► We give a tutorial introduction to above areas and the new developments. ► Hybrid methods address the explosion in dimensionality in nonlinear models. ► Representative numerical experiments are given. -- Abstract: The role of modeling and simulation has been heavily promoted in recent years to improve understanding of complex engineering systems. To realize the benefits of modeling and simulation, concerted efforts in the areas of uncertainty quantification and sensitivity analysis are required. The manuscript intends to serve as a pedagogical presentation of the material to young researchers and practitioners with little background on the subjects. We believe this is important as the role of these subjects is expected to be integral to the design, safety, and operation of existing as well as next generation reactors. In addition to covering the basics, an overview of the current state-of-the-art will be given with particular emphasis on the challenges pertaining to nuclear reactor modeling. The second objective will focus on presenting our own development of hybrid subspace methods intended to address the explosion in the computational overhead required when handling real-world complex engineering systems.

  16. Evaluation of semi-automatic arterial stenosis quantification

    International Nuclear Information System (INIS)

    Hernandez Hoyos, M.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne; Univ. de los Andes, Bogota; Serfaty, J.M.; Douek, P.C.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne; Hopital Cardiovasculaire et Pneumologique L. Pradel, Bron; Maghiar, A.; Mansard, C.; Orkisz, M.; Magnin, I.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne

    2006-01-01

    Object: To assess the accuracy and reproducibility of semi-automatic vessel axis extraction and stenosis quantification in 3D contrast-enhanced Magnetic Resonance Angiography (CE-MRA) of the carotid arteries (CA). Materials and methods: A total of 25 MRA datasets was used: 5 phantoms with known stenoses, and 20 patients (40 CAs) drawn from a multicenter trial database. The Maracas software extracted vessel centerlines and quantified the stenoses, based on boundary detection in planes perpendicular to the centerline. Centerline accuracy was visually scored. Semi-automatic measurements were compared with: (1) theoretical phantom morphometric values, and (2) stenosis degrees evaluated by two independent radiologists. Results: Exploitable centerlines were obtained in 97% of CAs and in all phantoms. In phantoms, the software achieved better agreement with theoretical stenosis degrees (weighted kappa κ_W = 0.91) than the radiologists (κ_W = 0.69). In patients, agreement between software and radiologists varied from κ_W = 0.67 to 0.90. In both, Maracas was substantially more reproducible than the readers. Mean operating time was within 1 min/CA. Conclusion: The Maracas software generates accurate 3D centerlines of vascular segments with minimal user intervention. Semi-automatic quantification of CA stenosis is also accurate, except in very severe stenoses that cannot be segmented. It substantially reduces the inter-observer variability. (orig.)

  17. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  18. 3D histomorphometric quantification from 3D computed tomography

    International Nuclear Information System (INIS)

    Oliveira, L.F. de; Lopes, R.T.

    2004-01-01

    The histomorphometric analysis is based on stereologic concepts and was originally applied to biologic samples. This technique has been used to evaluate different complex structures, such as ceramic filters, net structures and cancellous objects, i.e., objects with inner connected structures. The measured histomorphometric parameters of structure are: sample volume to total reconstructed volume (BV/TV), sample surface to sample volume (BS/BV), connection thickness (Tb.Th), connection number (Tb.N) and connection separation (Tb.Sp). The anisotropy was evaluated as well. These parameters constitute the base of histomorphometric analysis. The quantification is performed on cross-sections recovered by cone-beam reconstruction, where a real-time microfocus radiographic system is used as the tomographic system. The three-dimensional (3D) histomorphometry obtained from tomography corresponds to an evolution of the conventional method, which is based on 2D analysis, and is more coherent with the morphologic and topologic context of the sample. This work shows results from 3D histomorphometric quantification used to characterize objects examined by 3D computed tomography. The results, which characterize the internal structures of ceramic foams with different pore densities, are compared to results from conventional methods.
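
    The two ratio parameters lend themselves to a direct voxel-counting implementation. The sketch below is our own illustration, not the authors' code: it computes BV/TV and an approximate BS/BV from a binary reconstructed volume, counting exposed voxel faces as a crude surface estimate.

        import numpy as np

        def histomorphometry(mask, voxel=1.0):
            """mask: 3D binary array, 1 = structure, 0 = background."""
            bv = mask.sum() * voxel**3          # structure volume
            tv = mask.size * voxel**3           # total reconstructed volume
            faces = 0                           # exposed faces ~ surface area
            for axis in range(3):               # (array borders are ignored)
                faces += np.abs(np.diff(mask.astype(np.int8), axis=axis)).sum()
            bs = faces * voxel**2
            return bv / tv, bs / bv

        foam = (np.random.rand(64, 64, 64) > 0.7)   # placeholder volume
        print(histomorphometry(foam))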

  19. Mixture quantification using PLS in plastic scintillation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Bagan, H.; Tarancon, A.; Rauret, G. [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain); Garcia, J.F., E-mail: jfgarcia@ub.ed [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain)

    2011-06-15

    This article reports the capability of plastic scintillation (PS) combined with multivariate calibration (partial least squares; PLS) to detect and quantify alpha and beta emitters in mixtures. While several attempts have been made with this purpose in mind using liquid scintillation (LS), no attempt had been made using PS, which has the great advantage of not producing mixed waste after the measurements are performed. Following this objective, ternary mixtures of alpha and beta emitters (241Am, 137Cs and 90Sr/90Y) have been quantified. Procedure optimisation evaluated the use of the net spectra or the sample spectra, the inclusion of different spectra obtained at different values of the pulse shape analysis parameter, and the application of the PLS1 or PLS2 algorithms. The conclusions show that the use of PS+PLS2 applied to the sample spectra, without any pulse shape discrimination, allows quantification of the activities with relative errors below 10% in most cases. This procedure not only allows quantification of mixtures but also reduces measurement time (no blanks are required), and its application does not require detectors that include the pulse shape analysis parameter.
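
    The PLS2 step maps whole spectra onto the three activities at once. A minimal sketch with scikit-learn follows; the spectra, activities and component count are placeholders, not the paper's data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X_cal = rng.random((30, 1024))        # calibration PS spectra
        Y_cal = rng.random((30, 3)) * 100     # known activities of 241Am,
                                              # 137Cs and 90Sr/90Y (Bq)

        pls2 = PLSRegression(n_components=5)  # PLS2: one model, 3 responses
        pls2.fit(X_cal, Y_cal)

        X_new = rng.random((1, 1024))         # spectrum of an unknown mixture
        print(pls2.predict(X_new))            # estimated activities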

  20. A Spanish model for quantification and management of construction waste.

    Science.gov (United States)

    Solís-Guzmán, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramírez-de-Arellano, Antonio

    2009-09-01

    Currently, construction and demolition waste (C&D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has regulated the production and management of C&D waste since February 2008. The present work describes the waste management model that inspired this decree: the Alcores model, implemented with good results in the Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste expected to be generated on the building site. The quantification of C&D waste volume from the project stage onwards is essential for the building actors to properly plan and control its disposal. This quantification model was developed by studying 100 dwelling projects, especially their bills of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model for estimating C&D waste volume in both new construction and demolition projects.
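
    Read this way, the model reduces to three per-project coefficients applied to a common reference quantity. A hedged sketch follows; the coefficient values and the reference quantity (built area) are illustrative, not the paper's calibrated values.

        # CT, CR, CE convert built area into demolished, wreckage and
        # packaging volumes respectively; values below are placeholders.
        def cd_waste_volume(built_area_m2, ct=0.10, cr=0.05, ce=0.08):
            demolished = ct * built_area_m2   # m3 of demolition waste
            wreckage   = cr * built_area_m2   # m3 of wreckage
            packaging  = ce * built_area_m2   # m3 of packaging waste
            return demolished + wreckage + packaging

        print(cd_waste_volume(1200.0))        # expected m3 for a 1200 m2 job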

  1. Sludge quantification at water treatment plant and its management scenario.

    Science.gov (United States)

    Ahmad, Tarique; Ahmad, Kafeel; Alam, Mehtab

    2017-08-15

    A large volume of sludge is generated at water treatment plants during the purification of surface water for potable supplies. Handling and disposal of sludge require careful attention from civic bodies, plant operators, and environmentalists. Quantification of the sludge produced at the treatment plants is important for developing suitable management strategies for its economical and environmentally friendly disposal. The present study deals with the quantification of sludge using an empirical relation between turbidity, suspended solids, and coagulant dosing. Seasonal variation has a significant effect on the raw water quality received at water treatment plants, and hence sludge generation also varies. Yearly production of sludge at a water treatment plant in Ghaziabad, India, is estimated to be 29,700 tons. Sustainable disposal of such a quantity of sludge is a challenging task under stringent environmental legislation. Several beneficial reuses of sludge in civil engineering and construction work have been identified globally, such as raw material in manufacturing cement, bricks, and artificial aggregates, as a cementitious material, and as a sand substitute in preparing concrete and mortar. About 54 to 60% sand, 24 to 28% silt, and 16% clay constitute the sludge generated at the water treatment plant under investigation. The characteristics of the sludge are found suitable for its potential utilization as locally available construction material for safe disposal. An overview of the sustainable management scenario involving beneficial reuses of the sludge is also presented.
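
    An empirical relation of the kind described is typically linear in the turbidity removed and the coagulant dosed. The sketch below uses a commonly cited form with placeholder coefficients; it is not the plant-specific relation fitted in the study.

        def daily_sludge_kg(flow_m3_d, turbidity_ntu, alum_mg_l,
                            a=1.0, b=0.44):
            # a: kg dry solids per m3 per NTU removed (assumed SS equivalence)
            # b: kg hydroxide solids per g of alum dosed (often cited ~0.44)
            return flow_m3_d * (a * turbidity_ntu + b * alum_mg_l) / 1000.0

        print(daily_sludge_kg(50000, 25, 20))   # rough dry-solids load, kg/day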

  2. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    Science.gov (United States)

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly account for AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, the total number of metagenomic reads and the selected sequencing platform had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
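
    The factors identified above (gene length, read number, genome size) appear directly in length- and depth-normalised abundance measures. One common form, assumed here for illustration rather than taken from the paper, expresses AR gene abundance as copies per genome equivalent.

        def ar_abundance(ar_reads, ar_gene_len_bp, total_reads,
                         read_len_bp=100, avg_genome_bp=4_000_000):
            gene_copies   = ar_reads * read_len_bp / ar_gene_len_bp
            genome_equivs = total_reads * read_len_bp / avg_genome_bp
            return gene_copies / genome_equivs   # copies per genome equivalent

        print(ar_abundance(ar_reads=350, ar_gene_len_bp=900,
                           total_reads=5_000_000))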

  3. Spectroscopic quantification of 5-hydroxymethylcytosine in genomic DNA.

    Science.gov (United States)

    Shahal, Tamar; Gilat, Noa; Michaeli, Yael; Redy-Keisar, Orit; Shabat, Doron; Ebenstein, Yuval

    2014-08-19

    5-Hydroxymethylcytosine (5hmC), a modified form of the DNA base cytosine, is an important epigenetic mark linked to the regulation of gene expression in development and tumorigenesis. We have developed a spectroscopic method for global quantification of 5hmC in genomic DNA. The assay is performed within a multiwell plate, which allows simultaneous recording of up to 350 samples. Our quantification procedure for 5hmC is direct, simple, and rapid. It relies on a two-step protocol that consists of enzymatic glucosylation of 5hmC with an azide-modified glucose, followed by a "click reaction" with an alkyne-fluorescent tag. The fluorescence intensity recorded from the DNA sample is proportional to its 5hmC content and can be quantified by a simple plate-reader measurement. This labeling technique is specific and highly sensitive, allowing detection of 5hmC down to 0.002% of the total nucleotides. Our results reveal significant variations in the 5hmC content obtained from different mouse tissues, in agreement with previously reported data.

  4. Quantification of 5-methyl-2'-deoxycytidine in the DNA.

    Science.gov (United States)

    Giel-Pietraszuk, Małgorzata; Insińska-Rak, Małgorzata; Golczak, Anna; Sikorski, Marek; Barciszewska, Mirosława; Barciszewski, Jan

    2015-01-01

    Methylation at position 5 of cytosine (Cyt) in CpG sequences, leading to the formation of 5-methylcytosine (m5Cyt), is an important element of the epigenetic regulation of gene expression. Modification of the normal methylation pattern, unique to each organism, leads to the development of pathological processes and diseases, including cancer. Therefore, quantification of DNA methylation and analysis of changes in the methylation pattern are very important from a practical point of view and can be used for diagnostic purposes, as well as for monitoring treatment progress. In this paper we present a new method for quantification of 5-methyl-2'-deoxycytidine (m5C) in DNA. The technique is based on conversion of m5C into fluorescent 3,N4-etheno-5-methyl-2'-deoxycytidine (εm5C) and its identification by reversed-phase high-performance liquid chromatography (RP-HPLC). The assay was used to evaluate m5C concentration in DNA from calf thymus and from the peripheral blood of cows bred under different conditions. This approach can be applied to measuring 5-methylcytosine in cellular DNA from different cells and tissues.

  5. Accurate quantification of supercoiled DNA by digital PCR

    Science.gov (United States)

    Dong, Lianhua; Yoo, Hee-Bong; Wang, Jing; Park, Sang-Ryoul

    2016-01-01

    Digital PCR (dPCR), as an enumeration-based quantification method, is capable of quantifying DNA copy number without the help of standards. However, it can generate false results when the PCR conditions are not optimized. A recent international comparison (CCQM P154) showed that most laboratories significantly underestimated the concentration of supercoiled plasmid DNA by dPCR. Mostly, supercoiled DNAs are linearized before dPCR to avoid such underestimation. The present study was conducted to overcome this problem. In the bilateral comparison, the National Institute of Metrology, China (NIM) optimized and applied dPCR for supercoiled DNA determination, whereas the Korea Research Institute of Standards and Science (KRISS) prepared the unknown samples and quantified them by flow cytometry. In this study, several factors, such as the selection of the PCR master mix, the fluorescent label, and the position of the primers, were evaluated for quantifying supercoiled DNA by dPCR. This work confirmed that a 16S PCR master mix avoided poor amplification of the supercoiled DNA, whereas HEX labels on the dPCR probe resulted in robust amplification curves. Optimizing the dPCR assay based on these two observations resulted in accurate quantification of supercoiled DNA without preanalytical linearization. This result was validated and in close agreement (101-113%) with the result from flow cytometry. PMID:27063649
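
    Underlying any dPCR assay is the Poisson correction that turns the fraction of positive partitions into a copy number. A minimal sketch with illustrative numbers follows; the partition counts and volume are not from the study.

        import math

        positives, partitions = 14000, 20000
        partition_volume_nl = 0.85

        p = positives / partitions
        lam = -math.log(1.0 - p)                  # mean copies per partition
        copies_per_ul = lam / (partition_volume_nl * 1e-3)   # nl -> ul
        print(f"{copies_per_ul:.0f} copies/uL")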

  6. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography.

    Science.gov (United States)

    Loss, Leandro A; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2012-10-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides.
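
    Step (i), the Hessian-based enhancement of line-like structures, can be prototyped with an off-the-shelf vesselness filter. The sketch below uses scikit-image's Frangi filter on a placeholder volume; the library choice is ours, not necessarily the authors'.

        import numpy as np
        from skimage.filters import frangi

        tomogram = np.random.rand(64, 64, 64)    # placeholder 3D tomogram
        # High response where the local Hessian indicates bright, line-like
        # (filamentous) structure across the chosen scales.
        enhanced = frangi(tomogram, sigmas=range(1, 4), black_ridges=False)
        print(enhanced.shape, float(enhanced.max()))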

  7. A practical method for accurate quantification of large fault trees

    International Nuclear Information System (INIS)

    Choi, Jong Soo; Cho, Nam Zin

    2007-01-01

    This paper describes a practical method to accurately quantify the top event probability and importance measures from incomplete minimal cut sets (MCS) of a large fault tree. The MCS-based fault tree method is extensively used in probabilistic safety assessments. Several sources of uncertainty exist in MCS-based fault tree analysis. The paper focuses on the quantification of the following two sources: (1) the truncation that neglects low-probability cut sets and (2) the approximation made in quantifying MCSs. The method proposed in this paper is based on a Monte Carlo simulation technique to estimate the probability of the discarded MCSs, and on the sum of disjoint products (SDP) approach complemented by the correction factor approach (CFA). The method provides the capability to accurately quantify the two uncertainties and to estimate the top event probability and importance measures of large coherent fault trees. The proposed fault tree quantification method has been implemented in the CUTREE code package and is tested on two example fault trees.
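
    The Monte Carlo idea is easy to sketch: sample basic-event states and check whether any minimal cut set is fully failed. The cut sets and probabilities below are hypothetical, and the sketch ignores the SDP/CFA refinements described above.

        import random

        basic_p = {'A': 1e-2, 'B': 5e-3, 'C': 2e-2, 'D': 1e-3}
        mcs = [{'A', 'B'}, {'C'}, {'B', 'D'}]    # hypothetical cut sets

        def top_event_mc(n=200_000, seed=1):
            rng, hits = random.Random(seed), 0
            for _ in range(n):
                state = {e: rng.random() < p for e, p in basic_p.items()}
                hits += any(all(state[e] for e in cut) for cut in mcs)
            return hits / n

        print(top_event_mc())   # crude estimate of top event probability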

  8. Unilateral condylar hyperplasia: a 3-dimensional quantification of asymmetry.

    Directory of Open Access Journals (Sweden)

    Tim J Verhoeven

    Full Text Available PURPOSE: Objective quantification of facial asymmetry in patients with Unilateral Condylar Hyperplasia (UCH) has not yet been described in the literature. The aim of this study was to objectively quantify soft-tissue asymmetry in patients with UCH and to compare the findings with a control group using a new method. MATERIAL AND METHODS: Thirty 3D photographs of patients diagnosed with UCH were compared with 30 3D photographs of healthy controls. As UCH presents particularly in the mandible, a new method was used to isolate the lower part of the face and evaluate the asymmetry of this part separately. The new method was validated by two observers using 3D photographs of five patients and five controls. RESULTS: A significant difference (0.79 mm) in whole-face asymmetry between patients and controls was found. Intra- and inter-observer differences of 0.011 mm (-0.034 to 0.011) and 0.017 mm (-0.007 to 0.042), respectively, were found. These differences are irrelevant in clinical practice. CONCLUSION: After objective quantification, a significant difference in soft-tissue asymmetry was identified between patients with UCH and controls. The method used to isolate mandibular asymmetry was found to be valid and a suitable tool to evaluate facial asymmetry.

  9. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. Realistically assessing uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  10. Complex Empiricism and the Quantification of Uncertainty in Paleoclimate Reconstructions

    Science.gov (United States)

    Brumble, K. C.

    2014-12-01

    Because the global climate cannot be observed directly, and because of vast and noisy data sets, climate science is a rich field in which to study how computational statistics informs what it means to do empirical science. Traditionally held virtues of empirical science and empirical methods, such as reproducibility, independence, and straightforward observation, are complicated by the representational choices involved in statistical modeling and data handling. Examining how climate reconstructions instantiate complicated empirical relationships between model, data, and predictions reveals that the path from data to prediction does not match traditional conceptions of empirical inference either. Rather, the empirical inferences involved are "complex" in that they require the articulation of a good deal of statistical processing wherein assumptions are adopted and representational decisions made, often in the face of substantial uncertainties. Proxy reconstructions are both statistical and paleoclimate science activities aimed at using a variety of proxies to reconstruct past climate behavior. Paleoclimate proxy reconstructions also involve complex data handling and statistical refinement, leading to the current emphasis in the field on the quantification of uncertainty in reconstructions. In this presentation I explore how the processing needed for the correlation of diverse, large, and messy data sets necessitates the explicit quantification of the uncertainties stemming from wrangling proxies into manageable suites. I also address how semi-empirical pseudo-proxy methods allow for the exploration of signal detection in data sets and serve as intermediary steps for statistical experimentation.

  11. Within-day repeatability for absolute quantification of Lawsonia intracellularis bacteria in feces from growing pigs

    DEFF Research Database (Denmark)

    Pedersen, Ken Steen; Pedersen, Klaus H.; Hjulsager, Charlotte Kristiane

    2012-01-01

    Absolute quantification of Lawsonia intracellularis by real-time polymerase chain reaction (PCR) is now possible on a routine basis. Poor repeatability of quantification can result in disease status misclassification of individual pigs when a single fecal sample is obtained. The objective...

  12. Quantification of cellular uptake of DNA nanostructures by qPCR

    DEFF Research Database (Denmark)

    Okholm, Anders Hauge; Nielsen, Jesper Sejrup; Vinther, Mathias

    2014-01-01

    interactions and structural and functional features of the DNA delivery device must be thoroughly investigated. Here, we present a rapid and robust method for the precise quantification of the component materials of DNA origami structures capable of entering cells in vitro. The quantification is performed...

  13. Accurate Quantification of Cardiovascular Biomarkers in Serum Using Protein Standard Absolute Quantification (PSAQ™) and Selected Reaction Monitoring*

    Science.gov (United States)

    Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-01-01

    Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed up this step, particularly because of its multiplexing capacities. However, analytical variability caused by upstream sample handling or incomplete trypsin digestion still needs to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found, demonstrating the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464
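
    The core arithmetic of quantification with a full-length labelled standard is a simple ratio: the endogenous amount equals the light/heavy peak-area ratio times the spiked standard amount. All numbers below are illustrative, not measurements from the study.

        spiked_heavy_fmol = 500.0              # PSAQ standard added to sample
        area_light, area_heavy = 8.2e5, 6.4e5  # SRM transition peak areas
        serum_volume_ml = 0.05

        endogenous_fmol = (area_light / area_heavy) * spiked_heavy_fmol
        print(f"{endogenous_fmol / serum_volume_ml:.0f} fmol/mL of serum")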

  14. Rapid and sensitive quantification of C3- and C6-phosphoesters in starch by fluorescence-assisted capillary electrophoresis.

    Science.gov (United States)

    Verbeke, Jeremy; Penverne, Christophe; D'Hulst, Christophe; Rolando, Christian; Szydlowski, Nicolas

    2016-11-05

    Phosphate groups are naturally present in starch at the C3- or C6-position of the glucose residues and impact the structure of starch granules. Their precise quantification is necessary for understanding starch physicochemical properties and metabolism. Nevertheless, reliable quantification of Glc-3-P remains laborious and time-consuming. Here we describe a capillary electrophoresis method for the simultaneous measurement of both Glc-6-P and Glc-3-P after acid hydrolysis of starch. The sensitivity threshold was estimated at the fg scale, which is compatible with the analysis of less than a μg of sample. The method was validated by analyzing antisense potato lines deficient in SBEs, GWD or GBSS. We show that the Glc-3-P content is altered in the latter and that these variations do not correlate with modifications in Glc-6-P content. We anticipate the method reported here to be an efficient tool for high-throughput study of starch phosphorylation at both the C3- and C6-positions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. A novel nano-immunoassay method for quantification of proteins from CD138-purified myeloma cells: biological and clinical utility.

    Science.gov (United States)

    Misiewicz-Krzeminska, Irena; Corchete, Luis Antonio; Rojas, Elizabeta A; Martínez-López, Joaquín; García-Sanz, Ramón; Oriol, Albert; Bladé, Joan; Lahuerta, Juan-José; Miguel, Jesús San; Mateos, María-Victoria; Gutiérrez, Norma C

    2018-05-01

    Protein analysis in bone marrow samples from patients with multiple myeloma has been limited by the low concentration of proteins obtained after CD138+ cell selection. A novel approach based on capillary nano-immunoassay could make it possible to quantify dozens of proteins from each myeloma sample in an automated manner. Here we present a method for the accurate and robust quantification of the expression of multiple proteins extracted from CD138-purified multiple myeloma samples frozen in RLT Plus buffer, which is commonly used for nucleic acid preservation and isolation. Additionally, the biological and clinical value of this analysis for a panel of 12 proteins essential to the pathogenesis of multiple myeloma was evaluated in 63 patients with newly diagnosed multiple myeloma. The analysis of the prognostic impact of CRBN/Cereblon and IKZF1/Ikaros mRNA/protein showed that only the protein levels were able to predict progression-free survival of patients; mRNA levels were not associated with prognosis. Interestingly, high levels of Cereblon and Ikaros proteins were associated with longer progression-free survival only in patients who received immunomodulatory drugs and not in those treated with other drugs. In conclusion, the capillary nano-immunoassay platform provides a novel opportunity for automated quantification of the expression of more than 20 proteins in CD138+ primary multiple myeloma samples. Copyright © 2018 Ferrata Storti Foundation.

  16. Quantification by aberration corrected (S)TEM of boundaries formed by symmetry breaking phase transformations

    Energy Technology Data Exchange (ETDEWEB)

    Schryvers, D., E-mail: nick.schryvers@uantwerpen.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Salje, E.K.H. [Department of Earth Sciences, University of Cambridge, Cambridge CB2 3EQ (United Kingdom); Nishida, M. [Department of Engineering Sciences for Electronics and Materials, Faculty of Engineering Sciences, Kyushu University, Kasuga, Fukuoka 816-8580 (Japan); De Backer, A. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Idrissi, H. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Institute of Mechanics, Materials and Civil Engineering, Université Catholique de Louvain, Place Sainte Barbe, 2, B-1348, Louvain-la-Neuve (Belgium); Van Aert, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2017-05-15

    The present contribution reviews recent work on the quantification of atom displacements, atom site occupations and the level of crystallinity in various systems, based on aberration-corrected HR(S)TEM images. Depending on the case studied, picometer-range precision for individual distances can be obtained, boundary widths determined at the unit-cell level, or statistical evolutions of the fractions of ordered areas calculated. In all of these cases, these quantitative measures imply new routes for the applications of the respective materials. - Highlights: • Quantification of picometer displacements at a ferroelastic twin boundary in CaTiO3. • Quantification of kinks in a meandering ferroelectric domain wall in LiNbO3. • Quantification of column occupation at an anti-phase boundary in Co-Pt. • Quantification of atom displacements at a twin boundary in Ni-Ti B19′ martensite.

  17. PCR amplification of repetitive sequences as a possible approach in relative species quantification

    DEFF Research Database (Denmark)

    Ballin, Nicolai Zederkopff; Vogensen, Finn Kvist; Karlsson, Anders H

    2012-01-01

    Both relative and absolute quantification are possible in species quantification when single-copy genomic DNA is used. However, amplification of single-copy genomic DNA does not allow a limit of detection as low as one obtained from amplification of repetitive sequences. Amplification of repetitive sequences is therefore frequently used in absolute quantification, but problems occur in relative quantification because the number of repetitive sequences is unknown. A promising approach was developed where data from amplification of repetitive sequences were used in relative quantification of species... to relatively quantify the amount of chicken DNA in a binary mixture of chicken DNA and pig DNA. However, the designed PCR primers lack the specificity required for regulatory species control.

  18. Real-time PCR for the quantification of fungi in planta.

    Science.gov (United States)

    Klosterman, Steven J

    2012-01-01

    Methods enabling quantification of fungi in planta can be useful for a variety of applications. In combination with information on plant disease severity, indirect quantification of fungi in planta offers an additional tool in the screening of plants that are resistant to fungal diseases. In this chapter, a method is described for the quantification of DNA from a fungus in plant leaves using real-time PCR (qPCR). Although the method described entails quantification of the fungus Verticillium dahliae in lettuce leaves, the methodology would be useful for other pathosystems as well. The method utilizes primers that are specific for amplification of a β-tubulin sequence from V. dahliae, with a lettuce actin gene sequence as a reference for normalization. This approach enabled quantification of V. dahliae at 2.5 fg/ng of lettuce leaf DNA at 21 days following plant inoculation.
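
    Normalisation to the plant reference gene is typically done with a delta-Cq calculation. A generic sketch follows; the efficiency value and Cq numbers are illustrative, and the published assay's exact maths may differ.

        def fungal_dna_ratio(cq_target, cq_reference, efficiency=2.0):
            # amount of target relative to reference: E^-(Cq_t - Cq_ref)
            return efficiency ** -(cq_target - cq_reference)

        # V. dahliae beta-tubulin vs. lettuce actin, hypothetical Cq values
        print(fungal_dna_ratio(cq_target=27.8, cq_reference=21.3))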

  19. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    Science.gov (United States)

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins priced in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.

  20. Experimental design for TBT quantification by isotope dilution SPE-GC-ICP-MS under the European water framework directive.

    Science.gov (United States)

    Alasonati, Enrica; Fabbri, Barbara; Fettig, Ina; Yardin, Catherine; Del Castillo Busto, Maria Estela; Richter, Janine; Philipp, Rosemarie; Fisicaro, Paola

    2015-03-01

    In Europe, the maximum allowable concentration of tributyltin (TBT) compounds in surface water is regulated by the water framework directive (WFD) and its daughter directive, which impose a limit of 0.2 ng L-1 in whole water (as tributyltin cation). Despite the large number of methodologies for the quantification of organotin species developed over the last two decades, standardised analytical methods at the required concentration level do not exist. TBT quantification at the picogram level requires efficient and accurate sample preparation and preconcentration, and maximum care to avoid blank contamination. To meet the WFD requirement, a method for the quantification of TBT in mineral water at the environmental quality standard (EQS) level, based on solid phase extraction (SPE), was developed and optimised. The quantification was done using species-specific isotope dilution (SSID) followed by gas chromatography (GC) coupled to inductively coupled plasma mass spectrometry (ICP-MS). The analytical process was optimised using a design of experiments (DOE) based on a fractional factorial plan. The DOE allowed the evaluation of 3 qualitative factors (type of stationary phase and eluent, phase mass and eluent volume, pH and analyte ethylation procedure), for a total of 13 levels studied, and a sample volume in the range of 250-1000 mL. Four different models fitting the results were defined and evaluated with statistical tools: one of them was selected and optimised to find the best procedural conditions. The C18 phase was found to be the best stationary phase for SPE experiments. The 4 solvents tested with C18, the pH and ethylation conditions, the mass of the phases, the volume of the eluents and the sample volume can all be optimal, depending on their respective combination. For that reason, the equation of the model conceived in this work is a useful decision tool for the planning of experiments, because it can be applied to predict the TBT mass fraction recovery when the

  1. A qualitative method proposal to improve environmental impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Toro, Javier, E-mail: jjtoroca@unal.edu.co [Institute of Environmental Studies, National University of Colombia at Bogotá (Colombia); Requena, Ignacio, E-mail: requena@decsai.ugr.es [Department of Computer Science and Artificial Intelligence, University of Granada (Spain); Duarte, Oscar, E-mail: ogduartev@unal.edu.co [National University of Colombia at Bogotá, Department of Electrical Engineering and Electronics (Colombia); Zamorano, Montserrat, E-mail: zamorano@ugr.es [Department of Civil Engineering, University of Granada (Spain)

    2013-11-15

    In environmental impact assessment, qualitative methods are used because they are versatile and easy to apply. This methodology is based on evaluating the strength of the impact by grading a series of qualitative attributes that can be manipulated by the evaluator. The results thus obtained are not objective, and all too often impacts that should be mitigated with corrective measures are eliminated. However, qualitative methodology can be improved if the calculation of Impact Importance is based on the characteristics of environmental factors and project activities instead of on indicators assessed by evaluators. In this sense, this paper proposes the inclusion of the vulnerability of environmental factors and the potential environmental impact of project activities. For this purpose, the study described in this paper defined Total Impact Importance and specified a quantification procedure. The results obtained in the case study of oil drilling in Colombia reflect greater objectivity in the evaluation of impacts, as well as a positive correlation between impact values, the environmental characteristics at and near the project location, and the technical characteristics of project activities. -- Highlights: • The concept of vulnerability has been used to calculate Impact Importance. • This paper defined Total Impact Importance and specified a quantification procedure. • The method includes the characteristics of environmental factors and project activities. • The application has shown greater objectivity in the evaluation of impacts. • A better correlation between impact values, the environment and the project has been shown.
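
    One hedged reading of the proposal is an importance score computed from measured characteristics rather than evaluator-chosen attributes. The functional form and weights below are our illustrative assumptions, not the paper's formula.

        def total_impact_importance(vulnerability, activity_impact,
                                    w_env=0.5, w_act=0.5):
            """Inputs are scores normalised to [0, 1]; output on a 0-100 scale."""
            return 100.0 * (w_env * vulnerability + w_act * activity_impact)

        print(total_impact_importance(0.7, 0.4))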

  2. A qualitative method proposal to improve environmental impact assessment

    International Nuclear Information System (INIS)

    Toro, Javier; Requena, Ignacio; Duarte, Oscar; Zamorano, Montserrat

    2013-01-01

    In environmental impact assessment, qualitative methods are used because they are versatile and easy to apply. This methodology is based on evaluating the strength of the impact by grading a series of qualitative attributes that can be manipulated by the evaluator. The results thus obtained are not objective, and all too often impacts that should be mitigated with corrective measures are eliminated. However, qualitative methodology can be improved if the calculation of Impact Importance is based on the characteristics of environmental factors and project activities instead of on indicators assessed by evaluators. In this sense, this paper proposes the inclusion of the vulnerability of environmental factors and the potential environmental impact of project activities. For this purpose, the study described in this paper defined Total Impact Importance and specified a quantification procedure. The results obtained in the case study of oil drilling in Colombia reflect greater objectivity in the evaluation of impacts, as well as a positive correlation between impact values, the environmental characteristics at and near the project location, and the technical characteristics of project activities. -- Highlights: • The concept of vulnerability has been used to calculate Impact Importance. • This paper defined Total Impact Importance and specified a quantification procedure. • The method includes the characteristics of environmental factors and project activities. • The application has shown greater objectivity in the evaluation of impacts. • A better correlation between impact values, the environment and the project has been shown.

  3. Quantification of anti-Leishmania antibodies in saliva of dogs.

    Science.gov (United States)

    Cantos-Barreda, Ana; Escribano, Damián; Bernal, Luis J; Cerón, José J; Martínez-Subiela, Silvia

    2017-08-15

    Detection of serum anti-Leishmania antibodies by quantitative or qualitative techniques has been the most widely used method to diagnose Canine Leishmaniosis (CanL). Nevertheless, saliva may represent an alternative to blood because it is easy to collect, painless and non-invasive in comparison with serum. In this study, two time-resolved immunofluorometric assays (TR-IFMAs) for the quantification of anti-Leishmania IgG2 and IgA antibodies in saliva were developed and validated, and their ability to distinguish Leishmania-seronegative from seropositive dogs was evaluated. The analytical study was performed by evaluating assay precision, sensitivity and accuracy. In addition, serum from 48 dogs (21 Leishmania-seropositive and 27 Leishmania-seronegative) was analyzed by TR-IFMAs. The assays were precise, with intra- and inter-assay coefficients of variation lower than 11%, and showed a high level of accuracy, as determined by linearity under dilution (R2 = 0.99) and recovery tests (>88.60%). Anti-Leishmania IgG2 antibodies in saliva were significantly higher in the seropositive group than in the seronegative group, whereas no significant differences in anti-Leishmania IgA antibodies between the two groups were observed. Furthermore, the TR-IFMA for quantification of anti-Leishmania IgG2 antibodies in saliva showed larger differences between seropositive and seronegative dogs than the commercial assay used in serum. In conclusion, the TR-IFMAs developed may be used to quantify anti-Leishmania IgG2 and IgA antibodies in canine saliva with adequate precision, analytical sensitivity and accuracy. Quantification of anti-Leishmania IgG2 antibodies in saliva could potentially be used to evaluate the humoral response in CanL. However, IgA in saliva seemed not to have diagnostic value for this disease. For future studies, it would be desirable to evaluate the ability of the IgG2 assay to detect dogs with subclinical disease or with low antibody titers in serum, and also to study the antibody behaviour in saliva during the

  4. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Directory of Open Access Journals (Sweden)

    Žel Jana

    2006-08-01

    Full Text Available Background: Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results: Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was

  5. Critical points of DNA quantification by real-time PCR--effects of DNA extraction method and sample matrix on quantification of genetically modified organisms.

    Science.gov (United States)

    Cankar, Katarina; Stebih, Dejan; Dreo, Tanja; Zel, Jana; Gruden, Kristina

    2006-08-14

    Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to

  6. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Science.gov (United States)

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background: Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results: Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary

  7. Tannins quantification in barks of Mimosa tenuiflora and Acacia mearnsii

    Directory of Open Access Journals (Sweden)

    Leandro Calegari

    2016-03-01

    Full Text Available Due to its chemical complexity, there are several methodologies for the quantification of vegetable tannins. This work aims at quantifying both the tannin and non-tannin substances present in the barks of Mimosa tenuiflora and Acacia mearnsii by two different methods. From bark particles of both species, analytical solutions were produced using a steam-jacketed extractor. The solutions were analyzed by the Stiasny and hide-powder (not chromed) methods. For both species, tannin levels were higher when analyzed by the hide-powder method, reaching 47.8% and 24.1% for A. mearnsii and M. tenuiflora, respectively. By the Stiasny method, the tannin levels were 39.0% for A. mearnsii and 15.5% for M. tenuiflora. Although A. mearnsii presented the best results, the bark of M. tenuiflora also showed great potential due to its considerable tannin content and the availability of the species in the Caatinga biome.

  8. Quantification of interfacial segregation by analytical electron microscopy

    CERN Document Server

    Muellejans, H

    2003-01-01

    The quantification of interfacial segregation by spatial difference and one-dimensional profiling is presented in general terms, with special attention given to random and systematic uncertainties. The method is demonstrated for the example of Al-Al2O3 interfaces in a metal-ceramic composite material investigated by energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy in a dedicated scanning transmission electron microscope. The variation of segregation measured at different interfaces by both methods is within the uncertainties, indicating a constant segregation level and interfacial phase formation. The most important random uncertainty is the counting statistics of the impurity signal, whereas the specimen thickness introduces systematic uncertainties (via the k factor and the effective scan width). The latter could be significantly reduced if the specimen thickness were determined explicitly. (orig.)

  9. Imaging and Quantification of Extracellular Vesicles by Transmission Electron Microscopy.

    Science.gov (United States)

    Linares, Romain; Tan, Sisareuth; Gounou, Céline; Brisson, Alain R

    2017-01-01

    Extracellular vesicles (EVs) are cell-derived vesicles that are present in blood and other body fluids. EVs are of major interest for their diverse physiopathological roles and their potential biomedical applications. However, the characterization and quantification of EVs constitute major challenges, mainly due to their small size and the lack of methods adapted for their study. Electron microscopy has made significant contributions to the EV field since their initial discovery. Here, we describe the use of two transmission electron microscopy (TEM) techniques for imaging and quantifying EVs. Cryo-TEM combined with receptor-specific gold labeling is applied to reveal the morphology, size, and phenotype of EVs, while their enumeration is achieved after high-speed sedimentation on EM grids.

  10. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

    Composite materials are widely used in the manufacture of aerospace and wind energy structural components. These load-carrying structures are subjected to dynamic time-varying loading conditions, and robust structural dynamics identification procedures impose tight constraints on the quality of modal models. This paper aims at a systematic approach for uncertainty quantification of the parameters of modal models estimated from experimentally obtained data. Statistical analysis of modal parameters is implemented to derive an assessment of the entire modal model uncertainty measure. Investigated structures represent different complexity levels, ranging from coupon, through sub-component, up to fully assembled aerospace and wind energy structural components made of composite materials. The proposed method is demonstrated on two application cases of a small and a large wind turbine blade.

  11. Quantification methodology for the French 900 MW PWR PRA

    International Nuclear Information System (INIS)

    Ducamp, F.; Lanore, J.M.; Duchemin, B.; De Villeneuve, M.J.

    1985-02-01

    This paper develops some improvements to the classical approach to risk assessment. The calculation of the contribution to risk of one particular sequence of an event tree is composed of four stages: creation of a fault tree for each system that appears in the event trees, in terms of component faults; simplification of these fault trees into smaller ones, in terms of macrocomponents; creation of one "super-tree" by regrouping the fault trees of down systems (systems that fail in the sequence) under an AND gate, and calculation of the minimal cut sets of this super-tree, taking into account the up systems (systems that do not fail in the sequence) and peculiarities related to the initiating event if needed; and quantification of the minimal cut sets so obtained, taking into account the duration of the scenario depicted by the sequence and the possibilities of repair. Each of these steps is developed in this article.
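
    Once the super-tree's minimal cut sets are quantified, a sequence contribution reduces to the initiator frequency times the failure probabilities of the down systems times the success probabilities of the up systems. A sketch with illustrative numbers:

        f_initiator = 1e-2            # initiating-event frequency, per year
        p_down = [3e-3, 1e-2]         # failure probabilities of down systems
        p_up   = [5e-2]               # failure probabilities of up systems

        freq = f_initiator
        for p in p_down:
            freq *= p                 # down systems fail (AND gate)
        for p in p_up:
            freq *= (1.0 - p)         # up systems succeed
        print(f"sequence frequency = {freq:.3e} per year")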

  12. Quantification of fluorescence angiography in a porcine model

    DEFF Research Database (Denmark)

    Nerup, Nikolaj; Andersen, Helene Schou; Ambrus, Rikard

    2017-01-01

    PURPOSE: There is no consensus on how to quantify indocyanine green (ICG) fluorescence angiography. The aim of the present study was to establish and gather validity evidence for a method of quantifying fluorescence angiography, to assess organ perfusion. METHODS: Laparotomy was performed on seven...... pigs, with two regions of interest (ROIs) marked. ICG and neutron-activated microspheres were administered and the stomach was illuminated in the near-infrared range, parallel to continuous recording of fluorescence signal. Tissue samples from the ROIs were sent for quantification of microspheres...... to calculate the regional blood flow. A software system was developed to assess the fluorescent recordings quantitatively, and each quantitative parameter was compared with the regional blood flow. The parameter with the strongest correlation was then compared with results from an independently developed...

  13. Raman spectroscopy for DNA quantification in cell nucleus.

    Science.gov (United States)

    Okotrub, K A; Surovtsev, N V; Semeshin, V F; Omelyanchuk, L V

    2015-01-01

    Here we demonstrate the feasibility of a novel approach to quantifying DNA in cell nuclei. This approach is based on spectroscopic analysis of Raman light scattering, and avoids the problem of nonstoichiometric binding of dyes to DNA, as it directly measures the signal from DNA. Quantitative analysis of the nuclear DNA contribution to the Raman spectrum could be reliably performed using the intensity of a phosphate mode at 1096 cm-1. When compared to known DNA standards from cells of different animals, our results matched those values with an error of 10%. We therefore suggest that this approach will be useful to expand the list of DNA standards, to properly adjust the duration of hydrolysis in Feulgen staining, to assay the applicability of fuchsines for DNA quantification, and to measure DNA content in cells with complex hydrolysis patterns, when Feulgen densitometry is inappropriate. © 2014 International Society for Advancement of Cytometry.

  14. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Roger [Univ. of Southern California, Los Angeles, CA (United States)

    2017-04-18

    QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.

  15. Automated quantification of emphysema in CT studies of the lung

    International Nuclear Information System (INIS)

    Archer, D.C.; deKemp, R.A.; Coblentz, C.L.; Nahmias, C.

    1991-01-01

    Emphysema by definition is a pathologic diagnosis. Recently, in vivo quantification of emphysema from CT with point counting and with a GE 9800 CT scanner program called Density Mask has been described. These methods are laborious and time-consuming, making them unsuitable for screening. The purpose of this paper is to create a screening test for emphysema. The authors developed a computer program that quantifies the amount of emphysema from standard CT scans. The computer was programmed to recognize the lung edges on each section by identifying abrupt changes in CT numbers; grow regions within each lung to identify and separate the lungs from other structures; count regions of lung containing CT numbers measuring <-900 HU, corresponding to areas of emphysema; and calculate the percentage of emphysema present from the volumes of normal and emphysematous lung. The programs were written in C and run on a Sun 4/100 workstation.
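
    The density-mask criterion itself is a one-line threshold once the lungs are segmented. Below is a minimal re-implementation sketch in Python (the original programs were in C); the scan and mask are placeholders, not clinical data.

        import numpy as np

        def emphysema_percent(ct_hu, lung_mask, threshold=-900):
            lung = ct_hu[lung_mask]            # HU values inside the lungs
            return 100.0 * np.count_nonzero(lung < threshold) / lung.size

        ct = np.random.randint(-1000, 100, size=(5, 512, 512))  # fake scan
        mask = ct < -500                       # crude stand-in segmentation
        print(f"{emphysema_percent(ct, mask):.1f}% voxels below -900 HU")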

  16. Capacitive immunosensor for C-reactive protein quantification

    KAUST Repository

    Sapsanis, Christos

    2015-08-02

    We report an agglutination-based immunosensor for the quantification of C-reactive protein (CRP). The developed immunoassay sensor requires approximately 15 minutes of assay time per sample and provides a sensitivity of 0.5 mg/L. We measured the capacitance of interdigitated electrodes (IDEs) and quantified the concentration of added analyte. The proposed method is a label-free detection method and hence provides the rapid measurement preferable in diagnostics. We have so far been able to quantify concentrations as low as 0.5 mg/L and as high as 10 mg/L. By quantifying CRP in serum, we can assess whether patients are prone to cardiac diseases and monitor the risk associated with such diseases. The sensor is a simple, low-cost structure, and it can be a promising device for rapid and sensitive detection of disease markers at the point of care.

  17. Negative dielectrophoresis spectroscopy for rare analyte quantification in biological samples

    Science.gov (United States)

    Kirmani, Syed Abdul Mannan; Gudagunti, Fleming Dackson; Velmanickam, Logeeshan; Nawarathna, Dharmakeerthi; Lima, Ivan T., Jr.

    2017-03-01

    We propose the use of negative dielectrophoresis (DEP) spectroscopy as a technique to improve the detection limit of rare analytes in biological samples. We observe a significant dependence of the negative DEP force on functionalized polystyrene beads at the edges of interdigitated electrodes with respect to the frequency of the electric field. We measured the velocity of repulsion for 0% and 0.8% conjugation of avidin with biotin-functionalized polystyrene beads using our automated software, through real-time image processing that monitors the Rayleigh scattering from the beads. A significant difference in the velocity of the beads was observed in the presence of as few as 80 molecules of avidin per biotin-functionalized bead. This technology can be applied to the detection and quantification of rare analytes, useful in the diagnosis and treatment of diseases such as cancer and myocardial infarction, with the use of polystyrene beads functionalized with antibodies against the target biomarkers.
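
    A minimal sketch of the velocity measurement (not the authors' software) is given below; it assumes bead centroids have already been extracted frame by frame from the Rayleigh-scattering images.

        import numpy as np

        def repulsion_velocity(positions_um, frame_rate_hz):
            # positions_um: (N, 2) bead centroid track in microns.
            # Mean speed is the average per-frame displacement times the frame rate.
            steps = np.diff(positions_um, axis=0)
            speeds = np.linalg.norm(steps, axis=1) * frame_rate_hz
            return speeds.mean()  # um/s

        # Toy track: a bead repelled from an electrode edge at ~5 um/s,
        # imaged at 20 frames per second.
        t = np.arange(40) / 20.0
        track = np.column_stack([5.0 * t, np.zeros_like(t)])
        print(f"{repulsion_velocity(track, 20.0):.2f} um/s")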

  18. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of integrated computational materials engineering (ICME) and the Materials Genome Initiative is computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, we present our recent efforts to develop new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.

  19. Microplastics in Baltic bottom sediments: Quantification procedures and first results.

    Science.gov (United States)

    Zobkov, M; Esiukova, E

    2017-01-30

    Microplastics in the marine environment are known to be a global ecological problem, but there are still no standardized analysis procedures for their quantification. The first breakthrough in this direction was the NOAA Laboratory Methods for quantifying synthetic particles in water and sediments, but fiber counts have been found to be underestimated with this approach. We propose modifications of these methods that allow microplastics in bottom sediments, including small fibers, to be analyzed. The addition of an internal standard to sediment samples and occasional empty runs are advised for analysis quality control. The microplastics extraction efficiency using the proposed modifications is 92±7%. The distribution of microplastics in bottom sediments of the Russian part of the Baltic Sea is presented. Microplastic particles were found in all of the samples, with an average concentration of 34±10 items/kg DW, of the same order of magnitude as reported in neighboring studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
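
    The quality-control arithmetic described above is simple recovery accounting; the sketch below uses invented counts chosen to match the 92% figure.

        def extraction_efficiency(recovered, spiked):
            # Recovery rate of internal-standard particles, in percent.
            return 100.0 * recovered / spiked

        def blank_corrected_count(sample_count, blank_count):
            # Subtract particles found in an empty (blank) run.
            return max(sample_count - blank_count, 0)

        print(extraction_efficiency(23, 25))   # 92.0, cf. 92±7% above
        print(blank_corrected_count(36, 1))    # 35 items before scaling to items/kg DW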

  20. Quantification of osteolytic bone lesions in a preclinical rat trial

    Science.gov (United States)

    Fränzle, Andrea; Bretschi, Maren; Bäuerle, Tobias; Giske, Kristina; Hillengass, Jens; Bendl, Rolf

    2013-10-01

    In breast cancer, most patients who die have developed bone metastases as the disease progressed. Bone metastases in breast cancer are mainly bone-destructive (osteolytic). To understand pathogenesis and to analyse response to different treatments, animal models, in our case rats, are examined. For the assessment of treatment response to bone remodelling therapies, exact segmentations of osteolytic lesions are needed. Manual segmentations are not only time-consuming but also lack reproducibility. Computerized segmentation tools are therefore essential. In this paper we present an approach for the computerized quantification of osteolytic lesion volumes using a comparison to a healthy reference model. The qualitative and quantitative evaluation of the reconstructed bone volumes shows that the automatically segmented lesion volumes complete the missing bone in a reasonable way.
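
    A minimal sketch of the reference-model comparison (not the authors' pipeline) treats a lesion as bone that is present in a registered healthy reference but absent from the scan:

        import numpy as np

        def lesion_volume_ml(healthy_ref_mask, bone_mask, voxel_mm3):
            # Voxels with bone in the healthy reference but not in the
            # actual segmentation are counted as osteolytic lesion.
            lesion = healthy_ref_mask & ~bone_mask
            return lesion.sum() * voxel_mm3 / 1000.0  # mm^3 -> ml

        # Toy example: 0.2 mm isotropic voxels, a small cubic defect.
        ref = np.ones((50, 50, 50), dtype=bool)
        bone = ref.copy()
        bone[20:30, 20:30, 20:30] = False  # destroyed bone
        print(f"{lesion_volume_ml(ref, bone, 0.2 ** 3):.3f} ml")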

  1. Thermostability of biological systems: fundamentals, challenges, and quantification.

    Science.gov (United States)

    He, Xiaoming

    2011-01-01

    This review examines the fundamentals and challenges of engineering and understanding the thermostability of biological systems over a wide temperature range (from the cryogenic to the hyperthermic regime). Applications of bio-thermostability engineering, either to destroy unwanted biologicals or to stabilize useful ones for the treatment of disease in modern medicine, are first introduced. Studies on the biological responses to cryogenic and hyperthermic temperatures for the various applications are reviewed to understand the mechanisms of thermal (both cryogenic and hyperthermic) injury and its quantification at the molecular, cellular and tissue/organ levels. Methods for quantifying the thermophysical processes of the various applications are then summarized, accounting for the effects of blood perfusion, metabolism, water transport across the cell plasma membrane, and phase transitions (both equilibrium and non-equilibrium, such as ice formation and glass transition) of water. The review concludes with a summary of the status quo and future perspectives in engineering the thermostability of biological systems.

  2. Aspect-Oriented Programming is Quantification and Obliviousness

    Science.gov (United States)

    Filman, Robert E.; Friedman, Daniel P.; Norvig, Peter (Technical Monitor)

    2000-01-01

    This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions. Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.
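
    As a loose analogy in Python, rather than a real AOP language such as AspectJ, the sketch below makes one quantified assertion ("trace every public function") over base code whose author is oblivious to it; all names are invented.

        import functools
        import types

        def trace(fn):
            # Advice: behavior woven in at each join point (function call).
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                print(f"calling {fn.__name__} with {args}")
                return fn(*args, **kwargs)
            return wrapper

        def apply_aspect(module, predicate):
            # Quantification: apply the advice to every function whose name
            # satisfies the predicate, without editing its source.
            for name, obj in list(vars(module).items()):
                if isinstance(obj, types.FunctionType) and predicate(name):
                    setattr(module, name, trace(obj))

        # Base program, written with no knowledge of any aspect.
        base = types.ModuleType("base")
        exec("def pay(amount): return round(amount * 1.2, 2)\n"
             "def _helper(x): return x", base.__dict__)

        apply_aspect(base, predicate=lambda name: not name.startswith("_"))
        print(base.pay(100))  # the call is traced, then returns 120.0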

  3. Uncertainty quantification in computational fluid dynamics and aircraft engines

    CERN Document Server

    Montomoli, Francesco; D'Ammaro, Antonio; Massini, Michela; Salvadori, Simone

    2015-01-01

    This book introduces novel design techniques developed to increase the safety of aircraft engines. The authors demonstrate how the application of uncertainty methods can overcome problems in the accurate prediction of engine lift caused by manufacturing error. This in turn ameliorates the difficulty of achieving the required safety margins imposed by limits in current design and manufacturing methods. This text shows that even state-of-the-art computational fluid dynamics (CFD) cannot predict the performance measured in experiments: CFD methods assume idealised geometries, but ideal geometries do not exist, cannot be manufactured, and their performance differs from that of real-world parts. By applying geometrical variations of a few microns, the agreement with experiments improves dramatically, but unfortunately the manufacturing errors in engines or in experiments are unknown. In order to overcome this limitation, uncertainty quantification considers the probability density functions of manufacturing errors...
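
    Uncertainty quantification of the kind the book describes is often demonstrated by propagating a manufacturing-error distribution through a performance model with Monte Carlo sampling; in the sketch below both the surrogate model and the error statistics are invented stand-ins for a CFD computation.

        import numpy as np

        rng = np.random.default_rng(0)

        def performance(tip_gap_um):
            # Toy surrogate (not a CFD model): efficiency degrading with
            # tip-gap deviation from the ideal geometry.
            return 0.92 - 0.004 * tip_gap_um - 0.0002 * tip_gap_um ** 2

        # Assumed manufacturing error: a few microns, normally distributed.
        gaps = rng.normal(loc=3.0, scale=1.0, size=100_000)
        eff = performance(gaps)

        print(f"mean efficiency: {eff.mean():.4f}")
        print(f"95% interval   : {np.percentile(eff, [2.5, 97.5]).round(4)}")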

  4. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States); El-Azab, Anter [Florida State Univ., Tallahassee, FL (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Peterson, John W. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Polyakov, Peter [Univ. of Wyoming, Laramie, WY (United States); Tavener, Simon [Colorado State Univ., Fort Collins, CO (United States); Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States); Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-23

    In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  5. Reliability and discriminatory power of methods for dental plaque quantification

    Directory of Open Access Journals (Sweden)

    Daniela Prócida Raggio

    2010-04-01

    OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and a fluorescence camera (FC) in detecting plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with the FC and a digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare the different conditions of the samples and to assess inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of the area covered by disclosed plaque in the FC images presented the highest discriminatory power. CONCLUSION: The Turesky index and FC images with disclosing present good reliability and discriminatory power in quantifying dental plaque.
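
    Since inter-examiner reproducibility here rests on the Kappa statistic, a minimal sketch of Cohen's kappa for two examiners' categorical plaque scores follows; the scores are invented for illustration.

        import numpy as np

        def cohens_kappa(scores_a, scores_b):
            a, b = np.asarray(scores_a), np.asarray(scores_b)
            po = np.mean(a == b)  # observed agreement
            # Chance agreement from each examiner's marginal frequencies.
            pe = sum(np.mean(a == c) * np.mean(b == c) for c in np.union1d(a, b))
            return (po - pe) / (1.0 - pe)

        # Two examiners scoring the same ten blocks with a 0-5 plaque index.
        ex1 = [0, 1, 2, 3, 3, 4, 5, 2, 1, 0]
        ex2 = [0, 1, 2, 3, 4, 4, 5, 2, 2, 0]
        print(f"kappa = {cohens_kappa(ex1, ex2):.2f}")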

  6. Ideas underlying the Quantification of Margins and Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Pilch, Martin, E-mail: mpilch@sandia.gov [Department 1514, Sandia National Laboratories, Albuquerque, NM 87185-0828 (United States); Trucano, Timothy G. [Department 1411, Sandia National Laboratories, Albuquerque, NM 87185-0370 (United States); Helton, Jon C. [Department of Mathematics and Statistics, Arizona State University, Tempe, AZ 85287-1804 (United States)

    2011-09-15

    Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.
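
    As a hedged sketch of two of the ideas named above (not the authors' formulation), the snippet below separates epistemic uncertainty (outer loop) from aleatory variability (inner loop) with nested Monte Carlo sampling and reports a margin-over-uncertainty style summary; the model, threshold, and distributions are invented.

        import numpy as np

        rng = np.random.default_rng(1)
        REQUIREMENT = 10.0  # performance threshold the response must stay below

        def response(x, theta):
            # Toy model: aleatory input x, epistemic parameter theta.
            return theta * x + 2.0

        # Outer loop: epistemic uncertainty in theta (imperfect knowledge).
        p99_per_theta = []
        for theta in rng.uniform(1.0, 1.4, size=200):
            # Inner loop: aleatory variability of the operating condition.
            x = rng.normal(4.0, 0.5, size=2_000)
            p99_per_theta.append(np.percentile(response(x, theta), 99))

        best_estimate = np.median(p99_per_theta)
        margin = REQUIREMENT - best_estimate
        uncertainty = np.percentile(p99_per_theta, 95) - best_estimate
        print(f"M = {margin:.2f}, U = {uncertainty:.2f}, M/U = {margin / uncertainty:.2f}")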

  7. Plasticity models of material variability based on uncertainty quantification techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-11-01

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.

  8. Study on quantification of HBs-antibody by immunoradiometric assay

    International Nuclear Information System (INIS)

    Kondo, Yuichi; Itoi, Yoshihiro; Kajiyama, Shizuo

    1989-01-01

    Quantification of the HBs-antibody assay was carried out using a commercial assay kit and standard solutions of HBs-antibody recognized as the 1st Reference Preparation of hepatitis B immunoglobulin by WHO. The standard curve of HBs-antibody was drawn with a third-degree (cubic) spline function, and the correlation coefficient was r = 0.999. The intra-assay coefficient of variation was 3.8% and the inter-assay coefficient of variation was 7.8%. Dilution tests showed satisfactory results in the range of 2- to 16-fold dilution. The correlation between cut-off index values and HBs-antibody concentration was given by y = 2.599x - 3.894 (r = 0.992), and a cut-off index of 2.1 corresponded to about 5 mIU/ml of HBs-antibody. (author)
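
    A spline standard curve of the kind described can be sketched as follows; the calibrator counts and concentrations are illustrative, not the kit's actual values.

        import numpy as np
        from scipy.interpolate import CubicSpline

        # Hypothetical calibrators traceable to the WHO reference preparation.
        conc_miu_ml = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])
        counts = np.array([300.0, 1500.0, 2800.0, 6200.0, 11000.0, 19000.0])

        # Cubic-spline standard curve: counts as a function of concentration.
        curve = CubicSpline(conc_miu_ml, counts)

        def concentration_from_counts(c):
            # Invert the monotonic standard curve numerically on a fine grid.
            grid = np.linspace(0.0, 100.0, 10_001)
            return grid[np.argmin(np.abs(curve(grid) - c))]

        print(f"{concentration_from_counts(4000.0):.1f} mIU/ml")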

  9. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    Science.gov (United States)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
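
    A toy version of such a Monte Carlo framework, with a linear forward model and a least-squares retrieval standing in for an operational algorithm, might look like the following; all matrices, states, and noise levels are invented.

        import numpy as np

        rng = np.random.default_rng(7)

        # Linear forward model: radiances y = K @ x + noise.
        K = np.array([[1.0, 0.4],
                      [0.2, 1.3],
                      [0.7, 0.9]])
        x_true = np.array([400.0, 2.0])  # hypothetical geophysical state
        noise_sd = 0.5

        def retrieve(y):
            # Stand-in retrieval: least squares on the forward model.
            return np.linalg.lstsq(K, y, rcond=None)[0]

        # Simulate many noisy observations of the same scene and collect
        # the sampling distribution of the retrieved estimates.
        samples = np.array([retrieve(K @ x_true + rng.normal(0.0, noise_sd, size=3))
                            for _ in range(5_000)])
        print("mean retrieved state:", samples.mean(axis=0).round(2))
        print("per-component sd    :", samples.std(axis=0).round(3))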

  10. Nanodiamond arrays on glass for quantification and fluorescence characterisation.

    Science.gov (United States)

    Heffernan, Ashleigh H; Greentree, Andrew D; Gibson, Brant C

    2017-08-23

    Quantifying the variation in emission properties of fluorescent nanodiamonds is important for developing their wide-ranging applicability. Directed self-assembly techniques show promise for positioning nanodiamonds precisely, enabling such quantification. Here we show an approach for depositing nanodiamonds in pre-determined arrays, which are used to gather statistical information about fluorescence lifetimes. The arrays were created via a layer of photoresist patterned with grids of apertures using electron beam lithography and then drop-cast with nanodiamonds. Electron microscopy revealed a 90% average deposition yield across 3,376 populated array sites, with an average of 20 nanodiamonds per site. Confocal microscopy, optimised for nitrogen-vacancy fluorescence collection, revealed a broad distribution of fluorescence lifetimes, in agreement with the literature. This method for statistically quantifying fluorescent nanoparticles provides a step towards the fabrication of hybrid photonic devices for applications from quantum cryptography to sensing.
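
    Gathering lifetime statistics from such arrays typically involves fitting a decay model at each site; a minimal sketch with a single-exponential model and synthetic data (an assumed NV-like lifetime of about 17 ns) follows.

        import numpy as np
        from scipy.optimize import curve_fit

        def decay(t, a, tau, bg):
            # Single-exponential fluorescence decay with constant background.
            return a * np.exp(-t / tau) + bg

        def fit_lifetime(t_ns, counts):
            p0 = (counts.max(), 10.0, counts.min())  # rough initial guess
            params, _ = curve_fit(decay, t_ns, counts, p0=p0)
            return params[1]  # the lifetime tau, in ns

        rng = np.random.default_rng(3)
        t = np.linspace(0.0, 100.0, 200)
        counts = decay(t, 1000.0, 17.0, 20.0) + rng.normal(0.0, 5.0, t.size)
        print(f"fitted lifetime: {fit_lifetime(t, counts):.1f} ns")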

  11. Impact and quantification of the sources of error in DNA pooling designs.

    Science.gov (United States)

    Jawaid, A; Sham, P

    2009-01-01

    The analysis of genome-wide variation offers the possibility of unravelling the genes involved in the pathogenesis of disease. Genome-wide association studies are also particularly useful for identifying and validating targets for therapeutic intervention, as well as for detecting markers of drug efficacy and side effects. The cost of such large-scale genetic association studies may be reduced substantially by the analysis of pooled DNA from multiple individuals. However, experimental errors inherent in pooling studies lead to a potential increase in the false positive rate and a loss of power compared with individual genotyping. Here we quantify various sources of experimental error, using empirical data from typical pooling experiments and corresponding individual genotyping counts together with two statistical methods. We provide analytical formulas for calculating these different errors in the absence of complete information, such as replicate pool formation, and for adjusting for the errors in the statistical analysis. We demonstrate that DNA pooling has the potential to estimate allele frequencies accurately, and that adjusting the pooled allele frequency estimates for differential allelic amplification considerably improves accuracy. Estimates of the components of error show that differential allelic amplification is the most important contributor to the error variance in absolute allele frequency estimation, followed by allele frequency measurement and pool formation errors. Our results emphasise the importance of minimising experimental errors and obtaining correct error estimates in genetic association studies.
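
    A standard correction of this kind, shown here in generic form rather than as the authors' exact estimator, divides the pooled signal by a differential-amplification factor k estimated from heterozygous individuals:

        def corrected_pool_frequency(signal_a, signal_b, k):
            # signal_a, signal_b: peak heights/areas of the two alleles in the pool.
            # k: amplification ratio of allele A to allele B, typically the mean
            #    A/B signal ratio measured in known heterozygotes.
            return signal_a / (signal_a + k * signal_b)

        # Illustration: a raw ratio overestimates the A-allele frequency
        # when allele A amplifies 1.3x more strongly than allele B.
        a, b = 65.0, 35.0
        print(f"uncorrected: {a / (a + b):.3f}")                          # 0.650
        print(f"corrected  : {corrected_pool_frequency(a, b, 1.3):.3f}")  # 0.588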

  12. Patterns of cross-contamination in a multispecies population genomic project: detection, quantification, impact, and solutions.

    Science.gov (United States)

    Ballenghien, Marion; Faivre, Nicolas; Galtier, Nicolas

    2017-03-29

    Contamination is a well-known but often neglected problem in molecular biology. Here, we investigated the prevalence of cross-contamination among 446 samples from 116 distinct species of animals, which were processed in the same laboratory and subjected to subcontracted transcriptome sequencing. Using cytochrome oxidase 1 as a barcode, we identified a minimum of 782 events of between-species contamination, with approximately 80% of our samples being affected. An analysis of laboratory metadata revealed a strong effect of the sequencing center: nearly all the detected events of between-species contamination involved species that were sent the same day to the same company. We introduce new methods to assess the amount of within-species, between-individual contamination, and to correct for this problem when calling genotypes from base read counts. We report evidence for pervasive within-species contamination in this data set, and show that classical population genomic statistics, such as synonymous diversity, the ratio of non-synonymous to synonymous diversity, the inbreeding coefficient F_IT, and Tajima's D, are sensitive to this problem to various extents. Control analyses suggest that our published results are probably robust to the problem of contamination. Recommendations on how to prevent or avoid contamination in large-scale population genomics/molecular ecology are provided based on this analysis.

  13. The Impact of Soil Reflectance on the Quantification of the Green Vegetation Fraction from NDVI

    Science.gov (United States)

    Montandon, L. M.; Small, E. E.

    2008-01-01

    The green vegetation fraction (Fg) is an important climate and hydrologic model parameter. A common method to calculate Fg is to create a simple linear mixing model between two NDVI endmembers: bare soil NDVI (NDVI(sub o)) and full vegetation NDVI (NDVI(sub infinity)). Usually it is assumed that NDVI(sub o) is close to zero (NDVI(sub o) approx. -0.05), and it is generally chosen from the lowest observed NDVI values. However, the mean soil NDVI computed from 2906 samples is much larger (NDVI = 0.2) and highly variable (standard deviation = 0.1). We show that underestimating NDVI(sub o) yields overestimates of Fg. The largest errors occur in grassland and shrubland areas. Using parameters for NDVI(sub o) and NDVI(sub infinity) derived from global scenes yields overestimates of Fg (Delta Fg*) larger than 0.2 for the majority of U.S. land cover types when pixel NDVI values are intermediate (0.2 < NDVI(sub pixel) < NDVI(sub infinity)). When using conterminous U.S. scenes to derive NDVI(sub o) and NDVI(sub infinity), the overestimate is smaller (0.10-0.17 over the same NDVI(sub pixel) range of the NDVI cycle). We propose using global databases of NDVI(sub o) along with information on historical NDVI(sub pixel) values to compute a statistically most-likely estimate of Fg (Fg*). Using in situ measurements made at the Sevilleta LTER, we show that this approach yields better estimates of Fg than using globally invariant NDVI(sub o) values estimated from whole scenes. At the two study sites, the Fg estimate was adjusted by 52% at the grassland and by 86% at the shrubland. More significant advances will require information on the spatial distribution of soil reflectance.
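
    The linear mixing model and the size of the bias discussed above can be illustrated directly; the full-vegetation endmember below is an assumed value.

        import numpy as np

        def green_fraction(ndvi_pixel, ndvi_soil, ndvi_full):
            # Fg = (NDVI - NDVI_o) / (NDVI_inf - NDVI_o), clipped to [0, 1].
            fg = (ndvi_pixel - ndvi_soil) / (ndvi_full - ndvi_soil)
            return np.clip(fg, 0.0, 1.0)

        # Assuming NDVI_o = -0.05 when the mean measured soil NDVI is 0.2
        # inflates Fg, most strongly at intermediate pixel NDVI.
        ndvi_pixel = 0.35
        print(green_fraction(ndvi_pixel, -0.05, 0.85))  # commonly assumed endmember
        print(green_fraction(ndvi_pixel, 0.20, 0.85))   # mean measured soil NDVI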

  14. Optimization of concrete for prefabrication and quantification of its environmental impact

    NARCIS (Netherlands)

    Onghena, S.; Grunewald, S.; Schutter, G.

    2016-01-01

    The development of strength is an important criterion for the production of prefabricated concrete elements. With seasonal changes of temperature that affect the development of concrete strength, daily cycles of often 18 hours or shorter have to be maintained. The use of Ordinary Portland Cement

  15. Plasticity detection and quantification in monopile support structures due to axial impact loading

    NARCIS (Netherlands)

    Meijers, P.C.; Tsouvalas, A.; Metrikine, A.

    2018-01-01

    Recent developments in the construction of offshore wind turbines have created the need for a method to detect whether a monopile foundation is plastically deformed during the installation procedure. Since measurements at the pile head are difficult to perform, a method based on measurements at a

  16. Sediment dynamics in the Rhine catchment : Quantification of fluvial response to climate change and human impact

    NARCIS (Netherlands)

    Erkens, G.

    2009-01-01

    Fluvial systems are strongly responsive to changes in climate and land use — but take their time to show it. Accurate prediction of the timing and degree of future fluvial response requires comprehensive understanding of fluvial response in the past. This PhD-thesis studied the response of the river

  17. Quantification of the Potential Gross Economic Impacts of Five Methane Reduction Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Keyser, David [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Warner, Ethan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Curley, Christina [Colorado State Univ., Fort Collins, CO (United States)

    2015-04-23

    Methane (CH4) is a potent greenhouse gas that is released from the natural gas supply chain into the atmosphere as a result of fugitive emissions and venting. We assess five potential CH4 reduction scenarios for transmission, storage, and distribution (TS&D) using published literature on the costs and the estimated quantity of CH4 reduced. We utilize cost and methane inventory data from ICF (2014) and Warner et al. (forthcoming), as well as data from Barrett and McCulloch (2014) and the American Gas Association (AGA) (2013), to estimate that the implementation of these measures could support approximately 85,000 jobs annually from 2015 to 2019 and reduce CH4 emissions from natural gas TS&D by over 40%. Based on standard input/output analysis methodology, the measures are estimated to support over $8 billion in GDP annually over the same time period and to allow producers to recover approximately $912 million annually in captured gas.
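
    The "standard input/output analysis methodology" cited above computes the total economic activity induced by a final-demand shock via the Leontief inverse; the generic sketch below is not NREL's model, and all coefficients are invented.

        import numpy as np

        # Toy 3-sector technical-coefficients matrix A: each column gives
        # inter-industry purchases per dollar of that sector's output.
        A = np.array([[0.10, 0.05, 0.02],
                      [0.15, 0.20, 0.10],
                      [0.05, 0.10, 0.15]])

        # Final-demand shock: $1M of mitigation spending split across sectors.
        y = np.array([0.2, 0.6, 0.2])  # $M

        # Total output x solves x = A @ x + y, i.e. x = (I - A)^-1 @ y.
        x = np.linalg.solve(np.eye(3) - A, y)

        jobs_per_million = np.array([5.0, 8.0, 6.0])  # assumed employment factors
        print(f"total output ${x.sum():.2f}M, jobs supported {x @ jobs_per_million:.1f}")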

  18. Cerebellar heterogeneity and its impact on PET data quantification of 5-HT receptor radioligands

    DEFF Research Database (Denmark)

    Ganz, Melanie; Feng, Ling; Hansen, Hanne Demant

    2017-01-01

    standardized uptake values (SUV) and nondisplaceable neocortical binding potential (BPND). Statistical difference was assessed with paired nonparametric two-sided Wilcoxon signed-rank tests and multiple comparison corrected via false discovery rate. We demonstrate significant radioligand specific regional...

  19. Impacts of light-absorbing impurities on snow and their quantification with bidirectional reflectance measurements

    Science.gov (United States)

    Gritsevich, Maria; Peltoniemi, Jouni; Meinander, Outi; Dagsson-Waldhauserová, Pavla; Zubko, Nataliya; Hakala, Teemu; Virkkula, Aki; Svensson, Jonas; de Leeuw, Gerrit

    2017-04-01

    In order to quantify the effects of absorbing impurities on snow and define their contribution to climate change, we have conducted a series of dedicated bidirectional reflectance measurements. Chimney soot, volcanic sand, and glaciogenic silt were deposited on the snow in a controlled way. The bidirectional reflectance factors of these targets and of untouched snow were measured using the Finnish Geodetic Institute's field goniospectrometer FIGIFIGO, see, e.g., [1, 2] and references therein. It has been found that the contaminants darken the snow and modify its appearance mostly as expected, with a clear directional signal and a modest spectral signal. A remarkable feature is that any absorbing contaminant on snow enhances metamorphosis under strong sunlight [3, 4]. Immediately after deposition, the contaminated snow surface appears darker than pure snow in all viewing directions, but the heated soot particles start sinking deep into the snow within minutes. The nadir measurement remains darkest, but at larger zenith angles the surface of the soot-contaminated snow changes back to almost as white as clean snow. Thus, for a ground-based observer the darkening by impurities can be completely invisible, overestimating the albedo, while a nadir-looking satellite sees the darkest points, underestimating the albedo. After more time, the nadir view also brightens, and the remaining impurities may be biased towards more shadowed locations or less absorbing orientations by natural selection. This suggests that at noon the albedo should be lower than in the morning or afternoon. When sunlight stimulates more sinking than melting, the albedo should be higher in the afternoon than in the morning, and vice versa when melting dominates. Thus, to estimate the climate effects of black carbon (BC) deposited on snow, one may need to take into account the possible rapid diffusion of BC from the snow surface into the snowpack. When the snow melt rate becomes faster than the diffusion rate (under warm outside temperatures), as observed at the end of the experiment reported here, dark material starts accumulating at the surface [5]. BC deposited on snow at warm temperatures initiates a rapid melting process and may cause dramatic changes to the snow surface.

    References:
    [1] Peltoniemi J.I., Hakala T., Suomalainen J., Honkavaara E., Markelin L., Gritsevich M., Eskelinen J., Jaanson P., Ikonen E. (2014): Technical notes: A detailed study for the provision of measurement uncertainty and traceability for goniospectrometers. Journal of Quantitative Spectroscopy & Radiative Transfer, 146, 376-390, http://dx.doi.org/10.1016/j.jqsrt.2014.04.011
    [2] Zubko N., Gritsevich M., Zubko E., Hakala T., Peltoniemi J.I. (2016): Optical measurements of chemically heterogeneous particulate surfaces. Journal of Quantitative Spectroscopy and Radiative Transfer, 178, 422-431, http://dx.doi.org/10.1016/j.jqsrt.2015.12.010
    [3] Peltoniemi J.I., Gritsevich M., Hakala T., Dagsson-Waldhauserová P., Arnalds Ó., Anttila K., Hannula H.-R., Kivekäs N., Lihavainen H., Meinander O., Svensson J., Virkkula A., de Leeuw G. (2015): Soot on snow experiment: bidirectional reflectance factor measurements of contaminated snow. The Cryosphere, 9, 2323-2337, http://dx.doi.org/10.5194/tc-9-2323-2015
    [4] Svensson J., Virkkula A., Meinander O., Kivekäs N., Hannula H.-R., Järvinen O., Peltoniemi J.I., Gritsevich M., Heikkilä A., Kontu A., Neitola K., Brus D., Dagsson-Waldhauserova P., Anttila K., Vehkamäki M., Hienola A., de Leeuw G., Lihavainen H. (2016): Soot-doped natural snow and its albedo — results from field experiments. Boreal Environment Research, 21, 481-503, http://www.borenv.net/BER/pdfs/preprints/Svensson1498.pdf
    [5] Meinander O., Kontu A., Virkkula A., Arola A., Backman L., Dagsson-Waldhauserová P., Järvinen O., Manninen T., Svensson J., de Leeuw G., Leppäranta M. (2014): Brief communication: Light-absorbing impurities can reduce the density of melting snow. The Cryosphere, 8, 991-995, doi:10.5194/tc-8-991-2014

  20. The Serengeti food web : Empirical quantification and analysis of topological changes under increasing human impact

    NARCIS (Netherlands)

    de Visser, Sara N.; Freymann, Bernd P.; Olff, Han

    1. To address effects of land use and human overexploitation on wildlife populations, it is essential to better understand how human activities have changed species composition, diversity and functioning. Theoretical studies modelled how network properties change under human-induced, non-random