WorldWideScience

Sample records for benefits quantification methodology

  1. Renewable Electricity Benefits Quantification Methodology: A Request for Technical Assistance from the California Public Utilities Commission

    Energy Technology Data Exchange (ETDEWEB)

    Mosey, G.; Vimmerstedt, L.

    2009-07-01

    The California Public Utilities Commission (CPUC) requested assistance in identifying methodological alternatives for quantifying the benefits of renewable electricity. The context is the CPUC's analysis of a 33% renewable portfolio standard (RPS) in California--one element of California's Climate Change Scoping Plan. The information would be used to support development of an analytic plan to augment the cost analysis of this RPS (which recently was completed). NREL has responded to this request by developing a high-level survey of renewable electricity effects, quantification alternatives, and considerations for selection of analytic methods. This report addresses economic effects and health and environmental effects, and provides an overview of related analytic tools. Economic effects include jobs, earnings, gross state product, and electricity rate and fuel price hedging. Health and environmental effects include air quality and related public-health effects, solid and hazardous wastes, and effects on water resources.

  2. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    International Nuclear Information System (INIS)

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code and may replace the expert judgment generally used. In this study, an uncertainty quantification of the reflood model was performed using the CIRCE methodology; the application process and the main results are briefly described. The application of CIRCE provided satisfactory results. This research is expected to be useful for improving the present audit calculation methodology, KINS-REM.

  3. A Micropillar Compression Methodology for Ductile Damage Quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to significantly influence the failure of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  4. A micropillar compression methodology for ductile damage quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to significantly influence the failure of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  5. A Project-Based Quantification of BIM Benefits

    Directory of Open Access Journals (Sweden)

    Jian Li

    2014-08-01

    In the construction industry, research is being carried out to look for feasible methods and technologies to cut down project costs and waste. Building Information Modelling (BIM) is currently a promising technology/method that can achieve this. The output of the construction industry has a considerable scale; however, the concentration of the industry and the level of informatization are still not high. There is still a large gap in terms of productivity between the construction industry and other industries. Due to the lack of first-hand data on how much of an effect BIM can genuinely have in real cases, it is unrealistic for construction stakeholders to take the risk of widely adopting BIM. This paper focuses on the methodological quantification (through a case study approach) of BIM's benefits in building construction resource management and real-time cost control, in contrast to traditional non-BIM technologies. Through the use of BIM technology for the dynamic querying and statistical analysis of construction schedules, engineering, resources and costs, the three implementations considered demonstrate how BIM can facilitate a comprehensive grasp of a project's implementation and progress, identify and solve the contradictions and conflicts between construction resources and cost controls, reduce project over-spends and protect the supply of resources.

  6. Quantification of pelvic floor muscle strength in female urinary incontinence: A systematic review and comparison of contemporary methodologies.

    Science.gov (United States)

    Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J

    2018-01-01

    There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review via Medline, PubMed, CINAHL, and the Cochrane database was performed; key terms for pelvic floor anatomy and function were cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full-text peer-reviewed articles in English having female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology as direct or indirect measures, and the benefits and limitations of the technique. A total of 1586 articles were identified, of which 50 met the inclusion criteria. Nine methodologies for determining PFM strength were described, including: digital palpation, perineometer, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are allowing for increased sensitivity. However, the most common methods of quantification remain digital palpation and perineometry; techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by an inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.

  7. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for the quantification of chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for the quantification of lung damage caused by TB through chest radiographs. An algorithm for the computational processing of the examinations was developed in Matlab, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit for the patient and cost-benefit ratio for the institution. (author)
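
    The Bland-Altman agreement check mentioned above can be summarised in a few lines. The sketch below is a generic illustration, not the authors' code; the two input arrays are hypothetical lesion volumes from radiographs and CT for the same patients.

        # Minimal Bland-Altman sketch: mean bias, 95% limits of agreement, and the
        # fraction of paired differences falling within those limits.
        import numpy as np

        def bland_altman(radiograph_vals, ct_vals):
            a = np.asarray(radiograph_vals, dtype=float)
            b = np.asarray(ct_vals, dtype=float)
            diff = a - b                                  # per-patient difference between methods
            bias = diff.mean()                            # systematic offset
            sd = diff.std(ddof=1)                         # spread of the differences
            loa = (bias - 1.96 * sd, bias + 1.96 * sd)    # 95% limits of agreement
            within = np.mean((diff >= loa[0]) & (diff <= loa[1]))
            return bias, loa, within

        # Hypothetical volumes (arbitrary units), for illustration only.
        bias, loa, frac = bland_altman([10.2, 7.9, 15.1, 4.3], [11.0, 8.4, 14.2, 4.9])
        print(f"bias={bias:.2f}, limits of agreement=({loa[0]:.2f}, {loa[1]:.2f}), within={frac:.0%}")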

  8. Recommendations for benefit-risk assessment methodologies and visual representations

    DEFF Research Database (Denmark)

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul

    2016-01-01

    PURPOSE: The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. METHODS: Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. RESULTS: A general pathway through the case studies...

  9. NASA Electronic Publishing System: Cost/benefit Methodology

    Science.gov (United States)

    Tuey, Richard C.

    1994-01-01

    The NASA Scientific and Technical Information Office was assigned the responsibility to examine the benefits of the utilization of electronic printing and duplicating systems throughout NASA Installations and Headquarters. The subject of this report is the documentation of the methodology used in justifying the acquisition of the most cost-beneficial solution for the printing and duplicating requirements of a duplicating facility that is contemplating the acquisition of an electronic printing and duplicating system. Four alternatives are presented, with each alternative costed out with its associated benefits. The methodology goes a step further than a simple cost-benefit analysis through its comparison of the risks associated with each alternative, the sensitivity to the number of impressions, the productivity gains of the selected alternative and, finally, the return on investment for the selected alternative. The report can be used in conjunction with two earlier reports, NASA-TM-106242 and TM-106510, in guiding others in determining the most cost-effective duplicating alternative.

  10. Development of Accident Scenarios and Quantification Methodology for RAON Accelerator

    International Nuclear Information System (INIS)

    Lee, Yongjin; Jae, Moosung

    2014-01-01

    The RISP (Rare Isotope Science Project) plans to provide neutron-rich isotopes (RIs) and stable heavy ion beams. The accelerator is defined as a radiation production system according to the Nuclear Safety Law. Therefore, it needs strict operating procedures and safety assurance to prevent radiation exposure. In order to satisfy this condition, there is a need to evaluate the potential risk of the accelerator from the design stage itself. Though some PSA studies have been conducted for accelerators, most of them focus not on general accident sequences but on simple descriptions of accidents. In this paper, general accident scenarios are developed with Event Trees, and a new quantification methodology for the Event Tree is deduced. In this study, some initiating events, which may occur in the accelerator, are selected. Using the selected initiating events, the accident scenarios of the accelerator facility are developed with Event Trees. These results can be used as basic data of the accelerator for future risk assessments. After analyzing the probability of each heading, it is possible to conduct quantification and evaluate the significance of the accident result. If the accident scenarios for external events are developed, the risk assessment of the entire accelerator facility will be complete. To reduce the uncertainty of the Event Tree, reliable data can be produced via the presented quantification techniques.
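
    The core of event tree quantification referred to above is simple: each accident sequence frequency is the initiating-event frequency multiplied by the success or failure probabilities of the headings along that branch. The sketch below is a hedged illustration; the heading names and numbers are invented and are not RAON data.

        # Illustrative event tree sequence quantification (hypothetical values).
        initiating_event_freq = 1.0e-2          # initiating event frequency [1/yr], assumed
        heading_failure_prob = {                # failure probability of each heading, assumed
            "beam_shutoff": 1.0e-3,
            "shielding_integrity": 1.0e-2,
            "access_control": 5.0e-2,
        }

        def sequence_frequency(freq, heading_states):
            """heading_states maps heading -> True (failed) or False (succeeded)."""
            f = freq
            for name, failed in heading_states.items():
                p = heading_failure_prob[name]
                f *= p if failed else (1.0 - p)
            return f

        # Example sequence: beam shutoff fails, shielding holds, access control fails.
        seq = {"beam_shutoff": True, "shielding_integrity": False, "access_control": True}
        print(f"sequence frequency ~ {sequence_frequency(initiating_event_freq, seq):.2e} per year")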

  11. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdf's), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions for which the accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions.
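
    Two of the building blocks named above, a non-parametric density estimate of the model error and a Kruskal-Wallis test to decide whether error samples from different regions of the physical space may be pooled, can be sketched with generic SciPy tools. This is not the authors' estimator, and the error samples below are synthetic.

        # Generic sketch: non-parametric error pdf plus a Kruskal-Wallis pooling test.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        errors_region_a = rng.normal(0.00, 0.03, size=40)   # synthetic model errors, regime A
        errors_region_b = rng.normal(0.02, 0.05, size=35)   # synthetic model errors, regime B

        # (i) Non-parametric pdf of the pooled error sample (stand-in for the paper's estimator).
        kde = stats.gaussian_kde(np.concatenate([errors_region_a, errors_region_b]))
        grid = np.linspace(-0.15, 0.15, 5)
        print("estimated pdf on grid:", np.round(kde(grid), 3))

        # (ii) Kruskal-Wallis test: a small p-value suggests the two regimes should keep
        # separate uncertainty distributions instead of being pooled into one cluster.
        stat, p_value = stats.kruskal(errors_region_a, errors_region_b)
        print(f"Kruskal-Wallis H = {stat:.2f}, p = {p_value:.3f}")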

  12. Benefit quantification of interoperability in coordinate metrology

    DEFF Research Database (Denmark)

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

    One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software...

  13. Quantification of the detriment and comparison of health risks. Methodological problems

    International Nuclear Information System (INIS)

    Jammet, H.

    1982-01-01

    Some of the methodological problems involved in the quantitative estimate of the health detriment of different energy sources and in risk comparison are described. First, the question of determining the detriment is discussed from the point of view of the distortions introduced in the quantification when dealing with risks for which the amount of information available varies widely. The main criteria applied to classifying types of detriment are then recalled. Finally, the problems involved in comparisons are outlined: spatial and temporal variations in the types of detriment, operation under normal and accident conditions, and the risks to the public and workers. (author)

  14. Costs and benefits of sulphur oxide control: a methodological study

    Energy Technology Data Exchange (ETDEWEB)

    1981-01-01

    The objective is to present for the first time a methodology for estimating the costs and benefits of SOx control strategies as an aid to policy formulation which could create the basis for further action in member countries. To illustrate the methodology, different control scenarios for Western Europe are developed and analyzed using the cost-benefit approach, and some preliminary conclusions are drawn. The next step assesses the impact of the emissions on ambient air quality, calculated with the aid of long-range and urban air quality models. Finally, the impact of the calculated concentrations of SOx in the different scenarios on a number of environmental and human assets - materials, agricultural crops, health, and aquatic ecosystems - is estimated in order to have a measure of the benefits of control.

  15. Towards an uncertainty quantification methodology with CASMO-5

    International Nuclear Information System (INIS)

    Wieselquist, W.; Vasiliev, A.; Ferroukhi, H.

    2011-01-01

    We present the development of an uncertainty quantification (UQ) methodology for the CASMO-5 lattice physics code, used extensively at the Paul Scherrer Institut for standalone neutronics calculations, as well as the generation of nuclear fuel segment libraries for the downstream core simulator, SIMULATE-3. We focus here on propagation of nuclear data uncertainties and describe the framework required for 'black box' UQ--in this case minor modifications of the code are necessary to allow perturbation of the CASMO-5 nuclear data library. We then implement a basic first-order UQ method, direct perturbation, which directly produces sensitivity coefficients and, when folded with the input nuclear data variance-covariance matrix (VCM), yields output uncertainties in the form of an output VCM. We discuss the implementation, including how to map the VCMs of a different group structure to the code library group structure (in our case the ENDF/B-VII-based 586-group library in CASMO-5), present some results for pin cell calculations, and conclude with future work. (author)
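
    The first-order propagation implied by folding sensitivity coefficients with the input VCM is the "sandwich" rule V_out = S V_in S^T. The sketch below uses invented dimensions and numbers purely to show the mechanics; it is not tied to the CASMO-5 implementation.

        # First-order "sandwich" propagation: V_out = S * V_in * S^T (illustrative numbers).
        import numpy as np

        S = np.array([[0.8, -0.1, 0.05],          # d(output_1)/d(input_j), from direct perturbation
                      [0.2,  0.6, -0.30]])        # d(output_2)/d(input_j)
        V_in = np.diag([0.02, 0.01, 0.05]) ** 2   # input VCM (uncorrelated here, for simplicity)

        V_out = S @ V_in @ S.T                    # output variance-covariance matrix
        std_out = np.sqrt(np.diag(V_out))         # 1-sigma uncertainties of the outputs
        print("output VCM:\n", V_out)
        print("output standard deviations:", std_out)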

  16. Lung involvement quantification in chest radiographs; Quantificacao de comprometimento pulmonar em radiografias de torax

    Energy Technology Data Exchange (ETDEWEB)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M., E-mail: giacomini@ibb.unesp.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2014-12-15

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for the quantification of chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for the quantification of lung damage caused by TB through chest radiographs. An algorithm for the computational processing of the examinations was developed in Matlab, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit for the patient and cost-benefit ratio for the institution. (author)

  17. A proposed approach to backfit decision-making using risk assessment and benefit-cost methodology

    International Nuclear Information System (INIS)

    O'Donnell, E.P.; Raney, T.J.

    1984-01-01

    This paper outlines a proposed approach to backfit decision-making which utilizes quantitative risk assessment techniques, benefit-cost methodology and decision criteria. In general terms, it is structured to provide an objective framework for decision-making aimed at ensuring a positive return on backfit investment while allowing for the inclusion of subjective value judgments by the decision-maker. The distributions of the independent variables are combined to arrive at an overall probability distribution for the benefit-cost ratio. In this way, the decision-maker can explicitly establish the probability or level of confidence that a particular backfit will yield benefits in excess of cost. An example is presented demonstrating the application of the methodology to a specific plant backfit. (orig.)
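
    The idea of combining distributions of the independent variables into a probability distribution of the benefit-cost ratio is straightforward to sketch with Monte Carlo sampling. The distributions below are illustrative assumptions, not the paper's data.

        # Monte Carlo distribution of the benefit-cost ratio (all distributions assumed).
        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        averted_dose = rng.lognormal(mean=np.log(50), sigma=0.5, size=n)     # person-rem averted
        value_per_person_rem = rng.triangular(1_000, 2_000, 5_000, size=n)   # $/person-rem
        backfit_cost = rng.normal(150_000, 30_000, size=n)                   # $

        bcr = (averted_dose * value_per_person_rem) / backfit_cost
        print(f"median benefit-cost ratio: {np.median(bcr):.2f}")
        print(f"confidence that benefits exceed cost: {np.mean(bcr > 1.0):.1%}")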

  18. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    International Nuclear Information System (INIS)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    Highlights: ► Survey of bio-analytical approaches utilizing biomolecule labelling. ► Detailed discussion of methodology and chemistry of elemental labelling. ► Biomedical and bio-analytical applications of elemental labelling. ► FI-ICP-MS and LC–ICP-MS for quantification of elemental labelled biomolecules. ► Review of selected applications. - Abstract: This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS), and employed for quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. In the first sections an overview on general aspects of biomolecule quantification, as well as of labelling, will be presented, emphasizing the potential which lies in such methodological approaches. In this context, ICP-MS as detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). Fundamental methodology of elemental labelling will be highlighted and analytical, as well as biomedical, applications will be presented. A special focus will lie on established applications underlining benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research made in this field will be summarized and a perspective for future developments, including sophisticated and innovative applications, will be given.

  19. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  20. A methodology for estimating health benefits of electricity generation using renewable technologies.

    Science.gov (United States)

    Partridge, Ian; Gamkhar, Shama

    2012-02-01

    At Copenhagen, the developed countries agreed to provide up to $100 bn per year to finance climate change mitigation and adaptation by developing countries. Projects aimed at cutting greenhouse gas (GHG) emissions will need to be evaluated against dual criteria: from the viewpoint of the developed countries they must cut emissions of GHGs at reasonable cost, while host countries will assess their contribution to development, or simply their overall economic benefits. Co-benefits of some types of project will also be of interest to host countries: for example, some projects will contribute to reducing air pollution, thus improving the health of the local population. This paper uses a simple damage function methodology to quantify some of the health co-benefits of replacing coal-fired generation with wind or small hydro in China. We estimate the monetary value of these co-benefits and find that it is probably small compared to the added costs. We have not made a full cost-benefit analysis of renewable energy in China, as some likely co-benefits are omitted from our calculations. Our results are subject to considerable uncertainty; however, after careful consideration of their likely accuracy and comparisons with other studies, we believe that they provide a good first-cut estimate of co-benefits and are sufficiently robust to stand as a guide for policy makers. In addition to these empirical results, a key contribution made by the paper is to demonstrate a simple and reasonably accurate methodology for health benefits estimation that applies the most recent academic research in the field to the solution of an increasingly important problem. Copyright © 2011 Elsevier Ltd. All rights reserved.
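
    A damage-function chain of the general kind described above runs from avoided emissions to a change in ambient concentration, to attributable deaths avoided, to a monetary value. The sketch below shows that chain with entirely hypothetical coefficients; it is not the paper's calibration for China.

        # Hypothetical damage-function chain: emissions -> concentration -> deaths -> dollars.
        avoided_generation_mwh = 1.0e6      # coal generation displaced per year, assumed
        conc_change_per_mwh = 5.0e-8        # population-weighted PM2.5 change per MWh (ug/m3), assumed
        population = 5.0e7                  # exposed population, assumed
        baseline_mortality = 0.007          # deaths per person-year, assumed
        beta = 0.006                        # fractional mortality increase per ug/m3 PM2.5, assumed
        value_per_death_avoided = 2.5e5     # USD per death avoided, assumed

        delta_c = avoided_generation_mwh * conc_change_per_mwh              # ug/m3
        deaths_avoided = population * baseline_mortality * beta * delta_c
        benefit_usd = deaths_avoided * value_per_death_avoided
        print(f"deaths avoided per year ~ {deaths_avoided:.0f}")
        print(f"monetised health co-benefit ~ ${benefit_usd:,.0f} per year")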

  1. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    Science.gov (United States)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS), and employed for quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. In the first sections an overview on general aspects of biomolecule quantification, as well as of labelling, will be presented, emphasizing the potential which lies in such methodological approaches. In this context, ICP-MS as detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). Fundamental methodology of elemental labelling will be highlighted and analytical, as well as biomedical, applications will be presented. A special focus will lie on established applications underlining benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research made in this field will be summarized and a perspective for future developments, including sophisticated and innovative applications, will be given. PMID:23062431

  2. Optimization Of Methodological Support Of Application Tax Benefits In Regions: Practice Of Perm Region

    Directory of Open Access Journals (Sweden)

    Alexandr Ivanovich Tatarkin

    2015-03-01

    In the article, the problem of the methodological support of the application of regional tax benefits is reviewed. The method of tax benefit assessment accepted in Perm Region was chosen as the object of analysis because the relatively long period of application of the benefits has allowed a sufficient statistical base to be built. The reliability of the budgetary, economic, investment, and social effectiveness assessments of applying benefits, based on the Method, is investigated, and suggestions for its improvement are formulated.

  3. Application of a Bayesian model for the quantification of the European methodology for qualification of non-destructive testing

    International Nuclear Information System (INIS)

    Gandossi, Luca; Simola, Kaisa; Shepherd, Barrie

    2010-01-01

    The European methodology for qualification of non-destructive testing is a well-established approach adopted by nuclear utilities in many European countries. According to this methodology, qualification is based on a combination of technical justification and practical trials. The methodology is qualitative in nature, and it does not give explicit guidance on how the evidence from the technical justification and results from trials should be weighted. A Bayesian model for the quantification process was presented in a previous paper, proposing a way to combine the 'soft' evidence contained in a technical justification with the 'hard' evidence obtained from practical trials. This paper describes the results of a pilot study in which such a Bayesian model was applied to two realistic Qualification Dossiers by experienced NDT qualification specialists. At the end of the study, recommendations were made and a set of guidelines was developed for the application of the Bayesian model.
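
    One simple way to picture the kind of Bayesian combination described above is a Beta-Binomial update: the technical justification is encoded as a Beta prior on the probability of detection (POD), and the practical-trial outcomes update it to a posterior. This is a hedged illustration of the general idea, not the model used in the pilot study, and the numbers are invented.

        # Beta-Binomial illustration: prior from the technical justification, updated by trials.
        from scipy import stats

        alpha_prior, beta_prior = 18.0, 2.0      # prior belief: POD around 0.9, ~20 pseudo-observations
        detected, missed = 28, 2                 # hypothetical practical-trial outcome

        posterior = stats.beta(alpha_prior + detected, beta_prior + missed)
        print(f"posterior mean POD = {posterior.mean():.3f}")
        lower, upper = posterior.interval(0.95)
        print(f"95% credible interval = ({lower:.3f}, {upper:.3f})")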

  4. ALARA cost/benefit analysis at Union Electric company using the ARP/AI methodology

    International Nuclear Information System (INIS)

    Williams, M.C.

    1987-01-01

    This paper describes the development of a specific method for the justification of expenditures associated with reducing occupational radiation exposure to as low as reasonably achievable (ALARA). This methodology is based on the concepts of the Apparent Reduction Potential (ARP) and Achievability Index (AI) as described in NUREG/CR-0446, Union Electric's corporate planning model and the EPRI model for dose rate buildup with reactor operating life. The ARP provides a screening test to determine if there is a need for ALARA expenditures based on actual or predicted exposure rates and/or dose experience. The AI is a means of assessing all costs and all benefits, even though they are expressed in different units of measurement such as person-rem and dollars, to determine if ALARA expenditures are justified and their value. This method of cost/benefit analysis can be applied by any company or organization utilizing site-specific exposure and dose rate data, and incorporating consideration of administrative exposure controls which may vary from organization to organization. Specific example cases are presented and compared to other methodologies for ALARA cost/benefit analysis.
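
    An ALARA cost-benefit screen of the general type discussed above compares the monetised value of the collective dose averted with the cost of the proposed modification. The sketch below uses an assumed alpha value (dollars per person-rem) and invented plant data; it is not the ARP/AI procedure itself.

        # Screening-level ALARA cost-benefit check with assumed values.
        alpha_usd_per_person_rem = 2_000.0      # monetary value assigned to one person-rem averted, assumed
        collective_dose_per_yr = 12.0           # person-rem/yr attributable to the task or area, assumed
        dose_reduction_fraction = 0.40          # expected effect of the proposed modification, assumed
        remaining_plant_life_yr = 15
        expenditure_usd = 90_000.0

        averted_dose = collective_dose_per_yr * dose_reduction_fraction * remaining_plant_life_yr
        benefit_usd = averted_dose * alpha_usd_per_person_rem
        ratio = benefit_usd / expenditure_usd
        print(f"averted collective dose: {averted_dose:.0f} person-rem")
        print(f"benefit/cost ratio: {ratio:.2f} ({'justified' if ratio > 1 else 'not justified'} on this screen)")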

  5. Quantification of flood risk mitigation benefits: A building-scale damage assessment through the RASOR platform.

    Science.gov (United States)

    Arrighi, Chiara; Rossi, Lauro; Trasforini, Eva; Rudari, Roberto; Ferraris, Luca; Brugioni, Marcello; Franceschini, Serena; Castelli, Fabio

    2018-02-01

    Flood risk mitigation usually requires a significant investment of public resources and cost-effectiveness should be ensured. The assessment of the benefits of hydraulic works requires the quantification of (i) flood risk in the absence of measures, (ii) risk in the presence of mitigation works, and (iii) the investments needed to achieve an acceptable residual risk. In this work a building scale is adopted to estimate direct tangible flood losses to several building classes (e.g. residential, industrial, commercial, etc.) and their respective contents, exploiting various sources of public open data in a GIS environment. The impact simulations for assigned flood hazard scenarios are computed through the RASOR platform, which allows for an extensive characterization of the properties and their vulnerability through libraries of stage-damage curves. Recovery and replacement costs are estimated based on insurance data, market values and socio-economic proxies. The methodology is applied to the case study of Florence (Italy), where a system of retention basins upstream of the city is under construction to reduce flood risk. Current flood risk in the study area (70 km2) is about 170 million euros per year, without accounting for people, infrastructure, cultural heritage and vehicles at risk. The monetary investment in the retention basins is paid off in about 5 years. However, the results show that although hydraulic works are cost-effective, a significant residual risk has to be managed and the achievement of the desired level of acceptable risk would require about 1 billion euros of investments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

    Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP), but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction and extrapolation forecasting are conducted largely in the context of a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry with reference to South Africa. The theory and methodology of quantification are briefly reviewed and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience - often personal - of the researcher and this scholarly endeavour is, therefore, not inclusive of all possible identified indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourist industry.

  7. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    International Nuclear Information System (INIS)

    Seebauer, Matthias

    2014-01-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate existing GHG quantification tools to comprehensively quantify GHG emissions and removals in smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was exercised using the VCS SALM methodology complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers adopted sustainable land management practices (SALM). The results demonstrate the variation in both the magnitude of the estimated GHG emissions per ha between different smallholder farm typologies and the emissions estimated by applying two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reduction and removals, and the mitigation benefits range between 4 and 6.5 tCO₂ ha⁻¹ yr⁻¹, with significantly different mitigation benefits depending on the typologies of the crop–livestock systems, their different agricultural practices, as well as adoption rates of improved practices. However, the inherent uncertainty related to the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty related to activity data, the assessment confirms the high variability within different farm types as well as between different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms. (paper)

  8. Design of an Integrated Methodology for Analytical Design of Complex Supply Chains

    Directory of Open Access Journals (Sweden)

    Shahid Rashid

    2012-01-01

    A literature review and gap analysis identifies key limitations of industry best practice when modelling supply chains. To address these limitations the paper reports on the conception and development of an integrated modelling methodology designed to underpin the analytical design of complex supply chains. The methodology is based upon a systematic deployment of EM, CLD, and SM techniques, the integration of which is achieved via common modelling concepts and decomposition principles. Thereby the methodology facilitates: (i) graphical representation and description of key "processing", "resourcing" and "work flow" properties of supply chain configurations; (ii) behavioural exploration of currently configured supply chains, to facilitate reasoning about uncertain demand impacts on supply, make, delivery, and return processes; (iii) predictive quantification of the relative performances of alternative complex supply chain configurations, including risk assessments. Guidelines for the application of each step of the methodology are described. Also described are recommended data collection methods and expected modelling outcomes for each step. The methodology is being extensively case tested to quantify potential benefits and costs relative to current best industry practice. The paper reflects on preliminary benefits gained during industry-based case study modelling and identifies areas of potential improvement.

  9. Validation and evaluation of an HPLC methodology for the quantification of the potent antimitotic compound (+)-discodermolide in the Caribbean marine sponge Discodermia dissoluta.

    Science.gov (United States)

    Valderrama, Katherine; Castellanos, Leonardo; Zea, Sven

    2010-08-01

    The sponge Discodermia dissoluta is the source of the potent antimitotic compound (+)-discodermolide. The relatively abundant and shallow populations of this sponge in Santa Marta, Colombia, allow for studies to evaluate the natural and biotechnological supply options of (+)-discodermolide. In this work, an RP-HPLC-UV methodology for the quantification of (+)-discodermolide in sponge samples was tested and validated. Our protocol for extracting this compound from the sponge included lyophilization, exhaustive methanol extraction, partitioning using water and dichloromethane, purification of the organic fraction on RP-18 cartridges and finally retrieving the (+)-discodermolide in the methanol-water (80:20 v/v) fraction. This fraction was injected into an HPLC system with an Xterra RP-18 column and a detection wavelength of 235 nm. The calibration curve was linear, making it possible to calculate the limits of detection and quantification in these experiments. The intra-day and inter-day precision showed relative standard deviations lower than 5%. The accuracy, determined as the percentage recovery, was 99.4%. Nine samples of the sponge from the Bahamas, Bonaire, Curaçao and Santa Marta had concentrations of (+)-discodermolide ranging from 5.3 to 29.3 µg/g of wet sponge. This methodology is quick and simple, allowing for quantification in sponges from natural environments, in situ cultures or dissociated cells.
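
    The validation figures reported above (calibration linearity, limits of detection and quantification, recovery, and precision as relative standard deviation) follow standard formulas. The sketch below computes them from placeholder calibration and recovery data; the numbers are not those of the study.

        # Standard validation figures from placeholder calibration and recovery data.
        import numpy as np

        conc = np.array([1.0, 5.0, 10.0, 20.0, 40.0])              # standards (ug/mL), assumed
        area = np.array([210.0, 1030.0, 2110.0, 4190.0, 8350.0])   # peak areas, assumed

        slope, intercept = np.polyfit(conc, area, 1)
        residuals = area - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)            # standard error of the regression
        lod = 3.3 * sigma / slope                # limit of detection
        loq = 10.0 * sigma / slope               # limit of quantification

        spiked_known, spiked_found = 10.0, np.array([9.8, 10.1, 9.9])
        recovery = spiked_found.mean() / spiked_known * 100.0
        rsd = spiked_found.std(ddof=1) / spiked_found.mean() * 100.0
        print(f"slope = {slope:.1f}, LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
        print(f"recovery = {recovery:.1f}%, intra-day RSD = {rsd:.2f}%")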

  10. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    OpenAIRE

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  11. A refined methodology for modeling volume quantification performance in CT

    Science.gov (United States)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.

  12. Co-benefits of climate mitigation: Counting statistical lives or life-years?

    DEFF Research Database (Denmark)

    Andersen, Mikael Skou

    2017-01-01

    Making up for air pollution related mortality and accounting for the number of deaths has become an important environmental indicator in its own right, but differences across the Atlantic over how to account for these are making it difficult to find common ground in climate policy appraisals, where co-benefits from reducing the air pollution of fossil fuels are to be factored in. This article revisits established quantification methodologies for air pollution related mortality applied by government agencies in the USA and the EU. Demographic lifetables are applied to explore uncertainties over latency.... With a common OECD base value approach, the air pollution costs related to fossil fuels are found to be about 3 times lower with the EU versus the US methodology....
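
    The lifetable logic behind counting life-years rather than statistical lives can be illustrated in a few lines: the same attributable deaths are weighted by remaining life expectancy at the age of death. The abridged lifetable and death counts below are invented for illustration only.

        # Statistical lives vs life-years lost, using an invented abridged lifetable.
        remaining_life_expectancy = {65: 19.0, 75: 11.5, 85: 6.0}   # years, by age at death, assumed
        attributable_deaths = {65: 10, 75: 25, 85: 15}              # air-pollution-related deaths, assumed

        lives = sum(attributable_deaths.values())
        life_years = sum(attributable_deaths[age] * remaining_life_expectancy[age]
                         for age in attributable_deaths)
        print(f"statistical lives lost: {lives}")
        print(f"life-years lost: {life_years:.0f} (average {life_years / lives:.1f} years per death)")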

  13. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program

  14. Standardization of a PIGE methodology for simultaneous quantification of low Z elements in barium borosilicate glass samples

    International Nuclear Information System (INIS)

    Chhillar, S.; Acharya, R.; Dasari, K.B.; Pujari, P.K.; Mishra, R.K.; Kaushik, C.P.

    2013-01-01

    In order to standardize the particle induced gamma-ray emission (PIGE) methodology for simultaneous quantification of light elements, the analytical sensitivities of Li, F, B, Na, Al and Si were evaluated using a 4 MeV proton beam (~10 nA current) from the 3 MV Pelletron at IOP, Bhubaneswar. The PIGE method was validated by determining all six elements in a synthetic sample in a graphite matrix and applied to two barium borosilicate glass (BaBSG) samples. The prompt γ-rays emitted from inelastic scattering or nuclear reactions of the corresponding isotopes were measured using a 60% HPGe detector coupled to an MCA, and the current-normalized count rates were used for the concentration calculation. (author)

  15. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Yao, Juan; He, Hua; Glantz, Clifford S.; Booth, Alexander E.

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  16. [Early post-partum discharges: benefits, disadvantages and implementation methodology].

    Science.gov (United States)

    Berkane, N

    2015-02-01

    Early post-partum discharges (EPD) are a hot topic. Already widely practised in many European countries, this procedure was promoted by the government for a decade, requested by representatives of midwife organisations, desired by some patients, but also criticized by the Academy of Medicine. Well organized and with obligatory control and follow-up, EPD could help with the management of the shortage of maternity beds and hence increase the satisfaction of the patients. The procedure could even be a way to effectively implement a town-hospital network, something which has many other benefits. However, this procedure is not without potential dangers: lower quality of care, financial risks for the department, and especially a significant increase in the workload of the hospital staff. The main objective of this paper is to detail the benefits and disadvantages of EPD for maternity units and to propose an organizational basis if EPD is the procedure of choice. A participatory methodology is essential when introducing this procedure, due to the important participation of the different categories of staff involved in hospital discharge (administrative, medical and paramedical staff) and to exclude complications when certain specifications are not followed or misunderstood. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  17. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    The in vivo quantification of metabolites' concentrations, revealed in magnetic resonance spectroscopy (MRS) spectra, constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), with good results have been presented lately but have shown several drawbacks regarding their quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, which is solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied on artificial data. Moreover, the introduced quantification technique deals with metabolite peaks' overlapping, a considerably difficult situation occurring under real conditions. Appropriate experiments have proved the efficiency of the introduced methodology on artificial MRS data, establishing it as a generic metabolite quantification procedure.

  18. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)

    Papakostas, G A; Mertzios, B G; Karras, D A; Van Ormondt, D; Graveron-Demilly, D

    2011-01-01

    The in vivo quantification of metabolites' concentrations, revealed in magnetic resonance spectroscopy (MRS) spectra, constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), with good results have been presented lately but have shown several drawbacks regarding their quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, which is solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied on artificial data. Moreover, the introduced quantification technique deals with metabolite peaks' overlapping, a considerably difficult situation occurring under real conditions. Appropriate experiments have proved the efficiency of the introduced methodology on artificial MRS data, establishing it as a generic metabolite quantification procedure.

  19. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  20. A systematic methodology for the robust quantification of energy efficiency at wastewater treatment plants featuring Data Envelopment Analysis.

    Science.gov (United States)

    Longo, S; Hospido, A; Lema, J M; Mauricio-Iglesias, M

    2018-05-10

    This article examines the potential benefits of using Data Envelopment Analysis (DEA) for conducting energy-efficiency assessments of wastewater treatment plants (WWTPs). WWTPs are characteristically heterogeneous (in size, technology, climate, function …), which limits the correct application of DEA. This paper proposes and describes, in its various stages, the Robust Energy Efficiency DEA (REED), a systematic state-of-the-art methodology aimed at including exogenous variables in nonparametric frontier models and especially designed for WWTP operation. In particular, the methodology systematizes the modelling process by presenting an integrated framework for selecting the correct variables and appropriate models, possibly tackling the effect of exogenous factors. As a result, the application of REED improves the quality of the efficiency estimates and hence the significance of benchmarking. For the reader's convenience, this article is presented as a step-by-step guide to the determination of WWTP energy efficiency from beginning to end. The application and benefits of the developed methodology are demonstrated by a case study comparing the energy efficiency of a set of 399 WWTPs operating in different countries and under heterogeneous environmental conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
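
    A basic input-oriented, constant-returns-to-scale DEA model of the kind REED builds on can be solved as a small linear programme. The sketch below uses scipy.optimize.linprog with invented plant data (one energy input, two service outputs); it is a generic illustration, not the REED procedure itself.

        # Input-oriented CRS DEA efficiency for each plant, via a small linear programme.
        import numpy as np
        from scipy.optimize import linprog

        X = np.array([[0.45], [0.30], [0.60], [0.38]])                      # input: kWh per m3 treated, assumed
        Y = np.array([[1.0, 0.80], [1.0, 0.90], [1.0, 0.70], [1.0, 0.85]])  # outputs per plant, assumed

        def dea_efficiency(o, X, Y):
            """Minimise theta s.t. sum_j lam_j*x_j <= theta*x_o and sum_j lam_j*y_j >= y_o."""
            n, m = X.shape
            s = Y.shape[1]
            c = np.zeros(n + 1)
            c[0] = 1.0                               # decision vector: [theta, lam_1 .. lam_n]
            A_ub, b_ub = [], []
            for i in range(m):                       # input constraints
                A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
                b_ub.append(0.0)
            for r in range(s):                       # output constraints
                A_ub.append(np.concatenate(([0.0], -Y[:, r])))
                b_ub.append(-Y[o, r])
            bounds = [(0.0, None)] * (n + 1)
            res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds, method="highs")
            return res.x[0]

        for o in range(X.shape[0]):
            print(f"plant {o}: DEA efficiency = {dea_efficiency(o, X, Y):.2f}")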

  1. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic, because computer accuracy is limited. Inaccuracy can be due to different aspects. For example, an error may be made when subtracting two numbers that are very close to each other, or in the process of summation of many very different numbers, etc. The basic objective of this paper is to find a procedure which eliminates errors made by a PC when calculations close to an error limit are executed. The highly reliable system is represented by the use of a directed acyclic graph which is composed of terminal nodes, i.e. highly reliable input elements, internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for the exact unavailability calculation of terminal nodes is based on the merits of MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to a graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires summation of a lot of very different non-negative numbers, which may be a source of inaccuracy. That is why another algorithm for exact summation of such numbers is designed in the paper. The summation procedure uses the benefits of a special number system with the base represented by the value 2^32. The computational efficiency of the new computing methodology is compared with advanced simulation software. Various calculations on systems from references are performed to emphasize the merits of the methodology.
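
    The summation problem described above is easy to reproduce: adding many non-negative numbers of very different magnitude in ordinary floating point silently drops the small terms. The sketch below contrasts naive float summation with exact accumulation, using Python's Fraction merely as a stand-in for the paper's base-2^32 number system.

        # Losing small terms in naive float summation vs exact accumulation.
        from fractions import Fraction
        import math

        terms = [1.0] + [1e-16] * 100_000                # one large term plus many tiny ones

        naive = sum(terms)                               # plain float accumulation: tiny terms vanish
        exact = float(sum(Fraction(t) for t in terms))   # exact rational accumulation
        compensated = math.fsum(terms)                   # error-compensated float summation

        print(f"naive float sum : {naive:.17f}")
        print(f"exact sum       : {exact:.17f}")
        print(f"math.fsum       : {compensated:.17f}")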

  2. Methodology for quantification of waste generated in Spanish railway construction works

    International Nuclear Information System (INIS)

    Guzmán Báez, Ana de; Villoria Sáez, Paola; Río Merino, Mercedes del; García Navarro, Justo

    2012-01-01

    Highlights: ► Two equations for C and D waste estimation in railway construction works are developed. ► Mixed C and D waste is the most generated category during railway construction works. ► Tunnel construction is essential to quantify the waste generated during the works. ► There is a relationship between C and D waste generated and railway functional units. ► The methodology proposed can be used to obtain new constants for other areas. - Abstract: In the last years, the European Union (EU) has been focused on the reduction of construction and demolition (C and D) waste. Specifically, in 2006, Spain generated roughly 47 million tons of C and D waste, of which only 13.6% was recycled. This situation has led to the drawing up of many regulations on C and D waste during the past years, forcing EU countries to include new measures for waste prevention and recycling. Among these measures, quantification of the C and D waste expected to be generated during a construction project is mandated. However, limited data are available on civil engineering projects. Therefore, the aim of this research study is to improve C and D waste management in railway projects by developing a model for C and D waste quantification. For this purpose, we develop two equations which estimate in advance the amount, both in weight and volume, of the C and D waste likely to be generated in railway construction projects, including the category of C and D waste generated for the entire project.

  3. Quantification of lung fibrosis and emphysema in mice using automated micro-computed tomography.

    Directory of Open Access Journals (Sweden)

    Ellen De Langhe

    Full Text Available BACKGROUND: In vivo high-resolution micro-computed tomography allows for longitudinal image-based measurements in animal models of lung disease. The combination of repetitive high-resolution imaging with fully automated quantitative image analysis in mouse models of lung fibrosis benefits preclinical research. This study aimed to develop and validate such an automated micro-computed tomography analysis algorithm for quantification of aerated lung volume in mice, an indicator of pulmonary fibrosis and emphysema severity. METHODOLOGY: Mice received an intratracheal instillation of bleomycin (n = 8), elastase (0.25 U elastase, n = 9; 0.5 U elastase, n = 8) or saline control (n = 6 for fibrosis, n = 5 for emphysema). A subset of mice was scanned without intervention, to evaluate potential radiation-induced toxicity (n = 4). Some bleomycin-instilled mice were treated with imatinib for proof of concept (n = 8). Mice were scanned weekly, until four weeks after induction, when they underwent pulmonary function testing, lung histology and collagen quantification. Aerated lung volumes were calculated with our automated algorithm. PRINCIPAL FINDINGS: Our automated image-based aerated lung volume quantification method is reproducible with low intra-subject variability. Bleomycin-treated mice had significantly lower scan-derived aerated lung volumes compared to controls. Aerated lung volume correlated with the histopathological fibrosis score and total lung collagen content. Inversely, a dose-dependent increase in lung volume was observed in elastase-treated mice. Serial scanning of individual mice is feasible and visualized dynamic disease progression. No radiation-induced toxicity was observed. Three-dimensional images provided critical topographical information. CONCLUSIONS: We report on a high-resolution in vivo micro-computed tomography image analysis algorithm that runs fully automated and allows quantification of aerated lung volume in mice. This
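
    Aerated lung volume is typically obtained by thresholding the reconstructed scan inside a lung segmentation and converting the voxel count into a physical volume. The NumPy sketch below illustrates that general idea only; the intensity thresholds, the pre-existing lung mask and the voxel size are assumptions for illustration and do not reproduce the authors' validated algorithm.

```python
import numpy as np

# Minimal sketch (not the authors' algorithm): estimate aerated lung volume
# from a micro-CT image by counting voxels whose intensity falls in an
# "aerated" range within a lung mask and multiplying by the voxel volume.

def aerated_lung_volume(hu, lung_mask, voxel_volume_mm3,
                        lower_hu=-1000.0, upper_hu=-200.0):
    """Return aerated lung volume in mm^3 inside the lung mask."""
    aerated = (hu >= lower_hu) & (hu <= upper_hu) & lung_mask
    return aerated.sum() * voxel_volume_mm3

# Example with synthetic data: a 100x100x100 scan with 0.05 mm isotropic voxels.
hu = np.random.uniform(-1000, 100, size=(100, 100, 100))
mask = np.ones_like(hu, dtype=bool)
print(aerated_lung_volume(hu, mask, voxel_volume_mm3=0.05**3))
```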

  4. Quantification of the least limiting water range in an oxisol using two methodological strategies

    Directory of Open Access Journals (Sweden)

    Wagner Henrique Moreira

    2014-12-01

    Full Text Available The least limiting water range (LLWR) has been used as an indicator of soil physical quality as it represents, in a single parameter, the soil physical properties directly linked to plant growth, with the exception of temperature. The usual procedure for obtaining the LLWR involves determination of the water retention curve (WRC) and the soil resistance to penetration curve (SRC) in soil samples with undisturbed structure in the laboratory. Determination of the WRC and SRC using field measurements (in situ) is preferable, but requires appropriate instrumentation. The objective of this study was to determine the LLWR from the data collected for determination of the WRC and SRC in situ using portable electronic instruments, and to compare those determinations with the ones made in the laboratory. Samples were taken from the 0.0-0.1 m layer of a Latossolo Vermelho distrófico (Oxisol). Two methods were used for quantification of the LLWR: the traditional, with measurements made in soil samples with undisturbed structure; and in situ, with measurements of water content (θ), soil water potential (Ψ), and soil resistance to penetration (SR) through the use of sensors. The in situ measurements of θ, Ψ and SR were taken over a period of four days of soil drying. At the same time, samples with undisturbed structure were collected for determination of bulk density (BD). Due to the limitations of measurement of Ψ by tensiometer, additional determinations of θ were made with a psychrometer (in the laboratory) at the Ψ of -1500 kPa. The results show that it is possible to determine the LLWR by the θ, Ψ and SR measurements using the suggested approach and instrumentation. The quality of fit of the SRC was similar in both strategies. In contrast, the θ and Ψ in situ measurements, associated with those measured with a psychrometer, produced a better WRC description. The estimates of the LLWR were similar in both methodological strategies. The quantification of

  5. Analytical methodologies for the determination of nutraceuticals in foods

    International Nuclear Information System (INIS)

    Rosanna, Gatti; Domenica Masci

    2015-01-01

    The term nutraceutical was coined almost thirty years ago (Stephen De Felice, 1989) from the union of the two terms nutrition and pharmaceutical. According to the definition, 'nutraceutical' refers to 'any substance that can be considered a food (or part of a food) and which provides medical or health benefits, including the prevention and/or the treatment of a disease'. At the ENEA Casaccia Research Centre, analytical methodologies are developed and validated for the detection and quantification of nutraceutical substances. The aim is to characterize certain cultivars in relation to genotype, geographical production area and cultivation practices, or to assess how the content is affected by the conservation and transport techniques applied to raw materials and processed products. [it

  6. Methodology and applications for the benefit cost analysis of the seismic risk reduction in building portfolios at broadscale

    OpenAIRE

    Valcarcel, Jairo A.; Mora, Miguel G.; Cardona, Omar D.; Pujades, Lluis G.; Barbat, Alex H.; Bernal, Gabriel A.

    2013-01-01

    This article presents a methodology for estimating the benefit-cost ratio of seismic risk reduction in building portfolios at broad scale, for a world region, allowing comparison of the results obtained for the countries belonging to that region. This methodology encompasses (1) the generation of a set of random seismic events and the evaluation of the spectral accelerations at the buildings' locations; (2) the estimation of the buildings' built area, the economic value, as well as the cla...

  7. A methodology for direct quantification of over-ranging length in helical computed tomography with real-time dosimetry.

    Science.gov (United States)

    Tien, Christopher J; Winslow, James F; Hintenlang, David E

    2011-01-31

    In helical computed tomography (CT), reconstruction information from volumes adjacent to the clinical volume of interest (VOI) is required for proper reconstruction. Previous studies have relied upon either operator console readings or indirect extrapolation of measurements in order to determine the over-ranging length of a scan. This paper presents a methodology for the direct quantification of over-ranging dose contributions using real-time dosimetry. A Siemens SOMATOM Sensation 16 multislice helical CT scanner is used with a novel real-time "point" fiber-optic dosimeter system with 10 ms temporal resolution to measure over-ranging length, which is also expressed as dose-length-product (DLP). Film was used to benchmark the exact length of over-ranging. Over-ranging length varied from 4.38 cm at a pitch of 0.5 to 6.72 cm at a pitch of 1.5, which corresponds to a DLP of 131 to 202 mGy-cm. The dose-extrapolation method of Van der Molen et al. yielded results within 3%, while the console reading method of Tzedakis et al. yielded consistently larger over-ranging lengths. From film measurements, it was determined that Tzedakis et al. overestimated over-ranging lengths by one-half of the beam collimation width. Over-ranging length measured as a function of reconstruction slice thickness produced two linear regions similar to previous publications. Over-ranging is quantified with both absolute length and DLP, which contributes about 60 mGy-cm, or about 10% of the DLP, for a routine abdominal scan. This paper presents a direct physical measurement of over-ranging length within 10% of previous methodologies. Current uncertainties are less than 1%, in comparison with 5% in other methodologies. Clinical implementation can be simplified by using only one dosimeter if codependence with console readings is acceptable, with an uncertainty of 1.1%. This methodology will be applied to different vendors, models, and postprocessing methods--which have been shown to produce over-ranging lengths
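
    Converting an over-ranging length into its DLP contribution is a simple product with the volume CT dose index. The sketch below shows that arithmetic; the CTDIvol value is an assumption chosen so the output echoes the figures quoted above, not a measurement from the paper.

```python
# Minimal sketch of expressing over-ranging as a dose-length-product (DLP)
# contribution. The CTDIvol value below is an illustrative assumption.

def overranging_dlp(ctdi_vol_mGy, overranging_length_cm):
    """DLP contribution (mGy-cm) of the over-ranging region."""
    return ctdi_vol_mGy * overranging_length_cm

dlp = overranging_dlp(ctdi_vol_mGy=30.0, overranging_length_cm=4.38)
print(f"Over-ranging DLP: {dlp:.0f} mGy-cm")  # about 131 mGy-cm for these inputs
```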

  8. Real-time PCR for the quantification of fungi in planta.

    Science.gov (United States)

    Klosterman, Steven J

    2012-01-01

    Methods enabling quantification of fungi in planta can be useful for a variety of applications. In combination with information on plant disease severity, indirect quantification of fungi in planta offers an additional tool in the screening of plants that are resistant to fungal diseases. In this chapter, a method is described for the quantification of DNA from a fungus in plant leaves using real-time PCR (qPCR). Although the method described entails quantification of the fungus Verticillium dahliae in lettuce leaves, the methodology described would be useful for other pathosystems as well. The method utilizes primers that are specific for amplification of a β-tubulin sequence from V. dahliae and a lettuce actin gene sequence as a reference for normalization. This approach enabled quantification of V. dahliae in the amount of 2.5 fg/ng of lettuce leaf DNA at 21 days following plant inoculation.
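
    Quantification of this kind generally rests on a standard curve relating Ct values to known amounts of fungal DNA, with the result expressed per nanogram of plant DNA. The sketch below shows that generic calculation with invented Ct values and curve points; it is not the published V. dahliae/lettuce assay.

```python
import numpy as np

# Illustrative sketch (not the published assay): estimate fungal DNA in a plant
# sample from a qPCR standard curve, then express it per ng of plant DNA.

def standard_curve(log10_quantities, ct_values):
    """Fit Ct = slope * log10(quantity) + intercept; return (slope, intercept)."""
    slope, intercept = np.polyfit(log10_quantities, ct_values, 1)
    return slope, intercept

def quantity_from_ct(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

# Standard dilution series of fungal DNA (fg) and measured Ct values (assumed).
slope, intercept = standard_curve(np.log10([1e4, 1e3, 1e2, 1e1]),
                                  [22.1, 25.4, 28.8, 32.1])

fungal_fg = quantity_from_ct(ct=30.0, slope=slope, intercept=intercept)
plant_dna_ng = 10.0   # total plant DNA in the reaction, from a reference assay
print(f"{fungal_fg / plant_dna_ng:.1f} fg fungal DNA per ng plant DNA")
```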

  9. Experience and benefits from using the EPRI MOV Performance Prediction Methodology in nuclear power plants

    International Nuclear Information System (INIS)

    Walker, T.; Damerell, P.S.

    1999-01-01

    The EPRI MOV Performance Prediction Methodology (PPM) is an effective tool for evaluating design basis thrust and torque requirements for MOVs. Use of the PPM has become more widespread in US nuclear power plants as they close out their Generic Letter (GL) 89-10 programs and address MOV periodic verification per GL 96-05. The PPM has also been used at plants outside the US, many of which are implementing programs similar to US plants' GL 89-10 programs. The USNRC Safety Evaluation of the PPM and the USNRC's discussion of the PPM in GL 96-05 make the PPM an attractive alternative to differential pressure (DP) testing, which can be costly and time-consuming. Significant experience and benefits, which are summarized in this paper, have been gained using the PPM. Although use of PPM requires a commitment of resources, the benefits of a solidly justified approach and a reduced need for DP testing provide a substantial safety and economic benefit. (author)

  10. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on the results of a critical review of the existing methodologies employed for the analysis of uncertainties in the Level 2 PSA, especially the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence the analyst has that a given phenomenological event or accident process will or will not occur, i.e. the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For this purpose, the report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) selection of phenomenological branch events for uncertainty analysis in the APET, a methodology for quantification of APET uncertainty inputs and its implementation procedure, (3) statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes
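
    Statistical propagation of uncertainty inputs through an APET is typically performed by Monte Carlo sampling of the uncertain branch probabilities and re-evaluating the tree for each sample. The sketch below illustrates the idea on a deliberately tiny two-branch example with assumed distributions; it is not the report's APET or its quantification code.

```python
import numpy as np

# Minimal sketch of statistical propagation of uncertain branch probabilities
# through a (much simplified) accident progression event tree. The two-branch
# structure and the distributions are illustrative assumptions.

rng = np.random.default_rng(0)
n_samples = 10_000

core_melt_freq = rng.lognormal(mean=np.log(1e-5), sigma=0.5, size=n_samples)   # per year
p_early_containment_failure = rng.beta(2, 50, size=n_samples)                  # expert-based

release_freq = core_melt_freq * p_early_containment_failure
print("mean  :", release_freq.mean())
print("95th %:", np.percentile(release_freq, 95))
```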

  11. A methodology for assessing the market benefits of alternative motor fuels: The Alternative Fuels Trade Model

    Energy Technology Data Exchange (ETDEWEB)

    Leiby, P.N.

    1993-09-01

    This report describes a modeling methodology for examining the prospective economic benefits of displacing motor gasoline use by alternative fuels. The approach is based on the Alternative Fuels Trade Model (AFTM). AFTM development was undertaken by the US Department of Energy (DOE) as part of a longer term study of alternative fuels issues. The AFTM is intended to assist with evaluating how alternative fuels may be promoted effectively, and what the consequences of substantial alternative fuels use might be. Such an evaluation of policies and consequences of an alternative fuels program is being undertaken by DOE as required by Section 502(b) of the Energy Policy Act of 1992. Interest in alternative fuels is based on the prospective economic, environmental and energy security benefits from the substitution of these fuels for conventional transportation fuels. The transportation sector is heavily dependent on oil. Increased oil use implies increased petroleum imports, with much of the increase coming from OPEC countries. Conversely, displacement of gasoline has the potential to reduce US petroleum imports, thereby reducing reliance on OPEC oil and possibly weakening OPEC's ability to extract monopoly profits. The magnitude of US petroleum import reduction, the attendant fuel price changes, and the resulting US benefits, depend upon the nature of oil-gas substitution and the supply and demand behavior of other world regions. The methodology applies an integrated model of fuel market interactions to characterize these effects.

  12. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.
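
    The reason dPCR needs no calibration material is that the absolute concentration follows directly from the fraction of positive partitions through Poisson statistics. The sketch below shows that standard conversion; the partition counts and partition volume are invented numbers, not values from this study.

```python
import math

# Illustrative sketch of absolute quantification by digital PCR: the fraction
# of positive partitions is converted to a mean copy number per partition via
# Poisson statistics, then to a concentration using the partition volume.

def dpcr_concentration(positive, total, partition_volume_nl):
    """Copies per microlitre of reaction from a digital PCR run."""
    lam = -math.log(1.0 - positive / total)          # mean copies per partition
    return lam / (partition_volume_nl * 1e-3)        # convert nl to ul

print(dpcr_concentration(positive=4200, total=20000, partition_volume_nl=0.85))
```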

  13. A posteriori uncertainty quantification of PIV-based pressure data

    NARCIS (Netherlands)

    Azijli, I.; Sciacchitano, A.; Ragni, D.; Palha Da Silva Clérigo, A.; Dwight, R.P.

    2016-01-01

    A methodology for a posteriori uncertainty quantification of pressure data retrieved from particle image velocimetry (PIV) is proposed. It relies upon the Bayesian framework, where the posterior distribution (probability distribution of the true velocity, given the PIV measurements) is obtained from

  14. A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.

    MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), demonstrated good performance but not without drawbacks, as already discussed by the authors. On the other hand, a preliminary application of Genetic Algorithms (GA) has already been reported in the literature by the authors regarding the peak detection problem encountered in MRS quantification using the Voigt line shape model. This paper investigates a novel constrained genetic algorithm involving a generic and adaptively defined fitness function which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals interleaved with noise, regarding its signal fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.

  15. Risk assessment methodology for Hanford high-level waste tanks

    International Nuclear Information System (INIS)

    Bott, T.F.; Mac Farlane, D.R.; Stack, D.W.; Kindinger, J.

    1992-01-01

    A methodology is presented for applying Probabilistic Safety Assessment techniques to quantification of the health risks posed by the high-level waste (HLW) underground tanks at the Department of Energy's Hanford reservation. This methodology includes hazard screening, development of a list of potential accident initiators, system fault tree development and quantification, definition of source terms for various release categories, and estimation of health consequences from the releases. Both airborne and liquid pathway releases to the environment, arising from aerosol and spill/leak releases from the tanks, are included in the release categories. The proposed methodology is intended to be applied to a representative subset of the total of 177 tanks, thereby providing a baseline risk profile for the HLW tank farm that can be used for setting clean-up/remediation priorities. Some preliminary results are presented for Tank 101-SY

  16. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  17. Methodology to estimate the cost of the severe accidents risk / maximum benefit

    International Nuclear Information System (INIS)

    Mendoza, G.; Flores, R. M.; Vega, E.

    2016-09-01

    For programs and activities intended to manage aging effects, any changes to plant operations, inspections, maintenance activities, systems and administrative control procedures during the renewal period that could impact the environment should be characterized, as required by 10 CFR Part 54. Environmental impacts significantly different from those described in the final environmental statement for the current operating license should be described in detail. When complying with the requirements of a license renewal application, the Severe Accident Mitigation Alternatives (SAMA) analysis is contained in a supplement to the environmental report of the plant that meets the requirements of 10 CFR Part 51. In this paper, the methodology for estimating the cost of severe accident risk is established and discussed. It is then used to identify and select alternatives for severe accident mitigation, which are analyzed to estimate the maximum benefit that an alternative could achieve if it eliminated all risk. The cost of severe accident risk is estimated using the regulatory analysis techniques of the US Nuclear Regulatory Commission (NRC). The ultimate goal of implementing the methodology is to identify SAMA candidates that have the potential to reduce severe accident risk and to determine whether the implementation of each candidate is cost-effective. (Author)
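
    The maximum benefit referred to above is, in essence, the present value of the entire severe accident cost-risk, since an alternative that eliminated all risk would avert exactly that amount. A minimal sketch of this bounding calculation is given below; the sequence frequencies, consequence costs, discount rate and remaining license term are illustrative assumptions, not NRC or plant-specific figures.

```python
# Minimal sketch of the "maximum benefit" idea used in SAMA screening: the
# benefit of a hypothetical alternative that eliminated all severe accident
# risk equals the total present-value cost of that risk. All numbers invented.

def annual_risk_cost(sequences):
    """sequences: list of (frequency per year, consequence cost in $)."""
    return sum(freq * cost for freq, cost in sequences)

def maximum_benefit(annual_cost, discount_rate, years):
    """Present value of eliminating the annual risk cost over the license term."""
    return sum(annual_cost / (1.0 + discount_rate) ** t for t in range(1, years + 1))

sequences = [(2.0e-6, 4.0e9), (5.0e-7, 1.2e10)]   # e.g. early vs late release
print(maximum_benefit(annual_risk_cost(sequences), discount_rate=0.07, years=20))
```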

  18. Cost-benefit analysis of alternative LNG vapor-mitigation measures. Topical report, September 14, 1987-January 15, 1991

    International Nuclear Information System (INIS)

    Atallah, S.

    1992-01-01

    A generalized methodology is presented for comparing the costs and safety benefits of alternative hazard mitigation measures for a large LNG vapor release. The procedure involves the quantification of the risk to the public before and after the application of LNG vapor mitigation measures. In the study, risk was defined as the product of the annual accident frequency, estimated from a fault tree analysis, and the severity of the accident. Severity was measured in terms of the number of people who may be exposed to a 2.5% or higher concentration. The ratios of the annual costs of the various mitigation measures to their safety benefits (as determined by the differences between the risk before and after mitigation measure implementation) were then used to identify the most cost-effective approaches to vapor cloud mitigation
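
    The comparison described above reduces to two small formulas: risk as annual frequency times severity, and cost-effectiveness as annual cost per unit of risk reduction. The sketch below writes them out with invented frequencies, exposed populations and costs, purely to show the structure of the calculation.

```python
# Minimal sketch of the risk and cost-effectiveness comparison described above.
# Frequencies, exposed populations and costs are invented for illustration.

def risk(annual_frequency, people_exposed):
    """Risk = accident frequency (per year) x severity (people exposed)."""
    return annual_frequency * people_exposed

def cost_per_benefit(annual_cost, risk_before, risk_after):
    """Annual cost divided by the annual risk reduction achieved."""
    return annual_cost / (risk_before - risk_after)

baseline = risk(1e-4, 5000)        # unmitigated vapor release
with_barrier = risk(1e-4, 800)     # e.g. a vapor fence reduces exposure
print(cost_per_benefit(annual_cost=250_000, risk_before=baseline, risk_after=with_barrier))
```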

  19. Chromatic and anisotropic cross-recurrence quantification analysis of interpersonal behavior

    NARCIS (Netherlands)

    Cox, R.F.A; van der Steen, Stephanie; Guevara Guerrero, Marlenny; Hoekstra, Lisette; van Dijk, Marijn; Webber, Charles; Ioana, Cornel; Marwan, Norbert

    Cross-recurrence quantification analysis (CRQA) is a powerful nonlinear time-series method to study coordination and cooperation between people. This chapter concentrates on two methodological issues related to CRQA on categorical data streams, which are commonly encountered in the behavioral

  20. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analyses as a routine guide when making decisions on costly provisions as one of the tools to assess whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)

  1. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric

  2. Is law enforcement of drug-impaired driving cost-efficient? An explorative study of a methodology for cost-benefit analysis.

    Science.gov (United States)

    Veisten, Knut; Houwing, Sjoerd; Mathijssen, M P M René; Akhtar, Juned

    2013-03-01

    Road users driving under the influence of psychoactive substances may be at much higher relative risk (RR) in road traffic than the average driver. Legislation banning blood alcohol concentrations above certain threshold levels, combined with roadside breath-testing for alcohol, has been in place for decades in many countries, but new legislation and testing of drivers for drug use have recently been implemented in some countries. In this article we present a methodology for cost-benefit analysis (CBA) of increased law enforcement of roadside drug screening. This is an analysis of the profitability for society, where costs of control are weighed against the reduction in injuries expected from fewer drugged drivers on the roads. We specify assumptions regarding costs and the effect of the specificity of the drug screening device, and quantify a deterrence effect related to the sensitivity of the device, yielding the benefit estimates. Three European countries with different current enforcement levels were studied, yielding benefit-cost ratios in the approximate range of 0.5-5 for a tripling of current levels of enforcement, with costs of about 4000 EUR per conviction and in the range of 1.5 to 13 million EUR per prevented fatality. The applied methodology for CBA has involved a simplistic behavioural response to enforcement increase and control efficiency. Although this methodology should be developed further, it is clearly indicated that the cost-efficiency of increased law enforcement of drug driving offences is dependent on the baseline situation of drug use in traffic and on the current level of enforcement, as well as the RR and prevalence of drugs in road traffic. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Costs of disarmament - Rethinking the price tag: A methodological inquiry into the costs and benefits of arms control

    International Nuclear Information System (INIS)

    Willett, S.

    2002-06-01

    The growing number of arms control and disarmament treaties agreed on over the past decades as well as rising concerns about harmful environmental and public health effects of weapons disposal, have understandably led to an increase in the cost of implementing arms control agreements. As a result, the expenses associated with treaty compliance have emerged as a contentious issue within the realm of arms control and disarmament discussions. In particular, opponents of arms control and disarmament point to perceived rising costs of meeting current and proposed treaty obligations in an attempt to limit and undermine such activities. Yet determining just how much arms control and disarmament cost remains very much an ambiguous task. In Costs of Disarmament - Rethinking the Price Tag: A Methodological Inquiry into the Costs and Benefits of Arms Control, Susan Willett addresses the question of how the cost of arms control ought to be measured. Emphasizing the proper allocation of costs associated with arms control treaty implementation to the life cycle costs of weapon systems and their correct weighing against the benefits they procure in terms of averted arms races and increased international security, Willett argues for a revised methodology of costing arms control and disarmament that gives a more accurate - and significantly lower - estimate of the latter. Adopting such a revised methodology concludes the author, might dispel considerable misunderstanding and help point decisions over arms control and disarmament in the right direction

  4. Absolute quantification of olive oil DNA by droplet digital-PCR (ddPCR): Comparison of isolation and amplification methodologies.

    Science.gov (United States)

    Scollo, Francesco; Egea, Leticia A; Gentile, Alessandra; La Malfa, Stefano; Dorado, Gabriel; Hernandez, Pilar

    2016-12-15

    Olive oil is considered a premium product for its nutritional value and health benefits, and the ability to define its origin and varietal composition is a key step towards ensuring the traceability of the product. However, isolating the DNA from such a matrix is a difficult task. In this study, the quality and quantity of olive oil DNA, isolated using four different DNA isolation protocols, was evaluated using the qRT-PCR and ddPCR techniques. The results indicate that CTAB-based extraction methods were the best for unfiltered oil, while Nucleo Spin-based extraction protocols showed greater overall reproducibility. The use of both qRT-PCR and ddPCR led to the absolute quantification of the DNA copy number. The results clearly demonstrate the importance of the choice of DNA-isolation protocol, which should take into consideration the qualitative aspects of DNA and the evaluation of the amplified DNA copy number. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Biomass to energy : GHG reduction quantification protocols and case study

    Energy Technology Data Exchange (ETDEWEB)

    Reusing, G.; Taylor, C. [Conestoga - Rovers and Associates, Waterloo, ON (Canada); Nolan, W. [Liberty Energy, Hamilton, ON (Canada); Kerr, G. [Index Energy, Ajax, ON (Canada)

    2009-07-01

    With the growing concerns over greenhouse gases and their contribution to climate change, it is necessary to find ways of reducing environmental impacts by diversifying energy sources to include non-fossil fuel energy sources. Among the fastest growing green energy sources are energy-from-waste facilities that use biomass that would otherwise be landfilled or stockpiled. The quantification of greenhouse gas reductions achieved through biomass-to-energy systems can be performed using various protocols and methodologies. This paper described each of these methodologies and presented a case study comparing some of these quantification methodologies. A summary and comparison of biomass-to-energy greenhouse gas reduction protocols in use or under development by the United Nations, the European Union, the Province of Alberta and Environment Canada was presented. It was concluded that regulatory and environmental pressures and public policy will continue to impact the practices associated with biomass processing or landfill operations, such as composting or, in the case of landfills, gas collection systems, thus reducing the amount of potential credit available for biomass-to-energy facility offset projects. 10 refs., 2 tabs., 6 figs.

  7. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    Science.gov (United States)

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomics reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.

  8. Methodological modifications on quantification of phosphatidylethanol in blood from humans abusing alcohol, using high-performance liquid chromatography and evaporative light scattering detection

    Directory of Open Access Journals (Sweden)

    Aradottir Steina

    2005-09-01

    Full Text Available Abstract Background Phosphatidylethanol (PEth) is an abnormal phospholipid formed slowly in cell membranes by a transphosphatidylation reaction from phosphatidylcholine in the presence of ethanol, catalyzed by the enzyme phospholipase D. PEth in blood is a promising new marker of ethanol abuse owing to the high specificity and sensitivity of this marker. None of the biological markers currently used in clinical routine are sensitive and specific enough for the diagnosis of alcohol abuse. The method for PEth analysis includes lipid extraction of whole blood, a one-hour HPLC separation of lipids and ELSD (evaporative light scattering) detection of PEth. Results Methodological improvements are presented which comprise a simpler extraction procedure, the use of phosphatidylbutanol as internal standard and a new algorithm for evaluation of unknown samples. It is further demonstrated that equal test results are obtained with blood collected in standard test tubes with EDTA as with the previously used heparinized test tubes. The PEth content in blood samples is stable for three weeks in the refrigerator. Conclusion The methodological changes make the method more suitable for routine laboratory use, lower the limit of quantification (LOQ) and improve precision.

  9. Genomic DNA-based absolute quantification of gene expression in Vitis.

    Science.gov (United States)

    Gambetta, Gregory A; McElrone, Andrew J; Matthews, Mark A

    2013-07-01

    Many studies in which gene expression is quantified by polymerase chain reaction represent the expression of a gene of interest (GOI) relative to that of a reference gene (RG). Relative expression is founded on the assumptions that RG expression is stable across samples, treatments, organs, etc., and that reaction efficiencies of the GOI and RG are equal; assumptions which are often faulty. The true variability in RG expression and actual reaction efficiencies are seldom determined experimentally. Here we present a rapid and robust method for absolute quantification of expression in Vitis where varying concentrations of genomic DNA were used to construct GOI standard curves. This methodology was utilized to absolutely quantify and determine the variability of the previously validated RG ubiquitin (VvUbi) across three test studies in three different tissues (roots, leaves and berries). In addition, in each study a GOI was absolutely quantified. Data sets resulting from relative and absolute methods of quantification were compared and the differences were striking. VvUbi expression was significantly different in magnitude between test studies and variable among individual samples. Absolute quantification consistently reduced the coefficients of variation of the GOIs by more than half, often resulting in differences in statistical significance and in some cases even changing the fundamental nature of the result. Utilizing genomic DNA-based absolute quantification is fast and efficient. Through eliminating error introduced by assuming RG stability and equal reaction efficiencies between the RG and GOI this methodology produces less variation, increased accuracy and greater statistical power. © 2012 Scandinavian Plant Physiology Society.
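
    Genomic DNA can serve as an absolute standard because a measured mass of it corresponds to a calculable number of genome copies, and therefore of any single-copy gene. The sketch below shows that conversion; the grapevine genome size is an approximate literature value included only for illustration.

```python
# Illustrative sketch of why genomic DNA works as an absolute standard: a known
# mass of genomic DNA corresponds to a known number of genome (and hence
# single-copy gene) copies. The genome size below is an approximate value.

AVOGADRO = 6.022e23
BP_MOLAR_MASS = 660.0          # g/mol per base pair of double-stranded DNA

def genome_copies(dna_mass_ng, genome_size_bp):
    """Number of haploid genome copies in a given mass of genomic DNA."""
    genome_mass_g = genome_size_bp * BP_MOLAR_MASS / AVOGADRO
    return dna_mass_ng * 1e-9 / genome_mass_g

# 10 ng of grapevine genomic DNA (genome size ~487 Mbp, assumed):
print(f"{genome_copies(10.0, 487e6):.2e} copies of a single-copy gene")
```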

  10. Resonance self-shielding effect in uncertainty quantification of fission reactor neutronics parameters

    International Nuclear Information System (INIS)

    Chiba, Go; Tsuji, Masashi; Narabayashi, Tadashi

    2014-01-01

    In order to properly quantify fission reactor neutronics parameter uncertainties, we have to use covariance data and sensitivity profiles consistently. In the present paper, we establish two consistent methodologies for uncertainty quantification: a self-shielded cross section-based consistent methodology and an infinitely-diluted cross section-based consistent methodology. With these methodologies and the covariance data of uranium-238 nuclear data given in JENDL-3.3, we quantify uncertainties of infinite neutron multiplication factors of light water reactor and fast reactor fuel cells. While an inconsistent methodology gives results which depend on the energy group structure of neutron flux and neutron-nuclide reaction cross section representation, both the consistent methodologies give fair results with no such dependences.

  11. Urinary Cell-Free DNA Quantification as Non-Invasive Biomarker in Patients with Bladder Cancer.

    Science.gov (United States)

    Brisuda, Antonin; Pazourkova, Eva; Soukup, Viktor; Horinek, Ales; Hrbáček, Jan; Capoun, Otakar; Svobodova, Iveta; Pospisilova, Sarka; Korabecna, Marie; Mares, Jaroslav; Hanuš, Tomáš; Babjuk, Marek

    2016-01-01

    Concentration of urinary cell-free DNA (ucfDNA) belongs to potential bladder cancer markers, but the reported results are inconsistent due to the use of various non-standardised methodologies. The aim of the study was to standardise the methodology for ucfDNA quantification as a potential non-invasive tumour biomarker. In total, 66 patients and 34 controls were enrolled into the study. Volumes of each urine portion (V) were recorded and ucfDNA concentrations (c) were measured using real-time PCR. Total amounts (TA) of ucfDNA were calculated and compared between patients and controls. Diagnostic accuracy of the TA of ucfDNA was determined. The calculation of the TA of ucfDNA in the second urine portion was the most appropriate approach to ucfDNA quantification, as there was a logarithmic dependence between the volume and the concentration of a urine portion (p = 0.0001). Using this methodology, we were able to discriminate between bladder cancer patients and subjects without bladder tumours (p = 0.0002) with an area under the ROC curve of 0.725. Positive and negative predictive values of the test were 90 and 45%, respectively. Quantification of ucfDNA according to our modified method could provide a potential non-invasive biomarker for diagnosis of patients with bladder cancer. © 2015 S. Karger AG, Basel.

  12. Methods for cost-benefit-risk analysis of material-accounting upgrades

    International Nuclear Information System (INIS)

    Fishbone, L.G.; Gordon, D.M.; Higinbotham, W.; Keisch, B.

    1988-01-01

    The authors have developed a cost-benefit-risk methodology for evaluating material-accounting upgrades at key measurement points in nuclear facilities. The focus of this methodology is on nuclear-material measurements and their effects on inventory differences and shipper/receiver differences. The methodology has three main components: cost, benefits, and risk factors. The fundamental outcome of the methodology is therefore cost-benefit ratios characterizing the proposed upgrades, with the risk factors applied as necessary to the benefits. Examples illustrate the methodology's use

  13. Organic and total mercury determination in sediments by cold vapor atomic absorption spectrometry: methodology validation and uncertainty measurements

    Directory of Open Access Journals (Sweden)

    Robson L. Franklin

    2012-01-01

    Full Text Available The purpose of the present study was to validate a method for organic Hg determination in sediment. The procedure for organic Hg was adapted from the literature: the organomercurial compounds were extracted with dichloromethane in acid medium, followed by destruction of the organic compounds with bromine chloride. Total Hg determination was performed according to USEPA methodology 3051A. Mercury quantification for both methodologies was then performed by CVAAS. Methodology validation was carried out by analyzing certified reference materials for total Hg and methylmercury. The uncertainties for both methodologies were calculated. A quantification limit of 3.3 µg kg-1 was found for organic Hg by CVAAS.

  15. Quantification methodology for the French 900 MW PWR PRA

    International Nuclear Information System (INIS)

    Ducamp, F.; Lanore, J.M.; Duchemin, B.; De Villeneuve, M.J.

    1985-02-01

    This paper presents some improvements to the classical approach to risk assessment. The calculation of the risk contribution of a particular event tree sequence comprises four stages: creation of a fault tree for each system which appears in the event trees, in terms of component faults; simplification of these fault trees into smaller ones, in terms of macrocomponents; creation of one 'super-tree' by regrouping the fault trees of the down systems (systems which fail in the sequence) under an AND gate and calculation of the minimal cut sets of this super-tree, taking into account the up systems (systems that do not fail in the sequence) and peculiarities related to the initiating event if needed; and quantification of the minimal cut sets so obtained, taking into account the duration of the scenario depicted by the sequence and the possibilities of repair. Each of these steps is developed in this article
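
    The final stage amounts to evaluating the sequence probability from the minimal cut sets of the super-tree. A minimal sketch of that quantification under the usual rare-event approximation is given below; the cut sets and basic event probabilities are invented, and the sketch ignores the mission-time and repair refinements described in the paper.

```python
# Minimal sketch of minimal cut set quantification (rare-event approximation):
# the sequence probability is approximated by the sum of the products of the
# basic event probabilities in each minimal cut set. All values are invented.

def cut_set_probability(cut_set, basic_event_prob):
    p = 1.0
    for event in cut_set:
        p *= basic_event_prob[event]
    return p

def sequence_probability(minimal_cut_sets, basic_event_prob):
    """Rare-event approximation: sum of minimal cut set probabilities."""
    return sum(cut_set_probability(cs, basic_event_prob) for cs in minimal_cut_sets)

basic_event_prob = {"pump_A_fails": 3e-3, "pump_B_fails": 3e-3, "valve_C_stuck": 1e-4}
minimal_cut_sets = [{"pump_A_fails", "pump_B_fails"}, {"valve_C_stuck"}]
print(sequence_probability(minimal_cut_sets, basic_event_prob))  # ~1.09e-4
```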

  16. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  17. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  18. Quantification of analytes affected by relevant interfering signals under quality controlled conditions

    International Nuclear Information System (INIS)

    Bettencourt da Silva, Ricardo J.N.; Santos, Julia R.; Camoes, M. Filomena G.F.C.

    2006-01-01

    The analysis of organic contaminants or residues in biological samples is frequently affected by the presence of compounds producing interfering instrumental signals. This feature is responsible for the higher complexity and cost of these analyses and/or for a significant reduction in the number of analytes studied in a multi-analyte method. This work presents a methodology to estimate the impact of the interfering compounds on the quality of the analysis of complex samples by separative instrumental methods, aiming at supporting the inclusion of analytes affected by interfering compounds in the list of compounds analysed in the studied samples. The proposed methodology involves the study of the magnitude of the signal produced by the interfering compounds in the analysed matrix, and is applicable to analytical systems affected by interfering compounds with varying concentration in the studied matrix. The proposed methodology is based on the comparison of the signals from a representative number of examples of the studied matrix, in order to estimate the impact of the presence of such compounds on the measurement quality. The treatment of the chromatographic signals necessary to collect these data can be easily performed with the signal-subtraction algorithms available in most analytical instrumentation software. The subtraction of the interfering compounds' signal from the sample signal allows compensation of the interfering effect irrespective of the relative magnitude of the interfering and analyte signals, supporting the applicability of the same model of method performance over a broader concentration range. The quantification of the measurement uncertainty was performed using the differential approach, which allows estimation of the contribution of the presence of the interfering compounds to the quality of the measurement. The proposed methodology was successfully applied to the analysis of

  19. HPCE quantification of 5-methyl-2'-deoxycytidine in genomic DNA: methodological optimization for chestnut and other woody species.

    Science.gov (United States)

    Hasbún, Rodrigo; Valledor, Luís; Rodríguez, José L; Santamaria, Estrella; Ríos, Darcy; Sanchez, Manuel; Cañal, María J; Rodríguez, Roberto

    2008-01-01

    Quantification of deoxynucleosides using micellar high-performance capillary electrophoresis (HPCE) is an efficient, fast and inexpensive method for evaluating genomic DNA methylation. This approach has been demonstrated to be more sensitive and specific than other methods for the quantification of DNA methylation content. However, effective detection and quantification of 5-methyl-2'-deoxycytidine depend on the sample characteristics. Previous works have revealed that in most woody species, the quality and quantity of extracted RNA-free DNA suitable for analysis by means of HPCE vary among species of the same genus, among tissues taken from the same tree, and within the same tissue across the seasons of the year. The aim of this work is to establish a method for quantification of genomic DNA methylation that lends itself to use in different Castanea sativa Mill. materials, and in other angiosperm and gymnosperm woody species. Using a DNA extraction kit based on a silica membrane has increased the resolving capacity of the method. Under these conditions, different organs or tissues of angiosperms and gymnosperms can be analyzed, regardless of their state of development. We emphasize the importance of samples free of nucleosides, although, if this is not the case, the method still ensures the effective separation of deoxynucleosides and identification of 5-methyl-2'-deoxycytidine.
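
    Once the deoxynucleosides have been separated and quantified, global methylation is conventionally expressed as the 5-methyl-2'-deoxycytidine fraction of total cytosine species. The sketch below shows that standard calculation; the molar amounts are invented, and a real analysis would first convert peak areas to amounts via calibration curves.

```python
# Illustrative sketch of how global DNA methylation is usually expressed once
# the deoxynucleosides have been separated and quantified. Values are invented.

def global_methylation_percent(mdC, dC):
    """% 5mdC = 5mdC / (5mdC + dC) * 100, from molar amounts of each nucleoside."""
    return 100.0 * mdC / (mdC + dC)

print(f"{global_methylation_percent(mdC=1.8, dC=22.5):.1f}% methylated cytosines")
```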

  20. Environmental and Sustainability Education Policy Research: A Systematic Review of Methodological and Thematic Trends

    Science.gov (United States)

    Aikens, Kathleen; McKenzie, Marcia; Vaughter, Philip

    2016-01-01

    This paper reports on a systematic literature review of policy research in the area of environmental and sustainability education. We analyzed 215 research articles, spanning four decades and representing 71 countries, and which engaged a range of methodologies. Our analysis combines quantification of geographic and methodological trends with…

  1. How Much Can Non-industry Standard Measurement Methodologies Benefit Methane Reduction Programs?

    Science.gov (United States)

    Risk, D. A.; O'Connell, L.; Atherton, E.

    2017-12-01

    In recent years, energy sector methane emissions have been recorded in large part by applying modern non-industry-standard techniques. Industry may lack the regulatory flexibility to use such techniques, or in some cases may not understand the possible associated economic advantage. As progressive jurisdictions move from estimation towards routine measurement, the research community should provide guidance to help regulators and companies measure more effectively, and economically if possible. In this study, we outline a modelling experiment in which we explore the integration of non-industry-standard measurement techniques as part of a generalized compliance measurement program. The study was not intended to be exhaustive, or to recommend particular combinations, but instead to explore the inter-relationships between methodologies, development type, and compliance practice. We first defined the role, applicable scale, detection limits, working distances, and approximate deployment cost of several measurement methodologies. We then considered a variety of development types differing mainly in footprint, density, and emissions "profile". Using a Monte Carlo approach, we evaluated the effect of these various factors on the cost and confidence of the compliance measurement program. We found that when added individually, some of the research techniques were indeed able to deliver an improvement in cost and/or confidence when used alongside industry-standard Optical Gas Imaging. When applied in combination, the ideal fraction of each measurement technique depended on development type, emission profile, and whether confidence or cost was more important. The results suggest that measurement cost and confidence could be improved if energy companies exploited a wider range of measurement techniques, in a manner tailored to each development. In the short term, combining clear scientific guidance with economic information could benefit immediate mitigation efforts over

  2. Quantification results from an application of a new technique for human event analysis (ATHEANA) at a pressurized water reactor

    International Nuclear Information System (INIS)

    Whitehead, D.W.; Kolaczkowski, A.M.; Thompson, C.M.

    1998-05-01

    This paper presents results from the quantification of the three human failure events (HFEs) identified using the ATHEANA methodology as discussed in an earlier companion paper presented at this conference. Sections describe the quantification task, important basic events, and the results obtained from quantifying the three HFEs that were identified -- the first two of which were simulated at the Seabrook Station Simulator

  3. Quantification of growth benefit of carnivorous plants from prey

    Czech Academy of Sciences Publication Activity Database

    Adamec, Lubomír

    2017-01-01

    Roč. 46, č. 3 (2017), s. 1-7 ISSN 0190-9215 Institutional support: RVO:67985939 Keywords : mineral cost and benefit * stimulation of roots * growth stimulation Subject RIV: EF - Botanics OBOR OECD: Plant sciences, botany

  4. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology
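
    The Bayesian step described above is, in its simplest conjugate form, a gamma-Poisson update in which the effective number of events and the effective observation time shift the prior for the CCF rate. The sketch below illustrates that generic update; the prior parameters and plant data are assumptions, not the paper's recommended values.

```python
# Minimal sketch of a gamma-Poisson Bayesian update for a common cause failure
# rate: effective numbers of events (possibly fractional, from impact vectors)
# and observation times update a gamma prior. Values are illustrative only.

def gamma_posterior_mean(prior_alpha, prior_beta, n_events, exposure_time):
    """Posterior mean of a failure rate with a gamma prior and Poisson data."""
    return (prior_alpha + n_events) / (prior_beta + exposure_time)

# Generic prior (alpha=0.5, beta=1000 reactor-years) updated with plant data:
rate = gamma_posterior_mean(0.5, 1000.0, n_events=1.3, exposure_time=4200.0)
print(f"plant-specific CCF rate: {rate:.2e} per year")
```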

  5. Benefit-Risk Monitoring of Vaccines Using an Interactive Dashboard: A Methodological Proposal from the ADVANCE Project.

    Science.gov (United States)

    Bollaerts, Kaatje; De Smedt, Tom; Donegan, Katherine; Titievsky, Lina; Bauchau, Vincent

    2018-03-26

    New vaccines are launched based on their benefit-risk (B/R) profile anticipated from clinical development. Proactive post-marketing surveillance is necessary to assess whether the vaccination uptake and the B/R profile are as expected and, ultimately, whether further public health or regulatory actions are needed. There are several, typically not integrated, facets of post-marketing vaccine surveillance: the surveillance of vaccination coverage, vaccine safety, effectiveness and impact. With this work, we aim to assess the feasibility and added value of using an interactive dashboard as a potential methodology for near real-time monitoring of vaccine coverage and pre-specified health benefits and risks of vaccines. We developed a web application with an interactive dashboard for B/R monitoring. The dashboard is demonstrated using simulated electronic healthcare record data mimicking the introduction of rotavirus vaccination in the UK. The interactive dashboard allows end users to select certain parameters, including expected vaccine effectiveness, age groups, and time periods and allows calculation of the incremental net health benefit (INHB) as well as the incremental benefit-risk ratio (IBRR) for different sets of preference weights. We assessed the potential added value of the dashboard by user testing amongst a range of stakeholders experienced in the post-marketing monitoring of vaccines. The dashboard was successfully implemented and demonstrated. The feedback from the potential end users was generally positive, although reluctance to using composite B/R measures was expressed. The use of interactive dashboards for B/R monitoring is promising and received support from various stakeholders. In future research, the use of such an interactive dashboard will be further tested with real-life data as opposed to simulated data.
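For readers unfamiliar with the composite measures named above, the sketch below shows one generic way an INHB and an IBRR can be formed from benefit and risk event counts and a preference weight; the counts and weights are invented and do not reproduce the simulated rotavirus data behind the dashboard.

```python
# Generic composite benefit-risk measures (illustrative formulation only).
def incremental_net_health_benefit(cases_prevented, excess_adverse_events, weight):
    """INHB = benefits - weight * risks; `weight` expresses how many prevented
    cases one adverse event is judged to be 'worth'."""
    return cases_prevented - weight * excess_adverse_events

def incremental_benefit_risk_ratio(cases_prevented, excess_adverse_events):
    """IBRR = benefits / risks (undefined when no excess risk is observed)."""
    if excess_adverse_events == 0:
        return float("inf")
    return cases_prevented / excess_adverse_events

prevented, excess = 12000, 15          # hypothetical counts per monitoring period
for w in (1, 100, 500):                # sensitivity to the preference weight
    print(f"w={w:4d}  INHB={incremental_net_health_benefit(prevented, excess, w):8.0f}"
          f"  IBRR={incremental_benefit_risk_ratio(prevented, excess):.0f}")
```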

  6. Strategy study of quantification harmonization of SUV in PET/CT images

    International Nuclear Information System (INIS)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-01-01

    and quantitative assessments in different scopes. We concluded that the harmonization strategy of the SUV quantification presented in this paper was effective in reducing the variability of small structures quantification. However, for the comparison of SUV quantification between different scanners and institutions, it is essential that, in addition to the harmonization of quantification, the standardization of the methodology of patient preparation is maintained, in order to minimize the SUV variability due to biological factors. (author)

  7. Cost analysis and ecological benefits of environmental recovery methodologies in bauxite mining

    Directory of Open Access Journals (Sweden)

    João Carlos Costa Guimarães

    2013-03-01

    Full Text Available This work analyzed and compared three methods of environmental recovery commonly used in bauxite mining on the Poços de Caldas Plateau, MG, in terms of recovery costs and ecological benefits. Earnings and cost data for environmental recovery activities were obtained for areas belonging to the Companhia Geral de Minas – CGM, on properties located in the city of Poços de Caldas, MG. The total cost of these activities was used to compare the recovery methods, after updating all values monetarily to a common reference date (the present). It is concluded that the difference in the present value of costs between simple restoration and rehabilitation is less than 1%, and between complete restoration and rehabilitation about 15.12%, suggesting that the choice of method should be based on the ecological benefit that each provides. The environmental restoration methodology for the mined areas emphasizes ecological variables in the process of establishment of the community, to the detriment of more complex ecological aspects, which are difficult to measure at the current stage of development of the ecosystem considered.

  8. The Optimal Time for Claiming Social Security Benefits: A Methodological Note

    OpenAIRE

    Joseph Friedman

    2014-01-01

    The optimal age for initiating Social Security benefits and the initiation versus postponement of benefits decision are the subjects of a number of recent papers. It is generally agreed that an initiation versus postponement of benefits decision may have significant consequences, but there is less agreement about how to model the problem or measure its financial implications. By law benefits are paid only to live beneficiaries. Thus, the anticipated future benefits should be weighted by the r...
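The survival-weighted valuation the note argues for can be illustrated with a short sketch: the expected present value of benefits is the sum of annual benefits weighted by the probability of being alive and discounted to a common baseline. The benefit levels, survival curve and discount rate below are illustrative, not actuarial values.

```python
# Expected present value of benefits for a given claiming age (illustrative).
def expected_pv(claim_age, annual_benefit, survive_to, discount_rate, horizon=100):
    """survive_to[a] = probability of being alive at age a (given alive at 62)."""
    pv = 0.0
    for age in range(claim_age, horizon + 1):
        pv += annual_benefit * survive_to.get(age, 0.0) / (1 + discount_rate) ** (age - 62)
    return pv

# Crude illustrative survival curve from age 62 onward (not actuarial data).
survival = {age: max(0.0, 1.0 - 0.03 * (age - 62)) for age in range(62, 101)}

early = expected_pv(62, annual_benefit=18000, survive_to=survival, discount_rate=0.03)
late = expected_pv(70, annual_benefit=31700, survive_to=survival, discount_rate=0.03)
print(f"claim at 62: expected PV = {early:10.0f}")
print(f"claim at 70: expected PV = {late:10.0f}")
```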

  9. Quantitative Assessment of Distributed Energy Resource Benefits

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, S.W.

    2003-05-22

    Distributed energy resources (DER) offer many benefits, some of which are readily quantified. Other benefits, however, are less easily quantifiable because they may require site-specific information about the DER project or analysis of the electrical system to which the DER is connected. The purpose of this study is to provide analytical insight into several of the more difficult calculations, using the PJM power pool as an example. This power pool contains most of Pennsylvania, New Jersey, Maryland, and Delaware. The techniques used here could be applied elsewhere, and the insights from this work may encourage various stakeholders to more actively pursue DER markets or to reduce obstacles that prevent the full realization of its benefits. This report describes methodologies used to quantify each of the benefits listed in Table ES-1. These methodologies include bulk power pool analyses, regional and national marginal cost evaluations, as well as a more traditional cost-benefit approach for DER owners. The methodologies cannot however determine which stakeholder will receive the benefits; that must be determined by regulators and legislators, and can vary from one location to another.

  10. In vivo quantification of lead in bone with a portable x-ray fluorescence system--methodology and feasibility.

    Science.gov (United States)

    Nie, L H; Sanchez, S; Newton, K; Grodzins, L; Cleveland, R O; Weisskopf, M G

    2011-02-07

    This study was conducted to investigate the methodology and feasibility of developing a portable x-ray fluorescence (XRF) technology to quantify lead (Pb) in bone in vivo. A portable XRF device was set up and optimal settings of voltage, current, and filter combination for bone lead quantification were selected to achieve the lowest detection limit. The minimum radiation dose delivered to the subject was calculated by Monte Carlo simulations. An ultrasound device was used to measure soft tissue thickness to account for signal attenuation, and an alternative method to obtain soft tissue thickness from the XRF spectrum was developed and shown to be equivalent to the ultrasound measurements (intraclass correlation coefficient, ICC = 0.82). We tested the correlation of in vivo bone lead concentrations between the standard KXRF technology and the portable XRF technology. There was a significant correlation between the bone lead concentrations obtained from the standard KXRF technology and those obtained from the portable XRF technology (ICC = 0.65). The detection limit for the portable XRF device was about 8.4 ppm with 2 mm soft tissue thickness. The entrance skin dose delivered to the human subject was about 13 mSv and the total body effective dose was about 1.5 µSv and should pose minimal radiation risk. In conclusion, portable XRF technology can be used for in vivo bone lead measurement with sensitivity comparable to the KXRF technology and good correlation with KXRF measurements.

  11. In vivo quantification of lead in bone with a portable x-ray fluorescence system-methodology and feasibility

    International Nuclear Information System (INIS)

    Nie, L H; Sanchez, S; Newton, K; Weisskopf, M G; Grodzins, L; Cleveland, R O

    2011-01-01

    This study was conducted to investigate the methodology and feasibility of developing a portable x-ray fluorescence (XRF) technology to quantify lead (Pb) in bone in vivo. A portable XRF device was set up and optimal settings of voltage, current, and filter combination for bone lead quantification were selected to achieve the lowest detection limit. The minimum radiation dose delivered to the subject was calculated by Monte Carlo simulations. An ultrasound device was used to measure soft tissue thickness to account for signal attenuation, and an alternative method to obtain soft tissue thickness from the XRF spectrum was developed and shown to be equivalent to the ultrasound measurements (intraclass correlation coefficient, ICC = 0.82). We tested the correlation of in vivo bone lead concentrations between the standard KXRF technology and the portable XRF technology. There was a significant correlation between the bone lead concentrations obtained from the standard KXRF technology and those obtained from the portable XRF technology (ICC = 0.65). The detection limit for the portable XRF device was about 8.4 ppm with 2 mm soft tissue thickness. The entrance skin dose delivered to the human subject was about 13 mSv and the total body effective dose was about 1.5 μSv and should pose minimal radiation risk. In conclusion, portable XRF technology can be used for in vivo bone lead measurement with sensitivity comparable to the KXRF technology and good correlation with KXRF measurements. (note)

  12. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. Quantification of margins and mixed uncertainties using evidence theory and stochastic expansions

    International Nuclear Information System (INIS)

    Shah, Harsheel; Hosder, Serhat; Winter, Tyler

    2015-01-01

    The objective of this paper is to implement Dempster–Shafer Theory of Evidence (DSTE) in the presence of mixed (aleatory and multiple sources of epistemic) uncertainty to the reliability and performance assessment of complex engineering systems through the use of quantification of margins and uncertainties (QMU) methodology. This study focuses on quantifying the simulation uncertainties, both in the design condition and the performance boundaries along with the determination of margins. To address the possibility of multiple sources and intervals for epistemic uncertainty characterization, DSTE is used for uncertainty quantification. An approach to incorporate aleatory uncertainty in Dempster–Shafer structures is presented by discretizing the aleatory variable distributions into sets of intervals. In view of excessive computational costs for large scale applications and repetitive simulations needed for DSTE analysis, a stochastic response surface based on point-collocation non-intrusive polynomial chaos (NIPC) has been implemented as the surrogate for the model response. The technique is demonstrated on a model problem with non-linear analytical functions representing the outputs and performance boundaries of two coupled systems. Finally, the QMU approach is demonstrated on a multi-disciplinary analysis of a high speed civil transport (HSCT). - Highlights: • Quantification of margins and uncertainties (QMU) methodology with evidence theory. • Treatment of both inherent and epistemic uncertainties within evidence theory. • Stochastic expansions for representation of performance metrics and boundaries. • Demonstration of QMU on an analytical problem. • QMU analysis applied to an aerospace system (high speed civil transport)
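A toy illustration of the evidence-theory ingredient of this QMU workflow is sketched below: a Dempster-Shafer structure of focal intervals with basic probability assignments, and the resulting belief/plausibility bounds that a performance metric meets its boundary. The intervals, masses and limit are invented.

```python
# Belief/plausibility that a performance metric stays below a design limit,
# given a Dempster-Shafer structure of focal intervals with masses.
def belief_plausibility(focal_elements, limit):
    """focal_elements: list of ((lo, hi), mass). Returns (Bel, Pl) for value <= limit."""
    bel = sum(m for (lo, hi), m in focal_elements if hi <= limit)   # wholly inside
    pl = sum(m for (lo, hi), m in focal_elements if lo <= limit)    # intersects
    return bel, pl

structure = [((0.70, 0.85), 0.4),   # e.g. intervals elicited from two expert sources
             ((0.80, 0.95), 0.4),
             ((0.60, 1.05), 0.2)]
limit = 1.0                          # normalized performance boundary

bel, pl = belief_plausibility(structure, limit)
print(f"Bel(metric <= {limit}) = {bel:.2f},  Pl = {pl:.2f}")
print(f"uncertainty band on P(requirement met): [{bel:.2f}, {pl:.2f}]")
```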

  14. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, there are scarce data about their composition. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination to quantify the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Improved Methodology for Benefit Estimation of Preservation Projects

    Science.gov (United States)

    2018-04-01

    This research report presents an improved process for evaluating the benefits and economic tradeoffs associated with a variety of highway preservation projects. It includes a summary of results from a comprehensive phone survey concerning the use and...

  16. Water Footprint Symposium: where next for water footprint and water assessment methodology?

    NARCIS (Netherlands)

    Tillotson, M.R.; Kiu, J.; Guan, D.; Wu, P.; Zhao, Xu; Zhang, Guoping; Pfister, S.; Pahlow, Markus

    2014-01-01

    Recognizing the need for a comprehensive review of the tools and metrics for the quantification and assessment of water footprints, and allowing for the opportunity for open discussion on the challenges and future of water footprinting methodology, an international symposium on water footprint was

  17. Water Footprint Symposium : where next for water footprint and water assessment methodology?

    NARCIS (Netherlands)

    Tillotson, Martin R.; Liu, Junguo; Guan, Dabo; Wu, Pute; Zhao, Xu; Zhang, Guoping; Pfister, Stephan; Pahlow, Markus

    2014-01-01

    Recognizing the need for a comprehensive review of the tools and metrics for the quantification and assessment of water footprints, and allowing for the opportunity for open discussion on the challenges and future of water footprinting methodology, an international symposium on water footprint was

  18. Cost-Benefit Analysis of Smart Grids Implementation

    International Nuclear Information System (INIS)

    Tomsic, Z.; Pongrasic, M.

    2014-01-01

    This paper presents guidelines for conducting cost-benefit analyses of Smart Grid projects involving the implementation of advanced technologies in the electric power system. The limitations of present electric power networks are also discussed, along with the solutions offered by an advanced electric power network. From an economic point of view, the main characteristic of an advanced electric power network is a large investment, with benefits realized only after some time and at risk of being smaller than expected. It is therefore important to make a comprehensive analysis of such projects, consisting of both an economic and a qualitative analysis. This report relies on the methodology developed by EPRI, the American Electric Power Research Institute. The methodology is comprehensive and useful, but also simple and easy to understand. The steps of this methodology are explained, together with the main characteristics of methodologies that build on it: the methodology developed at the Joint Research Centre and methodologies for analysing the implementation of smart meters in the electricity network. Costs, benefits and the categories into which they can be classified are also defined. As part of the qualitative analysis, the social aspect of Smart Grid projects is described. In defining costs, special attention has to be paid to projects integrating electricity from variable renewable energy sources into the power system because of the additional costs involved. This work summarizes the categories of such additional costs. At the end of the report, an overview is given of what has been done and what will be done in the European Union. (author).

  19. Usle systematization of the factors in gis to the quantification the of laminate erosion in the jirau river watershed

    Directory of Open Access Journals (Sweden)

    Elisete Guimarães

    2005-12-01

    Full Text Available The present paper demonstrates the use of the USLE (Universal Soil Loss Equation) within a GIS (Geographic Information System) as a tool for quantifying soil losses by laminar (sheet) erosion. The study area is the Jirau River watershed, which is located in the district of Dois Vizinhos, southwestern Paraná. Our results contribute to the development and implementation of automated methodologies focused on the characterization, quantification, and control of the laminar erosion process.
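The USLE referenced in this record has the standard multiplicative form A = R·K·LS·C·P; the sketch below applies it cell by cell, as a GIS workflow would, using placeholder factor values rather than calibrated data for the Jirau River watershed.

```python
# Universal Soil Loss Equation applied to hypothetical raster cells.
def usle_soil_loss(R, K, LS, C, P):
    """A = R * K * LS * C * P, mean annual soil loss (e.g. t ha^-1 yr^-1)."""
    return R * K * LS * C * P

# Example grid cells as a GIS layer might supply them (illustrative values).
cells = [
    {"R": 6500, "K": 0.025, "LS": 1.8, "C": 0.20, "P": 1.0},
    {"R": 6500, "K": 0.030, "LS": 0.6, "C": 0.05, "P": 0.5},
]
for i, cell in enumerate(cells):
    print(f"cell {i}: A = {usle_soil_loss(**cell):.1f} t/ha/yr")
```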

  20. Importance of the lipid peroxidation biomarkers and methodological aspects FOR malondialdehyde quantification

    Directory of Open Access Journals (Sweden)

    Denise Grotto

    2009-01-01

    Full Text Available Free radicals induce lipid peroxidation, which plays an important role in pathological processes. The injury mediated by free radicals can be measured by conjugated dienes, malondialdehyde, 4-hydroxynonenal, and other markers. However, malondialdehyde has been singled out as the main product used to evaluate lipid peroxidation. Most assays determine malondialdehyde through its reaction with thiobarbituric acid, which can be measured by indirect (spectrometric) or direct (chromatographic) methodologies. Though there is some controversy among the methodologies, the selective HPLC-based assays provide a more reliable measure of lipid peroxidation. This review describes significant aspects of MDA determination, its importance in pathological processes, and the treatment of biological samples.

  1. Digital PCR for direct quantification of viruses without DNA extraction

    OpenAIRE

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2015-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration mat...

  2. Comparison of economic evaluation methodology for the nuclear plant lifetime extension

    International Nuclear Information System (INIS)

    Song, T. H.; Jung, I. S.

    2003-01-01

    In connection with the economic evaluation of NPP lifetime management, several methodologies are available, such as present-worth calculation, Levelized Unit Energy Cost (LUEC) calculation, and market-benefit comparison. In this paper, an economic evaluation of NPP lifetime management was carried out using these three methodologies, and the results of each were compared with those of the others. With these three methodologies, break-even points for the investment cost associated with life extension of a nuclear power plant were calculated. The analysis showed that LUEC is more conservative than the present-worth calculation and that the market-benefit comparison is more conservative than LUEC; that is, the market-benefit comparison is the most conservative methodology, and future base-load demand is far more important than any other factor such as capacity factor, investment cost of life extension, or the performance of the replacement power plant
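The two most common metrics named in this record can be sketched as follows; all cost and generation figures are placeholders intended only to show how a present-worth comparison and a levelized unit energy cost (LUEC) are formed.

```python
# Present worth and LUEC for a life-extension option versus a replacement plant.
def present_worth(cashflows, rate):
    """cashflows[t] in year t (t = 0, 1, ...), discounted at `rate`."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def luec(costs, generation_mwh, rate):
    """Levelized unit energy cost = PV(costs) / PV(generation)."""
    return present_worth(costs, rate) / present_worth(generation_mwh, rate)

years, rate = 20, 0.07
extension = {
    "costs": [800e6] + [90e6] * years,     # refurbishment, then annual O&M + fuel
    "gen":   [0] + [7.9e6] * years,        # MWh/yr at an assumed ~90% capacity factor
}
replacement = {
    "costs": [2500e6] + [120e6] * years,
    "gen":   [0] + [7.9e6] * years,
}
print(f"LUEC life extension : {luec(extension['costs'], extension['gen'], rate):.1f} $/MWh")
print(f"LUEC replacement    : {luec(replacement['costs'], replacement['gen'], rate):.1f} $/MWh")
```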

  3. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    Science.gov (United States)

    Akram, Muhammad Farooq Bin

    The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions within a portfolio. When multiple technologies are applied to a system simultaneously, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that, where knowledge about the problem is lacking, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, in which experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced by using deterministic or traditional probabilistic approaches for

  4. Recent Trends in PET Image Interpretations Using Volumetric and Texture-based Quantification Methods in Nuclear Oncology

    Energy Technology Data Exchange (ETDEWEB)

    Rahim, Muhammad Kashif; Kim, Sung Eun; So, Hyeongryul; Kim, Hyung Jun; Cheon, Gi Jeong; Lee, Eun Seong; Kang, Keon Wook; Lee, Dong Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-03-15

    Image quantification studies in positron emission tomography/computed tomography (PET/CT) are of immense importance in the diagnosis and follow-up of a variety of cancers. In this review we have described the current image quantification methodologies employed in 18F-fluorodeoxyglucose (18F-FDG) PET in major oncological conditions, with particular emphasis on tumor heterogeneity studies. We have described various quantitative parameters being used in PET image analysis. The main contemporary methodology is to measure tumor metabolic activity; however, analysis of other image-related parameters is also increasing. Primarily, we have identified the existing role of tumor heterogeneity studies in major cancers using 18F-FDG PET. We have also described some newer radiopharmaceuticals other than 18F-FDG being studied/used in the management of these cancers. Tumor heterogeneity studies are being performed in almost all major oncological conditions using 18F-FDG PET. The role of these studies is very promising in the management of these conditions.
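The quantitative parameter most often reported in such studies is the body-weight-normalized standardized uptake value (SUV); a minimal sketch using the conventional definition (with illustrative numbers) is given below.

```python
# Body-weight-normalized SUV with optional decay correction of the injected dose.
import math

def suv_bw(tissue_kbq_per_ml, injected_mbq, body_weight_kg,
           uptake_time_min=0.0, half_life_min=109.8):
    """SUV = decay-corrected tissue concentration / (injected dose / body weight).
    With kBq/mL, MBq and kg the ratio is dimensionless (tissue density ~1 g/mL);
    half_life_min defaults to 18F."""
    decayed_dose_kbq = injected_mbq * 1000.0 * math.exp(
        -math.log(2) * uptake_time_min / half_life_min)
    return tissue_kbq_per_ml / (decayed_dose_kbq / (body_weight_kg * 1000.0))

print(f"SUV = {suv_bw(12.0, 350.0, 75.0, uptake_time_min=60.0):.2f}")
```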

  5. Environmental costs and benefits case study: nuclear power plant. Quantification and economic valuation of selected environmental impacts/effects. Final report

    International Nuclear Information System (INIS)

    1984-02-01

    This case study is an application, to a nuclear power plant, of the methodology for quantifying environmental costs and benefits, contained in the regional energy plan, adopted in April, 1983, by the Northwest Power Planning Council, pursuant to Public Law 96-501. The study is based on plant number 2 of the Washington Public Power Supply System (WNP-2), currently nearing completion on the Hanford Nuclear Reservation in eastern Washington State. This report describes and documents efforts to quantify and estimate monetary values for the following seven areas of environmental effects: radiation/health effects, socioeconomic/infrastructure effects, consumptive use of water, psychological/health effects (fear/stress), waste management, nuclear power plant accidents, and decommissioning costs. 103 references

  6. Stochastic approach for radionuclides quantification

    Science.gov (United States)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods that use empirical calibration against a standard to determine a calibration coefficient, and thereby quantify the activity of nuclear materials, are of no use for non-reproducible, complex, one-of-a-kind nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another and involve a high variability of objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as geometry or composition. These data include density, material, screens, geometric shape, matrix composition, and matrix and source distribution. Some of them depend strongly on knowledge of the package data and on operator background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment or knowledge of the internal package configuration. The method combines a global stochastic approach that uses, among other tools, surrogate models to simulate the gamma attenuation behaviour, a Bayesian approach that considers conditional probability densities of the problem inputs, and Markov Chain Monte Carlo (MCMC) algorithms that solve the inverse problem, using the gamma-ray emission spectrum of the radionuclides and the outside dimensions of the objects of interest. The methodology is being tested by quantifying actinide activity in source standards of different matrix compositions and configurations, in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
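A highly simplified stand-in for the stochastic/Bayesian machinery described above is sketched below: a random-walk Metropolis sampler inferring an activity and a screen thickness from a single measured count through a toy attenuation model. The efficiency, attenuation coefficient, priors and measured count are all invented.

```python
# Toy inverse problem: infer source activity from one attenuated count measurement.
import math
import random

EFF = 1.0e-4          # absolute detection efficiency (counts per emitted gamma)
MU = 0.15             # toy linear attenuation coefficient [1/mm]
LIVE_TIME = 600.0     # counting time [s]
OBSERVED = 410        # measured net counts in the peak of interest

def expected_counts(activity_bq, screen_mm):
    return activity_bq * LIVE_TIME * EFF * math.exp(-MU * screen_mm)

def log_post(log_activity, screen_mm):
    # Flat priors on log-activity and screen thickness within broad bounds,
    # plus a Poisson log-likelihood (up to an additive constant).
    if not (math.log(1e2) <= log_activity <= math.log(1e9)):
        return -math.inf
    if not (0.0 <= screen_mm <= 50.0):
        return -math.inf
    lam = expected_counts(math.exp(log_activity), screen_mm)
    return OBSERVED * math.log(lam) - lam

def metropolis(n_iter=20000, seed=2):
    rng = random.Random(seed)
    la, t = math.log(1.0e5), 10.0
    lp = log_post(la, t)
    samples = []
    for _ in range(n_iter):
        la_new = la + rng.gauss(0, 0.1)      # symmetric random-walk proposals
        t_new = t + rng.gauss(0, 1.0)
        lp_new = log_post(la_new, t_new)
        if rng.random() < math.exp(min(0.0, lp_new - lp)):
            la, t, lp = la_new, t_new, lp_new
        samples.append(math.exp(la))
    return samples[n_iter // 2:]             # discard the first half as burn-in

acts = sorted(metropolis())
print(f"posterior median activity ~ {acts[len(acts) // 2]:.3e} Bq")
```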

  7. Image cytometry: nuclear and chromosomal DNA quantification.

    Science.gov (United States)

    Carvalho, Carlos Roberto; Clarindo, Wellington Ronildo; Abreu, Isabella Santiago

    2011-01-01

    Image cytometry (ICM) associates microscopy, digital image and software technologies, and has been particularly useful in spatial and densitometric cytological analyses, such as DNA ploidy and DNA content measurements. Basically, ICM integrates methodologies of optical microscopy calibration, standard density filters, digital CCD camera, and image analysis softwares for quantitative applications. Apart from all system calibration and setup, cytological protocols must provide good slide preparations for efficient and reliable ICM analysis. In this chapter, procedures for ICM applications employed in our laboratory are described. Protocols shown here for human DNA ploidy determination and quantification of nuclear and chromosomal DNA content in plants could be used as described, or adapted for other studies.

  8. Methodology for evaluation of railroad technology research projects

    Science.gov (United States)

    1981-04-01

    This Project memorandum presents a methodology for evaluating railroad research projects. The methodology includes consideration of industry and societal benefits, with special attention given to technical risks, implementation considerations, and po...

  9. BATSE gamma-ray burst line search. 2: Bayesian consistency methodology

    Science.gov (United States)

    Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.

    1994-01-01

    We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.

  10. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  11. Methodology for quantification of waste generated in Spanish railway construction works.

    Science.gov (United States)

    de Guzmán Báez, Ana; Villoria Sáez, Paola; del Río Merino, Mercedes; García Navarro, Justo

    2012-05-01

    In recent years, the European Union (EU) has focused on the reduction of construction and demolition (C&D) waste. Specifically, in 2006, Spain generated roughly 47 million tons of C&D waste, of which only 13.6% was recycled. This situation has led to the drawing up of many regulations on C&D waste in the past years, forcing EU countries to include new measures for waste prevention and recycling. Among these measures is the obligation to quantify, in advance, the C&D waste expected to be generated during a construction project. However, limited data are available for civil engineering projects. Therefore, the aim of this research study is to improve C&D waste management in railway projects by developing a model for C&D waste quantification. For this purpose, we develop two equations which estimate in advance the amount, both in weight and volume, of the C&D waste likely to be generated in railway construction projects, broken down by C&D waste category for the entire project. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    Science.gov (United States)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

    Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from National Institute of Health (R01 EB018302) is greatly appreciated.

  13. Quantification of in vivo oxidative damage in Caenorhabditis elegans during aging by endogenous F3-isoprostane measurement

    NARCIS (Netherlands)

    Labuschagne, C.F.; Stigter, E.C.; Hendriks, M.M.; Berger, R.; Rokach, J.; Korswagen, H.C.; Brenkman, A.B.

    2013-01-01

    Oxidative damage is thought to be a major cause in development of pathologies and aging. However, quantification of oxidative damage is methodologically difficult. Here, we present a robust liquid chromatography-tandem mass spectrometry (LC-MS/MS) approach for accurate, sensitive, and linear in vivo

  14. Forest Carbon Leakage Quantification Methods and Their Suitability for Assessing Leakage in REDD

    Directory of Open Access Journals (Sweden)

    Sabine Henders

    2012-01-01

    Full Text Available This paper assesses quantification methods for carbon leakage from forestry activities for their suitability in leakage accounting in a future Reducing Emissions from Deforestation and Forest Degradation (REDD mechanism. To that end, we first conducted a literature review to identify specific pre-requisites for leakage assessment in REDD. We then analyzed a total of 34 quantification methods for leakage emissions from the Clean Development Mechanism (CDM, the Verified Carbon Standard (VCS, the Climate Action Reserve (CAR, the CarbonFix Standard (CFS, and from scientific literature sources. We screened these methods for the leakage aspects they address in terms of leakage type, tools used for quantification and the geographical scale covered. Results show that leakage methods can be grouped into nine main methodological approaches, six of which could fulfill the recommended REDD leakage requirements if approaches for primary and secondary leakage are combined. The majority of methods assessed, address either primary or secondary leakage; the former mostly on a local or regional and the latter on national scale. The VCS is found to be the only carbon accounting standard at present to fulfill all leakage quantification requisites in REDD. However, a lack of accounting methods was identified for international leakage, which was addressed by only two methods, both from scientific literature.

  15. A solar reserve methodology for renewable energy integration studies based on sub-hourly variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Eduardo; Brinkman, Gregory; Hummon, Marissa [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lew, Debra

    2012-07-01

    Increasing penetration of wind and solar energy are raising concerns among electric system operators because of the variability and uncertainty associated with the power sources. Previous work focused on the quantification of reserves for systems with wind power. This paper presents a new methodology that allows the determination of necessary reserves for high penetrations of photovoltaic power and compares it to the wind-based methodology. The solar reserve methodology was applied to Phase 2 of the Western Wind and Solar Integration Study. A summary of the results is included. (orig.)

  16. Towards a common disability assessment framework: theoretical and methodological issues for providing public services and benefits using ICF.

    Science.gov (United States)

    Francescutti, Carlo; Frattura, Lucilla; Troiano, Raffaella; Gongolo, Francesco; Martinuzzi, Andrea; Sala, Marina; Meucci, Paolo; Raggi, Alberto; Russo, Emanuela; Buffoni, Mara; Gorini, Giovanna; Conclave, Mario; Petrangeli, Agostino; Solipaca, Alessandro; Leonardi, Matilde

    2009-01-01

    To report on the preliminary results of an Italian project on the implementation of an ICF-based protocol for providing public services and benefits to persons with disabilities. The UN Convention on the Rights of Persons with Disabilities (UNC) was mapped to the ICF, and core elements were implemented in an ICF-based evaluation protocol. A person-environment interaction classification (PEIC) tree was also developed for defining evaluation outputs. The PEIC and the ICF-based protocol serve, respectively, as the guideline and the data interpretation source for providing public services and benefits. They make it possible to assign persons to different services, ranging from surveillance and monitoring, to the provision or maintenance of facilitators over time, to barrier removal or the reorganisation of the provision of environmental factors. A detailed description of the target intervention is made available through the implementation of the protocol, which points out the effect of personal support and other environmental factors. The detailed description of functioning and disability provided by our methodology can help policy makers and administrators in decision making, on the basis of a description of real needs, and in targeting person-tailored interventions.

  17. A Methodology To Incorporate The Safety Culture Into Probabilistic Safety Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sunghyun; Kim, Namyeong; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2015-10-15

    In order to incorporate organizational factors into PSA, a methodology needs to be developed. In this study, a methodology is introduced that uses the AHP to weight organizational factors and the SLIM to rate them. Safety issues related to nuclear safety culture have been occurring with increasing frequency, so a quantification tool has to be developed in order to include organizational factors in Probabilistic Safety Assessments. In this study, the state of the art in organizational evaluation methodologies has been surveyed. The study covers research on organizational factors, the maintenance process, maintenance process analysis models, and a quantitative methodology using the Analytic Hierarchy Process (AHP) and the Success Likelihood Index Methodology (SLIM). The purpose of this study is to develop a methodology that incorporates the safety culture into PSA in order to obtain a more objective risk estimate than before. The organizational factors considered in nuclear safety culture may affect the potential risk of human error and hardware failure. The safety culture impact index used to monitor the plant safety culture can be assessed by applying the developed methodology to a nuclear power plant.

  18. A Methodology To Incorporate The Safety Culture Into Probabilistic Safety Assessments

    International Nuclear Information System (INIS)

    Park, Sunghyun; Kim, Namyeong; Jae, Moosung

    2015-01-01

    In order to incorporate organizational factors into PSA, a methodology needs to be developed. In this study, a methodology is introduced that uses the AHP to weight organizational factors and the SLIM to rate them. Safety issues related to nuclear safety culture have been occurring with increasing frequency, so a quantification tool has to be developed in order to include organizational factors in Probabilistic Safety Assessments. In this study, the state of the art in organizational evaluation methodologies has been surveyed. The study covers research on organizational factors, the maintenance process, maintenance process analysis models, and a quantitative methodology using the Analytic Hierarchy Process (AHP) and the Success Likelihood Index Methodology (SLIM). The purpose of this study is to develop a methodology that incorporates the safety culture into PSA in order to obtain a more objective risk estimate than before. The organizational factors considered in nuclear safety culture may affect the potential risk of human error and hardware failure. The safety culture impact index used to monitor the plant safety culture can be assessed by applying the developed methodology to a nuclear power plant.
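One way the two techniques named in these records can be combined is sketched below: AHP-style normalized weights over safety-culture factors feed a SLIM success likelihood index, which is mapped to a human error probability through the usual log-linear calibration. The factors, weights, ratings and anchor values are invented for illustration.

```python
# AHP-style weights + SLIM success likelihood index -> human error probability.
import math

def normalize(weights):
    total = sum(weights.values())
    return {k: v / total for k, v in weights.items()}

def sli(weights, ratings):
    """Success Likelihood Index = sum_i w_i * r_i (ratings scaled 0..1)."""
    return sum(weights[k] * ratings[k] for k in weights)

def calibrate_log_linear(sli1, hep1, sli2, hep2):
    """Solve log10(HEP) = a*SLI + b from two anchor tasks with known HEPs."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

weights = normalize({"management commitment": 4, "procedures": 2,
                     "training": 3, "communication": 1})
ratings = {"management commitment": 0.6, "procedures": 0.8,
           "training": 0.5, "communication": 0.7}

a, b = calibrate_log_linear(sli1=0.9, hep1=1e-4, sli2=0.2, hep2=1e-1)
task_sli = sli(weights, ratings)
print(f"SLI = {task_sli:.2f},  HEP = {10 ** (a * task_sli + b):.2e}")
```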

  19. A Short Review of FDTD-Based Methods for Uncertainty Quantification in Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    Theodoros T. Zygiridis

    2017-01-01

    Full Text Available We provide a review of selected computational methodologies that are based on the deterministic finite-difference time-domain algorithm and are suitable for the investigation of electromagnetic problems involving uncertainties. As it will become apparent, several alternatives capable of performing uncertainty quantification in a variety of cases exist, each one exhibiting different qualities and ranges of applicability, which we intend to point out here. Given the numerous available approaches, the purpose of this paper is to clarify the main strengths and weaknesses of the described methodologies and help the potential readers to safely select the most suitable approach for their problem under consideration.

  20. In Vivo Quantification of Lead in Bone with a Portable X-ray Fluorescence (XRF) System – Methodology and Feasibility

    Science.gov (United States)

    Nie, LH; Sanchez, S; Newton, K; Grodzins, L; Cleveland, RO; Weisskopf, MG

    2013-01-01

    This study was conducted to investigate the methodology and feasibility of developing a portable XRF technology to quantify lead (Pb) in bone in vivo. A portable XRF device was set up and optimal settings of voltage, current, and filter combination for bone lead quantification were selected to achieve the lowest detection limit. The minimum radiation dose delivered to the subject was calculated by Monte Carlo simulations. An ultrasound device was used to measure soft tissue thickness to account for signal attenuation, and an alternative method to obtain soft tissue thickness from the XRF spectrum was developed and shown to be equivalent to the ultrasound measurements (Intraclass Correlation Coefficient, ICC=0.82). We tested the correlation of in vivo bone lead concentrations between the standard KXRF technology and the portable XRF technology. There was a significant correlation between the bone lead concentrations obtained from the standard KXRF technology and those obtained from the portable XRF technology (ICC=0.65). The detection limit for the portable XRF device was about 8.4 ppm with 2 mm soft tissue thickness. The entrance skin dose delivered to the human subject was about 13 mSv and the total body effective dose was about 1.5 μSv and should pose a minimal radiation risk. In conclusion, portable XRF technology can be used for in vivo bone lead measurement with sensitivity comparable to the KXRF technology and good correlation with KXRF measurements. PMID:21242629

  1. Risk-benefit analysis and public policy: a bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Clark, E.M.; Van Horn, A.J.

    1976-11-01

    Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these have been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.

  2. Risk-benefit analysis and public policy: a bibliography

    International Nuclear Information System (INIS)

    Clark, E.M.; Van Horn, A.J.

    1976-11-01

    Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these have been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk

  3. Comparison of DNA quantification methodology used in the DNA extraction protocol for the UK Biobank cohort.

    Science.gov (United States)

    Welsh, Samantha; Peakman, Tim; Sheard, Simon; Almond, Rachael

    2017-01-05

    UK Biobank is a large prospective cohort study in the UK established by the Medical Research Council (MRC) and the Wellcome Trust to enable approved researchers to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. A wide range of phenotypic data has been collected at recruitment and has recently been enhanced by the UK Biobank Genotyping Project. All UK Biobank participants (500,000) have been genotyped on either the UK Biobank Axiom® Array or the Affymetrix UK BiLEVE Axiom® Array, and the workflow for preparing samples for genotyping is described. It is hoped that the genetic data will provide further insight into the genetics of disease. All data, including the genetic data, is available for access to approved researchers. Data from two methods of DNA quantification (ultraviolet-visible spectroscopy [UV/Vis], measured on the Trinean DropSense™ 96, and PicoGreen®) were compared by two laboratories (UK Biobank and Affymetrix). The sample processing workflow established at UK Biobank, for genotyping on the custom Affymetrix Axiom® array, resulted in high quality DNA (average DNA concentration 38.13 ng/μL, average 260/280 absorbance 1.91). The DNA generated high quality genotype data (average call rate 99.48% and pass rate 99.45%). The DNA concentration measured on the Trinean DropSense™ 96 at UK Biobank correlated well with DNA concentration measured by PicoGreen® at Affymetrix (r = 0.85). The UK Biobank Genotyping Project demonstrated that the high throughput DNA extraction protocol described generates high quality DNA suitable for genotyping on the Affymetrix Axiom array. The correlation between DNA concentration derived from UV/Vis and PicoGreen® quantification methods suggests that, in large-scale genetic studies involving two laboratories, it may be possible to remove the DNA quantification step in one laboratory without affecting downstream analyses. This would result in

  4. Socially responsible ethnobotanical surveys in the Cape Floristic Region: ethical principles, methodology and quantification of data

    Directory of Open Access Journals (Sweden)

    Ben-Erik Van Wyk

    2012-03-01

    Full Text Available A broad overview of published and unpublished ethnobotanical surveys in the Cape Floristic Region (the traditional home of the San and Khoi communities) shows that the data are incomplete. There is an urgent need to record the rich indigenous knowledge about plants in a systematic and socially responsible manner in order to preserve this cultural and scientific heritage for future generations. Improved methods for quantifying data are introduced, with special reference to the simplicity and benefits of the new Matrix Method. This methodology prevents or reduces the number of false negatives, and also ensures the participation of elderly people who might be immobile. It also makes it possible to compare plant uses in different local communities. The method enables the researcher to quantify the knowledge on plant use that has been preserved in a community, and to determine the relative importance of a specific plant in a more objective way. Ethical considerations for such ethnobotanical surveys are discussed through the lens of current ethical codes and international conventions. This is an accessible approach, which can also be used in the life sciences classroom.

  5. Review of some aspects of human reliability quantification

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.; Spurgin, A.J.; Hannaman, G.W.; Lukic, Y.D.

    1986-01-01

    An area of systems reliability considered to be weak is the characterization and quantification of the role of the operations and maintenance staff in combating accidents. Several R and D programs are underway to improve the modeling of human interactions, and some progress has been made. This paper describes a specific aspect of human reliability analysis referred to as the modeling of cognitive processes. In particular, the basis for the so-called Human Cognitive Reliability (HCR) model is described, with a focus on its validation and on its benefits and limitations

  6. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    Science.gov (United States)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS generates a comparatively low financial and spatial footprint compared with common fluorescence based systems. Despite the advantages of SERS, it has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNA concentrations commonly exist in relatively low concentrations, amplification methods (e.g. PCR) are therefore required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  7. Fire risk analysis for nuclear power plants: Methodological developments and applications

    International Nuclear Information System (INIS)

    Kazarians, M.; Apostolakis, G.; Siv, N.O.

    1985-01-01

    A methodology to quantify the risk from fires in nuclear power plants is described. This methodology combines engineering judgment, statistical evidence, fire phenomenology, and plant system analysis. It can be divided into two major parts: (1) fire scenario identification and quantification, and (2) analysis of the impact on plant safety. This article primarily concentrates on the first part. Statistical analysis of fire occurrence data is used to establish the likelihood of ignition. The temporal behaviors of the two competing phenomena, fire propagation and fire detection and suppression, are studied and their characteristic times are compared. Severity measures are used to further specialize the frequency of the fire scenario. The methodology is applied to a switchgear room of a nuclear power plant
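The "competing times" comparison described above can be illustrated numerically: the fire damages its target when the time to damage is shorter than the time to detect and suppress. The frequencies and distribution parameters below are placeholders, not values from the referenced analysis.

```python
# Monte Carlo comparison of fire damage time versus detection/suppression time.
import random

def damage_probability(n_trials=100000, seed=3):
    rng = random.Random(seed)
    damaged = 0
    for _ in range(n_trials):
        t_damage = rng.lognormvariate(2.5, 0.5)     # minutes to damage the target
        t_suppress = rng.expovariate(1 / 10.0)      # minutes to detect and suppress
        if t_suppress > t_damage:
            damaged += 1
    return damaged / n_trials

ignition_freq = 3e-2          # assumed fires per year in the compartment
severity_fraction = 0.2       # assumed fraction of fires severe enough to threaten targets
p_late = damage_probability() # probability suppression arrives too late
print(f"P(suppression too late) = {p_late:.2f}")
print(f"scenario frequency ~ {ignition_freq * severity_fraction * p_late:.2e} /yr")
```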

  8. Use of cesium-137 methodology in the evaluation of superficial erosive processes

    International Nuclear Information System (INIS)

    Andrello, Avacir Casanova; Appoloni, Carlos Roberto; Guimaraes, Maria de Fatima; Nascimento Filho, Virgilio Franco do

    2003-01-01

    Superficial erosion is one of the main agents of soil degradation, and estimating erosion rates for different edaphoclimatic conditions with the conventional models, such as USLE and RUSLE, is expensive and time-consuming. The use of the anthropogenic radionuclide cesium-137 is a newer methodology that has been widely studied, and its application to soil erosion evaluation has grown in countries such as the USA, the UK, Australia and others. A brief description of this methodology is presented, together with the development of the equations used to quantify erosion rates from cesium-137 measurements. Two watersheds studied in Brazil have shown that the cesium-137 methodology is practicable and consistent with field surveys for applications in erosion studies. (author)

  9. Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy

    Science.gov (United States)

    Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.

    2018-02-01

    In this study, a collaborative contest focused on LIBS data processing has been conducted in an original way since the participants did not share the same samples to be analyzed on their own LIBS experiments but a set of LIBS spectra obtained from one single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. Then, a parametric study was conducted to investigate the influence of each step during the data processing. This study was based on several analytical figures of merit such as the determination coefficient, uncertainty, limit of quantification and prediction ability (i.e., trueness). Then, it was possible to interpret the results provided by the participants, emphasizing the fact that the type of data extraction, baseline modeling as well as the calibration model play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure with the aim of optimizing the methodological steps toward the standardization of LIBS.
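The calibration-and-figures-of-merit step discussed above can be sketched as an ordinary least-squares calibration line for one emission line, with the determination coefficient and a limit of quantification taken as ten times the residual standard deviation divided by the slope; the concentrations and intensities below are fabricated.

```python
# Least-squares calibration line, R^2 and LOQ for one emission line.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return slope, intercept

conc = [0.0, 0.5, 1.0, 2.0, 4.0, 8.0]                 # analyte content [wt %]
intensity = [120, 980, 1890, 3750, 7600, 15200]       # background-subtracted counts

slope, intercept = fit_line(conc, intensity)
pred = [slope * c + intercept for c in conc]
ss_res = sum((yi - pi) ** 2 for yi, pi in zip(intensity, pred))
ss_tot = sum((yi - sum(intensity) / len(intensity)) ** 2 for yi in intensity)
r2 = 1 - ss_res / ss_tot
s_res = (ss_res / (len(conc) - 2)) ** 0.5
loq = 10 * s_res / slope

print(f"slope={slope:.1f} counts/wt%,  R^2={r2:.4f},  LOQ~{loq:.2f} wt%")
```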

  10. Estimating the Economic Benefits of Regional Ocean Observing Systems

    National Research Council Canada - National Science Library

    Kite-Powell, Hauke L; Colgan, Charles S; Wellman, Katharine F; Pelsoci, Thomas; Wieand, Kenneth; Pendleton, Linwood; Kaiser, Mark J; Pulsipher, Allan G; Luger, Michael

    2005-01-01

    We develop a methodology to estimate the potential economic benefits from new investments in regional coastal ocean observing systems in US waters, and apply this methodology to generate preliminary...

  11. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    Science.gov (United States)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlations with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided an unprecedented sensitivity toward GBNs and allowed conventional toxicology research to be enhanced by providing a direct correlation between uptake of GBNs at a single-cell level and cell viability status.

  12. A benefit-cost methodology for developing environmental standards for uranium mill tailings disposal

    International Nuclear Information System (INIS)

    Leiter, A.J.

    1982-01-01

    This paper describes a method for using benefit-cost analysis in developing generally applicable environmental standards for uranium mill tailings disposal. Several disposal alternatives were selected, each consisting of a different combination of control measures. The resulting cost and benefit estimates allow calculation of the incremental cost of obtaining incremental benefits of radiation protection. The overall benefit of a disposal alternative is expressed in terms of an index based on weighting factors assigned to individual benefits. The results show that some disposal alternatives have higher costs while providing no additional benefit compared with other alternatives. These alternatives should be eliminated from consideration in developing standards
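
    A minimal sketch of the weighting-factor benefit index and the incremental cost-per-benefit comparison described above is given below; the alternative names, costs, scores and weights are invented for illustration and do not reproduce the paper's data.

    ```python
    def benefit_index(scores, weights):
        """Overall benefit index of a disposal alternative: weighted sum of the scores
        assigned to its individual benefits."""
        return sum(weights[name] * scores[name] for name in weights)

    # Illustrative alternatives: (name, cost in M$, individual benefit scores 0-10)
    weights = {"radon_reduction": 0.5, "longevity": 0.3, "misuse_prevention": 0.2}
    alternatives = [
        ("earth_cover_3m", 10.0, {"radon_reduction": 5, "longevity": 4, "misuse_prevention": 3}),
        ("thick_cover_rock", 18.0, {"radon_reduction": 8, "longevity": 7, "misuse_prevention": 6}),
        ("below_grade_disposal", 30.0, {"radon_reduction": 9, "longevity": 9, "misuse_prevention": 9}),
    ]

    ranked = sorted(((name, cost, benefit_index(s, weights)) for name, cost, s in alternatives),
                    key=lambda t: t[2])
    for (n1, c1, b1), (n2, c2, b2) in zip(ranked, ranked[1:]):
        db, dc = b2 - b1, c2 - c1
        ratio = dc / db if db > 0 else float("inf")   # 'inf' flags a dominated step
        print(f"{n1} -> {n2}: +{db:.1f} benefit for +{dc:.1f} M$ ({ratio:.2f} M$ per unit)")
    ```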

  13. Solar thermal technologies benefits assessment: Objectives, methodologies and results for 1981

    Science.gov (United States)

    Gates, W. R.

    1982-07-01

    The economic and social benefits of developing cost competitive solar thermal technologies (STT) were assessed. The analysis was restricted to STT in electric applications for 16 high insolation/high energy price states. Three fuel price scenarios and three 1990 STT system costs were considered, reflecting uncertainty over fuel prices and STT cost projections. After considering the numerous benefits of introducing STT into the energy market, three primary benefits were identified and evaluated: (1) direct energy cost savings were estimated to range from zero to $50 billion; (2) oil imports may be reduced by up to 9 percent, improving national security; and (3) significant environmental benefits can be realized in air basins where electric power plant emissions create substantial air pollution problems. STT research and development was found to be unacceptably risky for private industry in the absence of federal support. The normal risks associated with investments in research and development are accentuated because the OPEC cartel can artificially manipulate oil prices and undercut the growth of alternative energy sources.

  14. Nuclear Data Uncertainty Quantification: Past, Present and Future

    International Nuclear Information System (INIS)

    Smith, D.L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested

  15. Nuclear Data Uncertainty Quantification: Past, Present and Future

    Science.gov (United States)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  16. Detection and Quantification of Genetically Modified Soybean in Some Food and Feed Products. A Case Study on Products Available on Romanian Market

    Directory of Open Access Journals (Sweden)

    Elena Rosculete

    2018-04-01

    Full Text Available The aim of this paper is to trace genetically modified soybean in food and feed products present on the Romanian market by using molecular extraction, identification and quantification methodologies. Nine samples (3 food samples, 5 soybean samples and 1 soybean meal) were analysed using the classical and real-time polymerase chain reaction (PCR) methods. GMO DNA was not detected in two of the three analysed food samples. However, it was found in four samples at levels below the 0.9% limit and in three samples at levels above that limit. The results obtained through real-time PCR quantification show that DNA-RRS was detectable in different amounts in different samples: ranging between 0.27% and 9.36% in soy beans, and reaching 50.98% in soybean meal. The current research focuses on how products containing GMOs above the limit (it is common knowledge that products containing more than 0.9% genetically modified DNA must be labeled) are differentiated on the market, with a view to labeling food and feed products in terms of the accidental presence of approved genetically modified plants. The benefits brought by genetic engineering in obtaining genetically modified organisms can be balanced against their public acceptance and against certain known or unknown risks that they may bring.
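
    Relative quantification of GM content by real-time PCR is commonly done by normalizing the event-specific signal to a taxon-specific reference gene and comparing against a certified standard. The sketch below illustrates a delta-delta-Ct style calculation under that assumption; the Ct values and the assumed amplification efficiency are illustrative, and the paper may equally have used a standard-curve approach.

    ```python
    def gm_percentage_ddct(ct_gm_sample, ct_ref_sample, ct_gm_standard, ct_ref_standard,
                           standard_gm_percent, efficiency=2.0):
        """Estimate GM content (%) relative to a certified reference standard.

        The GM-specific signal (e.g. an RRS event assay) is normalised to a
        taxon-specific reference gene (e.g. lectin) in both the sample and a
        standard of known GM content (delta-delta-Ct method).
        """
        d_sample = ct_gm_sample - ct_ref_sample
        d_standard = ct_gm_standard - ct_ref_standard
        ratio = efficiency ** -(d_sample - d_standard)
        return standard_gm_percent * ratio

    # Illustrative Ct values only
    print(round(gm_percentage_ddct(28.1, 24.0, 30.5, 24.2, 1.0), 2))
    ```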

  17. Big Data Analytics Methodology in the Financial Industry

    Science.gov (United States)

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  18. Quantification of Benefits and Cost from Applying a Product Configuration System

    DEFF Research Database (Denmark)

    Kristjansdottir, Katrin; Shafiee, Sara; Hvam, Lars

    of generating the products’ specifications. In addition the lead-time for generating products’ specifications has been reduced and indications of improved quality of the products’ specifications and additional sales are identified. The research verifies the benefits described in the current literature...

  19. Appropriate methodologies for assessing the societal cost and benefits of conservation programs

    International Nuclear Information System (INIS)

    Power, J.M.; Gill, G.S.; Harvey, K.M.

    1983-01-01

    The use of cost-benefit analysis for assessing the societal cost and benefits of conservation programmes is discussed. It is concluded that it should not be the sole criterion for project choice. (U.K.)

  20. Comparative analysis of cost benefit division methodologies in a hydrothermal generation system

    International Nuclear Information System (INIS)

    Pereira, M.V.F.; Gorenstin, B.G.; Campodonico, N.M.; Costa, J.P. da; Kelman, J.

    1989-01-01

    The development and operation planning of the Brazilian generation system has been carried out in a coordinated way for several years through organizations in which the country's main generating companies take part. The allocation of the system's benefits to each participant in the integrated planning and operation has aroused interest. This paper describes alternative forms of cost-benefit allocation between the companies participating in a coordinated operation, in order to achieve adequate remuneration and incentives. Two proposals for benefit allocation in energy export/import contracts were analysed: allocation by generation value and allocation by marginal benefit. It is concluded that the second best reflects the contributions of the several factors that make up a hydroelectric power plant (storage capacity, effective storage and turbine capacity). (C.G.C.). 1 tab

  1. Quantification of organic acids in beer by nuclear magnetic resonance (NMR)-based methods

    International Nuclear Information System (INIS)

    Rodrigues, J.E.A.; Erny, G.L.; Barros, A.S.; Esteves, V.I.; Brandao, T.; Ferreira, A.A.; Cabrita, E.; Gil, A.M.

    2010-01-01

    The organic acids present in beer provide important information on the product's quality and history, determining organoleptic properties and being useful indicators of fermentation performance. NMR spectroscopy may be used for rapid quantification of organic acids in beer, and different NMR-based methodologies are compared here for the six main acids found in beer (acetic, citric, lactic, malic, pyruvic and succinic). The use of partial least squares (PLS) regression enables faster quantification, compared to traditional integration methods, and the performance of PLS models built using different reference methods (capillary electrophoresis (CE), both with direct and indirect UV detection, and enzymatic assays) was investigated. The best multivariate models were obtained using CE/indirect detection and enzymatic assays as reference, and their response was compared with NMR integration, either using an internal reference or an electrical reference signal (Electronic REference To access In vivo Concentrations, ERETIC). NMR integration results generally agree with those obtained by PLS, with some overestimation for malic and pyruvic acids, probably due to peak overlap and subsequent integral errors, and an apparent relative underestimation for citric acid. Overall, these results make the PLS-NMR method an interesting choice for organic acid quantification in beer.

  2. Quantification of organic acids in beer by nuclear magnetic resonance (NMR)-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, J.E.A. [CICECO-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Erny, G.L. [CESAM - Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Barros, A.S. [QOPNAA-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Esteves, V.I. [CESAM - Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Brandao, T.; Ferreira, A.A. [UNICER, Bebidas de Portugal, Leca do Balio, 4466-955 S. Mamede de Infesta (Portugal); Cabrita, E. [Department of Chemistry, New University of Lisbon, 2825-114 Caparica (Portugal); Gil, A.M., E-mail: agil@ua.pt [CICECO-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal)

    2010-08-03

    The organic acids present in beer provide important information on the product's quality and history, determining organoleptic properties and being useful indicators of fermentation performance. NMR spectroscopy may be used for rapid quantification of organic acids in beer, and different NMR-based methodologies are compared here for the six main acids found in beer (acetic, citric, lactic, malic, pyruvic and succinic). The use of partial least squares (PLS) regression enables faster quantification, compared to traditional integration methods, and the performance of PLS models built using different reference methods (capillary electrophoresis (CE), both with direct and indirect UV detection, and enzymatic assays) was investigated. The best multivariate models were obtained using CE/indirect detection and enzymatic assays as reference, and their response was compared with NMR integration, either using an internal reference or an electrical reference signal (Electronic REference To access In vivo Concentrations, ERETIC). NMR integration results generally agree with those obtained by PLS, with some overestimation for malic and pyruvic acids, probably due to peak overlap and subsequent integral errors, and an apparent relative underestimation for citric acid. Overall, these results make the PLS-NMR method an interesting choice for organic acid quantification in beer.
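
    A PLS calibration of spectra against reference concentrations, as described above, can be prototyped in a few lines. The sketch below uses scikit-learn with synthetic stand-in data (the array sizes, values and units are assumptions); it only shows a cross-validated PLS workflow of this kind, not the authors' actual models.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Synthetic stand-in data: rows are 1H NMR spectra (intensity vs. chemical shift),
    # y holds reference concentrations (e.g. from CE with indirect UV detection).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 500))
    y = rng.uniform(10, 200, size=40)          # mg/L, illustrative

    pls = PLSRegression(n_components=5)
    y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    print(f"RMSECV = {rmsecv:.1f} mg/L")
    ```

    With real spectra the number of latent variables would typically be chosen by minimizing the cross-validated error rather than fixed in advance.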

  3. GO methodology. Volume 1. Overview manual

    International Nuclear Information System (INIS)

    1983-06-01

    The GO methodology is a success-oriented probabilistic system performance analysis technique. The methodology can be used to quantify system reliability and availability, identify and rank critical components and the contributors to system failure, construct event trees, and perform statistical uncertainty analysis. Additional capabilities of the method currently under development will enhance its use in evaluating the effects of external events and common cause failures on system performance. This Overview Manual provides a description of the GO Methodology, how it can be used, and benefits of using it in the analysis of complex systems

  4. Bone histomorphometric quantification by X-ray phase contrast and transmission 3D SR microcomputed tomography

    International Nuclear Information System (INIS)

    Nogueira, L.P.; Pinheiro, C.J.G.; Braz, D.; Oliveira, L.F.; Barroso, R.C.

    2008-01-01

    Full text: Conventional histomorphometry is an important method for the quantitative evaluation of bone microstructure. X-ray computed tomography is a noninvasive technique that can be used to evaluate histomorphometric indices. In this technique, the output 3D images are used to quantify the whole sample, unlike the conventional approach, in which quantification is performed on 2D slices and extrapolated to the 3D case. Seeking better resolution and visualization of soft tissues, the X-ray phase contrast imaging technique was developed. The objective of this work was to perform histomorphometric quantification of human cancellous bone using 3D synchrotron X-ray computed microtomography with two distinct techniques, transmission and phase contrast, in order to compare the results and evaluate the viability of applying the same quantification methodology to both techniques. All experiments were performed at the ELETTRA Synchrotron Light Laboratory in Trieste (Italy). MicroCT data sets were collected using the CT set-up on the SYRMEP (Synchrotron Radiation for Medical Physics) beamline. Results showed a better correlation between the histomorphometric parameters of the two techniques when morphological filters were used. However, with these filters some important information given by phase contrast is lost, and this should be explored with new quantification techniques
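
    A basic 3D histomorphometric index such as the bone volume fraction (BV/TV) can be computed directly from a binarized microCT volume, with a morphological filter applied first, since the abstract notes that such filters improve the agreement between techniques. The sketch below is illustrative only; the synthetic volume, voxel size and filter choice are assumptions, not the study's processing chain.

    ```python
    import numpy as np
    from scipy import ndimage

    # Synthetic binary volume standing in for a segmented microCT scan (True = bone voxel)
    rng = np.random.default_rng(1)
    volume = rng.random((64, 64, 64)) > 0.7

    # Simple morphological opening to suppress isolated noise voxels
    filtered = ndimage.binary_opening(volume, structure=np.ones((3, 3, 3)))

    bv_tv = filtered.sum() / filtered.size               # bone volume / total volume
    surface = ndimage.binary_dilation(filtered) ^ filtered  # crude surface-voxel estimate
    voxel_mm = 0.014                                       # illustrative voxel size (mm)
    print(f"BV/TV = {bv_tv:.3f}, surface voxels = {surface.sum()}, voxel = {voxel_mm} mm")
    ```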

  5. Costs without benefits? Methodological issues in assessing costs, benefits and effectiveness of water protection policies. Paper

    Energy Technology Data Exchange (ETDEWEB)

    Walz, R.; Schleich, J.

    2000-07-01

    In the last few years, the conditions for extending environmental policy in general, and policy dealing with the prevention of water pollution in particular, have undergone extensive changes. On the one hand, there has been considerable, indisputable success in preventing water pollution, which has reduced the direct pressure for policy action. On the other hand, rising sewage levies and the lower political priority generally assigned to environmental policy (documented, e.g., in public opinion surveys) have confronted water pollution control policy with very different pressures of justification: more efficient use of funds, improved planning processes, proof of the achievable benefit, but also stopping the increase in levies or not hindering economic development; these or similar slogans are the objections brought against water pollution control. However unambiguous these terms appear when used as slogans in this way, they become diffuse and unclear when examined more closely. This paper therefore attempts, on the one hand, to reveal the reasons for possible misunderstandings and misinterpretations and, on the other, to set out the basic problems and uncertainties that are necessarily linked with an assessment of costs and benefits. To do this, three areas are examined: the level of actors and analysis, evaluation methods, and the assessment of costs and benefits. (orig.)

  6. Development of a new methodology for quantifying nuclear safety culture

    International Nuclear Information System (INIS)

    Han, Kiyoon; Jae, Moosung

    2017-01-01

    The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. The SCIAM uses a safety culture impact index (SCII) to monitor the status of safety culture of NPPs periodically, and it uses the relative core damage frequency (RCDF) to present the impact of safety culture on the safety of NPPs. As a result of applying the SCIAM to the reference plant (Kori 3), a standard for the healthy safety culture of the reference plant is suggested. SCIAM might contribute to improving the safety of NPPs (nuclear power plants) by monitoring the status of safety culture periodically and presenting a standard of healthy safety culture.

  7. Development of a new methodology for quantifying nuclear safety culture

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of). Dept. of Nuclear Engineering

    2017-01-15

    The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. The SCIAM uses a safety culture impact index (SCII) to monitor the status of safety culture of NPPs periodically, and it uses the relative core damage frequency (RCDF) to present the impact of safety culture on the safety of NPPs. As a result of applying the SCIAM to the reference plant (Kori 3), a standard for the healthy safety culture of the reference plant is suggested. SCIAM might contribute to improving the safety of NPPs (nuclear power plants) by monitoring the status of safety culture periodically and presenting a standard of healthy safety culture.

  8. Sulfonylurea herbicides – methodological challenges in setting aquatic limit values

    DEFF Research Database (Denmark)

    Rosenkrantz, Rikke Tjørnhøj; Baun, Anders; Kusk, Kresten Ole

    according to the EU Water Framework Directive, the resulting Water Quality Standards (WQSs) are below the analytical quantification limit, making it difficult to verify compliance with the limit values. However, several methodological concerns may be raised in relation to the very low effect concentrations...... and rimsulfuron. The following parameters were varied during testing: pH, exposure duration, temperature and light/dark cycle. Preliminary results show that a decrease in pH causes an increase in toxicity for all compounds. Exposure to a high concentration for 24 hours caused a reduction in growth rate, from...... for setting limit values for SUs or if more detailed information should be gained by taking methodological considerations into account....

  9. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that combine to produce an error in human tasks, both those performed under normal operating conditions and those performed after an abnormal event. Additionally, the analysis of various accidents in history has found the human component to be a contributing factor to their causes. The need to understand the forms and probability of human error led, beginning in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed to include additional performance shaping factors, and the interactions between them, in the models, so that by the mid-1990s what are considered the second-generation methodologies had emerged. Among these is A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required an independent evaluation of the two related human failure events. The derivation of the new human error probabilities therefore involves quantifying the nominal scenario and the cases of significant deviations selected for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences identified the more specific factors with the highest contribution to the human error probabilities. (Author)

  10. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    Science.gov (United States)

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits given by the national authority of the waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful to generate random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach for the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
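
    The abstract describes building a multivariate random vector of uncertain inputs and propagating it to the activity estimates. The sketch below shows a generic Monte Carlo propagation of that kind; the input distributions, parameter names and units are assumptions chosen for illustration and do not reproduce CERN's actual characterization model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Illustrative uncertain inputs for one nuclide's activity estimate:
    measured_dose_rate = rng.normal(1.0, 0.05, n)                          # relative measurement, 5% std
    scaling_factor = rng.lognormal(mean=np.log(0.2), sigma=0.4, size=n)    # hard-to-measure to key nuclide ratio
    trace_fraction = rng.triangular(0.001, 0.005, 0.02, n)                 # uncertain trace-element content

    activity = measured_dose_rate * scaling_factor * trace_fraction        # arbitrary units

    lo, med, hi = np.percentile(activity, [2.5, 50, 97.5])
    print(f"median = {med:.2e}, 95% interval = [{lo:.2e}, {hi:.2e}]")
    ```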

  11. Utility of radiotracer methodology in scientific research of industrial relevancy

    International Nuclear Information System (INIS)

    Kolar, Z.I.

    1990-01-01

    Utilization of radiotracer methodology in industrial research provides substantial scientific, rather than directly demonstrable economic, benefits. These benefits include a better understanding of industrial processes and, subsequently, the development of new ones. Examples are given of the use of radiotracers in technological studies, and the significance of the results obtained is set out. Creative application of radiotracer methodology may contribute to the economic development and technological advancement of all countries, including developing ones. (orig.) [de

  12. Ensuring VGI Credibility in Urban-Community Data Generation: A Methodological Research Design

    Directory of Open Access Journals (Sweden)

    Jamie O'Brien

    2016-06-01

    Full Text Available In this paper we outline the methodological development of current research into urban community formations based on combinations of qualitative (volunteered) and quantitative (spatial analytical and geo-statistical) data. We outline a research design that addresses problems of data quality relating to credibility in volunteered geographic information (VGI) intended for Web-enabled participatory planning. Here we have drawn on a dual notion of credibility in VGI data and propose a methodological workflow to address its criteria. We propose a ‘super-positional’ model of urban community formations, and report on the combination of quantitative and participatory methods employed to underpin its integration. The objective of this methodological phase of study is to enhance confidence in the quality of data for Web-enabled participatory planning. Our participatory method has been supported by rigorous quantification of area characteristics, including participant communities’ demographic and socio-economic contexts. This participatory method provided participants with a ready and accessible format for observing and mark-making, which allowed the investigators to rapidly iterate a system design based on participants’ responses to the workshop tasks. Participatory workshops have involved secondary school-age children in socio-economically contrasting areas of Liverpool (Merseyside, UK), which offers a test-bed for comparing communities’ formations in comparative contexts, while bringing an under-represented section of the population into a planning domain, whose experience may stem from public and non-motorised transport modalities. Data has been gathered through one-day participatory workshops, featuring questionnaire surveys, local site analysis, perception mapping and brief, textual descriptions. This innovative approach will support Web-based participation among stakeholding planners, who may benefit from well-structured, community

  13. WE-AB-204-05: Harmonizing PET/CT Quantification in Multicenter Studies: A Case Study

    International Nuclear Information System (INIS)

    Marques da Silva, A; Fischer, A

    2015-01-01

    Purpose: To present the implementation of a strategy to harmonize FDG PET/CT quantification (SUV) performed with different scanner models and manufacturers. Methods: The strategy was based on Boellaard (2011) and the EARL FDG-PET/CT accreditation program, which propose quality control measurements for harmonizing scanner performance. A NEMA IEC Body phantom study was performed using four different devices: PHP-1 (Gemini TF Base, Philips); PHP-2 (Gemini GXL, Philips); GEH (Discovery 600, General Electric); SMS (Biograph Hi-Rez 16, Siemens). The SUV Recovery Coefficient (RC) was calculated using the clinical protocol and other clinically relevant reconstruction parameters. The most appropriate reconstruction parameters (MARP) for SUV harmonization, in each scanner, are those which achieve the EARL harmonizing standards. They were identified using the lowest root mean square errors (RMSE). To evaluate the strategy’s effectiveness, the Maximum Differences (MD) between the clinical and MARP RC values were calculated. Results: The reconstruction parameters that obtained the lowest RMSE are: FBP 5mm (PHP-1); LOR-RAMLA 2i0.008l (PHP-2); VuePointHD 2i32s10mm (GEH); and FORE+OSEM 4i8s6mm (SMS). Thus, to ensure that quantitative PET image measurements are interchangeable between these sites, images must be reconstructed with the above-mentioned parameters. However, a decoupling between the best image for PET/CT qualitative analysis and the best image for quantification studies was observed. The MD showed that the strategy was effective in reducing the variability of SUV quantification for small structures (<17mm). Conclusion: The harmonization strategy of the SUV quantification implemented with these devices was effective in reducing the variability of small structures quantification, minimizing the inter-scanner and inter-institution differences in quantification. However, it is essential that, in addition to the harmonization of quantification, the standardization of the
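
    The parameter-selection step described above can be illustrated as computing sphere recovery coefficients for each candidate reconstruction and picking the one with the lowest RMSE against the harmonization targets. In the sketch below, both the candidate RC values and the target values are invented stand-ins, not EARL specifications or the study's measurements.

    ```python
    import numpy as np

    # Measured max-SUV recovery coefficients for the six NEMA IEC spheres (illustrative)
    candidates = {
        "OSEM_2i_32s_10mm": [0.32, 0.55, 0.72, 0.85, 0.93, 0.97],
        "OSEM_4i_8s_6mm":   [0.40, 0.65, 0.83, 0.98, 1.08, 1.12],
    }
    # Target RCs per sphere (stand-ins for the harmonization specification)
    target = np.array([0.38, 0.61, 0.80, 0.92, 1.00, 1.03])

    def rmse(measured, target):
        m = np.asarray(measured)
        return np.sqrt(np.mean((m - target) ** 2))

    for name, rc in candidates.items():
        print(f"{name}: RMSE = {rmse(rc, target):.3f}")
    best = min(candidates, key=lambda k: rmse(candidates[k], target))
    print("selected:", best)
    ```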

  14. QUANTIFICATION OF ANGIOGENESIS IN THE CHICKEN CHORIOALLANTOIC MEMBRANE (CAM)

    Directory of Open Access Journals (Sweden)

    Silvia Blacher

    2011-05-01

    Full Text Available The chick chorioallantoic membrane (CAM) provides a suitable in vivo model to study angiogenesis and to evaluate several pro- and anti-angiogenic factors and compounds. In the present work, new developments in image analysis are used to quantify the CAM angiogenic response from optical microscopic observations, covering all vascular components, from the large supplying and feeding vessels down to the capillary plexus. To validate our methodology, angiogenesis is quantified during two phases of CAM development (days 7 and 13) and after treatment with an anti-angiogenic modulator. Our morphometric analysis emphasizes that an accurate quantification of the CAM vasculature needs to be performed at various scales.

  15. Quantification In Neurology

    Directory of Open Access Journals (Sweden)

    Netravati M

    2005-01-01

    Full Text Available There has been a distinct shift of emphasis in clinical neurology over the last few decades. A few years ago, it was sufficient for a clinician to precisely record the history, document signs, establish a diagnosis and write a prescription. In the present context, there has been a significant intrusion of scientific culture into clinical practice. Several criteria have been proposed, refined and redefined to ascertain accurate diagnosis for many neurological disorders. The introduction of the concepts of impairment, disability, handicap and quality of life has added a new dimension to the measurement of health and disease, and neurological disorders are no exception. "Best guess" treatment modalities are no longer accepted, and evidence-based medicine has become an integral component of medical care. Traditional treatments need validation and new therapies require rigorous trials. Thus, proper quantification has become essential in both neurological practice and research methodology. While this aspect is widely acknowledged, access to a comprehensive document on measurements in neurology is limited. The following description is a critical appraisal of various measurements and also provides certain commonly used rating scales/scores in neurological practice.

  16. Accelerating time to benefit

    DEFF Research Database (Denmark)

    Svejvig, Per; Geraldi, Joana; Grex, Sara

    Despite the ubiquitous pressure for speed, our approaches to accelerate projects remain constrained to the old-fashioned understanding of the project as a vehicle to deliver products and services, not value. This article explores an attempt to accelerate time to benefit. We describe and deconstruct...... of the time. Although all cases valued speed and speed to benefit, and implemented most practices proposed by the methodology, only three of the five projects were more successful in decreasing time to speed. Based on a multi-case study comparison between these five different projects and their respective...

  17. Uncertainty of a detected spatial cluster in 1D: quantification and visualization

    KAUST Repository

    Lee, Junho; Gangnon, Ronald E.; Zhu, Jun; Liang, Jingjing

    2017-01-01

    Spatial cluster detection is an important problem in a variety of scientific disciplines such as environmental sciences, epidemiology and sociology. However, there appears to be very limited statistical methodology for quantifying the uncertainty of a detected cluster. In this paper, we develop a new method for the quantification and visualization of uncertainty associated with a detected cluster. Our approach is to define a confidence set for the true cluster and to visualize it, based on the maximum likelihood, in time or in one-dimensional space. We evaluate the pivotal property of the statistic used to construct the confidence set and the coverage rate for the true cluster via empirical distributions. For illustration, our methodology is applied to both simulated data and an Alaska boreal forest dataset. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Uncertainty of a detected spatial cluster in 1D: quantification and visualization

    KAUST Repository

    Lee, Junho

    2017-10-19

    Spatial cluster detection is an important problem in a variety of scientific disciplines such as environmental sciences, epidemiology and sociology. However, there appears to be very limited statistical methodology for quantifying the uncertainty of a detected cluster. In this paper, we develop a new method for the quantification and visualization of uncertainty associated with a detected cluster. Our approach is to define a confidence set for the true cluster and to visualize it, based on the maximum likelihood, in time or in one-dimensional space. We evaluate the pivotal property of the statistic used to construct the confidence set and the coverage rate for the true cluster via empirical distributions. For illustration, our methodology is applied to both simulated data and an Alaska boreal forest dataset. Copyright © 2017 John Wiley & Sons, Ltd.

  19. Climate and desertification: indicators for an assessment methodology

    International Nuclear Information System (INIS)

    Sciortino, M.; Caiaffa, E.; Fattoruso, G.; Donolo, R.; Salvetti, G.

    2009-01-01

    This work aims to define a methodology that, on the basis of commonly available surface climate records, assesses indicators of the increase or decrease in the extent of territories vulnerable to desertification and land degradation. The definition and quantification of policy-relevant environmental indicators aim to improve understanding and decision-making processes in drylands. The results of this study show that since 1931 climate changes have involved 90% of the territory of the Sicilian region, with stronger intensity in the internal areas of the Enna, Caltanissetta and Palermo provinces. (Author) 9 refs.

  20. An EPGPT-based approach for uncertainty quantification

    International Nuclear Information System (INIS)

    Wang, C.; Abdel-Khalik, H. S.

    2012-01-01

    Generalized Perturbation Theory (GPT) has been widely used by many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT theory, collectively referred to as Exact-to-Precision Generalized Perturbation Theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameter distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest. It tries to capture, via the adjoint solution, the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of what are referred to as the 'active' responses, which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply the EPGPT methodology to propagate cross-section variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameter uncertainties is not valid and when nonlinear behavior must be considered. To allow this demonstration, exaggerated variations will be employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)

  1. The assessment of the costs and benefits of regulatory decision making

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-06-01

    This study outlines the framework within which cost-benefit analyses of regulation may be undertaken. The general framework is consistent for any cost-benefit analysis. The particular needs or individual structure of the industry to which the regulation is targeted and the particular nature of the regulation will affect the methodologies chosen to execute specific steps within that framework. The discussion also includes insight into the approach to cost-benefit analysis used in other jurisdictions, specifically the U.S. Nuclear Regulatory Commission, the Health and Safety Executive, Nuclear Safety Division in the United Kingdom, Transport Canada and Environment Canada. Various methodologies, and their relative strengths and weaknesses in the context of regulation in the nuclear industry, are outlined in the discussions of each phase of the cost-benefit framework. Those individual methodologies and approaches in other jurisdictions that are best suited to the assessment of regulations administered by the Atomic Energy Control Board are incorporated into a proposed framework. 44 refs., 1 tab., 5 figs.

  2. The assessment of the costs and benefits of regulatory decision making

    International Nuclear Information System (INIS)

    1995-06-01

    This study outlines the framework within which cost-benefit analyses of regulation may be undertaken. The general framework is consistent for any cost-benefit analysis. The particular needs or individual structure of the industry to which the regulation is targeted and the particular nature of the regulation will affect the methodologies chosen to execute specific steps within that framework. The discussion also includes insight into the approach to cost-benefit analysis used in other jurisdictions, specifically the U.S. Nuclear Regulatory Commission, the Health and Safety Executive, Nuclear Safety Division in the United Kingdom, Transport Canada and Environment Canada. Various methodologies, and their relative strengths and weaknesses in the context of regulation in the nuclear industry, are outlined in the discussions of each phase of the cost-benefit framework. Those individual methodologies and approaches in other jurisdictions that are best suited to the assessment of regulations administered by the Atomic Energy Control Board are incorporated into a proposed framework. 44 refs., 1 tab., 5 figs

  3. Targeting plug-in hybrid electric vehicle policies to increase social benefits

    International Nuclear Information System (INIS)

    Skerlos, Steven J.; Winebrake, James J.

    2010-01-01

    In 2009 the U.S. federal government enacted tax credits aimed at encouraging consumers to purchase plug-in hybrid electric vehicles (PHEVs). These tax credits are available to all consumers equally and therefore do not account for the variability in social benefits associated with PHEV operation in different parts of the country. The tax credits also do not consider variability in consumer income. This paper discusses why the PHEV subsidy policy would have higher social benefits at equal or less cost if the tax credits were offered at different levels depending on consumer income and the location of purchase. Quantification of these higher social benefits and related policy proposals are left for future work.

  4. Reference Materials for Calibration of Analytical Biases in Quantification of DNA Methylation.

    Science.gov (United States)

    Yu, Hannah; Hahn, Yoonsoo; Yang, Inchul

    2015-01-01

    Most contemporary methods for the quantification of DNA methylation employ bisulfite conversion and PCR amplification. However, many reports have indicated that bisulfite-mediated PCR methodologies can result in inaccurate measurements of DNA methylation owing to amplification biases. To calibrate analytical biases in quantification of gene methylation, especially those that arise during PCR, we utilized reference materials that represent exact bisulfite-converted sequences with 0% and 100% methylation status of specific genes. After determining relative quantities using qPCR, pairs of plasmids were gravimetrically mixed to generate working standards with predefined DNA methylation levels at 10% intervals in terms of mole fractions. The working standards were used as controls to optimize the experimental conditions and also as calibration standards in melting-based and sequencing-based analyses of DNA methylation. Use of the reference materials enabled precise characterization and proper calibration of various biases during PCR and subsequent methylation measurement processes, resulting in accurate measurements.
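
    One way such reference materials can be used is to fit a calibration curve that maps apparent (biased) methylation readings back to the nominal mole fractions of the gravimetric standards. The sketch below assumes illustrative apparent values and a cubic fit; it is a generic correction scheme, not the specific calibration reported in the paper.

    ```python
    import numpy as np

    # Nominal methylation of the mixed working standards (mole fraction, %)
    nominal = np.array([0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100], dtype=float)
    # Apparent methylation measured for each standard (illustrative, showing PCR bias)
    measured = np.array([0, 6, 14, 22, 32, 43, 54, 66, 77, 89, 100], dtype=float)

    # Fit a low-order polynomial mapping measured -> nominal as a calibration curve
    coeffs = np.polyfit(measured, nominal, deg=3)

    def calibrate(apparent_percent):
        """Correct an apparent methylation level using the reference-material curve."""
        return float(np.polyval(coeffs, apparent_percent))

    print(f"apparent 35% -> calibrated {calibrate(35.0):.1f}%")
    ```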

  5. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

    Proliferation resistance (PR) is one of the requirements to be met in GEN IV and INPRO for next-generation nuclear energy systems. Internationally, work on PR evaluation methodology had already begun around 1980, but systematic development started in the 2000s. In Korea, to support the export of nuclear energy systems and to increase the international credibility and transparency of the domestic nuclear system and fuel cycle development, independent development of a PR evaluation methodology was started in 2007 as a long-term nuclear R and D project, and a PR evaluation model is being developed. In the first year, a comparative study of GEN-IV/INPRO, PR indicator development, indicator quantification, evaluation model development, and an analysis of the technology system and of international technology development trends were performed. In the second year, a feasibility study of the indicators, their allowable limits, and a review of their technical requirements were carried out. The results of the PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Through the development of the PR evaluation methodology, the methodology is expected to be applied in the regulatory requirements for authorization and licensing that are to be developed

  6. The affect heuristic in judgments of risks and benefits

    International Nuclear Information System (INIS)

    Finucane, M.; Slovic, P.; Johnson, S.M.; Alhakami, A.

    1998-01-01

    The role of affect in judgment of risks and benefits is examined in two studies. Despite using different methodologies, the two studies suggest that risk and benefit are somehow linked in people's perception, consequently influencing their judgments. Short paper

  7. Methodologies of health impact assessment as part of an integrated approach to reduce effects of air pollution

    OpenAIRE

    Aunan, Kristin; Seip, Hans Martin

    1995-01-01

    Quantification of average frequencies of health effects on a population level is an essential part of an integrated assessment of pollution effects. Epidemiological studies seem to provide the best basis for such estimates. This paper gives an introduction to a methodology for health impact assessment. It also gives results from some selected parts of a case-study in Hungary. This study is aimed at testing and improving the methodology for integrated assessment and focuses on energy productio...

  8. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Lines, Amanda M. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Adami, Susan R. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Sinkov, Sergey I. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Lumetta, Gregg J. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Bryan, Samuel A. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States

    2017-08-09

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium (IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single variate analysis (i.e. Beer’s Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of nitric acid concentration.

  9. Issues connected with indirect cost quantification: a focus on the transportation system

    Science.gov (United States)

    Křivánková, Zuzana; Bíl, Michal; Kubeček, Jan; Vodák, Rostislav

    2017-04-01

    Transportation and communication networks in general are vital parts of modern society. The economy relies heavily on transportation system performance. Many people commute to work regularly. Stockpiles in many companies are being reduced as just-in-time production relies on the transportation network to supply resources on time. Natural hazards have the potential to disturb transportation systems. Earthquakes, flooding or landslides are examples of high-energy processes capable of causing direct losses (i.e. physical damage to infrastructure). We have focused on the quantification of the indirect costs of natural hazards, which are not easy to estimate. Indirect losses can also emerge from low-energy meteorological hazards that only seldom cause direct losses, e.g. glaze ice or snowfall. Whereas evidence of repair work and of direct costs in general usually exists or can be estimated, indirect costs are much more difficult to identify, particularly when they are not covered by insurance agencies. Designating alternative routes (detours) is the most frequent response to blocked road links. Indirect costs can then be related to increased fuel consumption and additional operating costs. Detours usually result in prolonged travel times, so the quantification of indirect costs also has to cover the value of time. The costs of the delay are, however, a nonlinear function of travel time. The existence of an alternative transportation pattern may also result in an increased number of traffic crashes. This topic has not been studied in depth, but an increase in traffic crashes has been reported when people suddenly changed their traffic modes, e.g. when air traffic was not possible. The lost user benefit from those trips that were cancelled or suppressed is also difficult to quantify. Several approaches, based on post-event questionnaire surveys, have been applied to communities and companies affected by transportation accessibility
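
    A first-order version of the detour-cost reasoning above multiplies the extra distance and extra time per trip by unit operating costs and a value of travel time. The sketch below uses invented unit values; a fuller assessment would, as the abstract notes, treat delay costs as a nonlinear function of travel time and add crash and lost-trip components.

    ```python
    def detour_cost(daily_traffic, extra_km, extra_minutes, closure_days,
                    operating_cost_per_km=0.35, value_of_time_per_hour=12.0):
        """Indirect cost of re-routing traffic around a blocked link (monetary units).

        daily_traffic  : vehicles per day using the detour
        extra_km       : additional distance per trip on the alternative route
        extra_minutes  : additional travel time per trip
        closure_days   : duration of the closure
        """
        per_trip = (extra_km * operating_cost_per_km
                    + (extra_minutes / 60.0) * value_of_time_per_hour)
        return daily_traffic * per_trip * closure_days

    # Illustrative: 4000 vehicles/day, 12 extra km and 15 extra minutes, 30-day closure
    print(f"{detour_cost(4000, 12, 15, 30):,.0f}")
    ```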

  10. The affect heuristic in judgments of risks and benefits

    Energy Technology Data Exchange (ETDEWEB)

    Finucane, M.; Slovic, P.; Johnson, S.M. [Decision Research, 1201 Oak St, Eugene, Oregon (United States); Alhakami, A. [Imam Muhammad Ibn Saud Islamic University Psychology Dept. (Saudi Arabia)

    1998-07-01

    The role of affect in judgment of risks and benefits is examined in two studies. Despite using different methodologies, the two studies suggest that risk and benefit are somehow linked in people's perception, consequently influencing their judgments. Short paper.

  11. Guidebook in using Cost Benefit Analysis and strategic environmental assessment for environmental planning in China

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Environmental planning in China may benefit from greater use of Cost Benefit Analysis (CBA) and Strategic Environmental Assessment (SEA) methodologies. We provide guidance on using these methodologies. Parts I and II present the principles behind the methodologies as well as their theoretical structure. Part III demonstrates the methodologies in action in a range of good-practice examples. The case studies and theoretical expositions are intended to teach by way of example as well as through understanding of the principles, and to help planners use the methodologies as correctly as possible. (auth)

  12. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; quantification-biasing phenomena; 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods and results (attenuation, scattering, partial volume effect, movement, non-stationary spatial resolution in SPECT, random coincidences in PET, standardisation in PET); 3 - Synthesis: accessible efficiency, know-how, precautions, beyond the activity measurement

  13. Accident sequence quantification with KIRAP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong

    1997-01-01

    The tasks of a probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and includes importance analysis and uncertainty analysis. Accident sequence quantification requires an understanding of the whole PSA model, because it has to combine all event tree and fault tree models, and it requires a capable computer code because of the long computation times involved. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the method for using KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs.

  14. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of a probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and includes importance analysis and uncertainty analysis. Accident sequence quantification requires an understanding of the whole PSA model, because it has to combine all event tree and fault tree models, and it requires a capable computer code because of the long computation times involved. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the method for using KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs
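
    The quantification step described above ultimately reduces to combining minimal cut sets into a sequence frequency. The sketch below shows that arithmetic using the standard min-cut upper bound under an independence assumption; the event names, probabilities and initiating-event frequency are hypothetical and are not KIRAP inputs or outputs.

    ```python
    from math import prod

    def cutset_probability(cut_set, basic_event_probs):
        """Probability of one minimal cut set (product of its basic-event
        probabilities, assuming independence)."""
        return prod(basic_event_probs[e] for e in cut_set)

    def sequence_frequency(cut_sets, basic_event_probs, initiating_event_freq):
        """Accident sequence frequency from minimal cut sets.

        Uses the min-cut upper bound 1 - prod(1 - P(cut_set_i)), a standard
        approximation when cut-set probabilities are small.
        """
        q = [cutset_probability(cs, basic_event_probs) for cs in cut_sets]
        p_top = 1.0 - prod(1.0 - qi for qi in q)
        return initiating_event_freq * p_top

    # Illustrative data only (event names and probabilities are hypothetical)
    probs = {"PUMP_A_FAILS": 3e-3, "PUMP_B_FAILS": 3e-3, "VALVE_CC": 1e-4, "OPERATOR_ERROR": 1e-2}
    cut_sets = [{"PUMP_A_FAILS", "PUMP_B_FAILS"}, {"VALVE_CC"}, {"PUMP_A_FAILS", "OPERATOR_ERROR"}]
    print(f"{sequence_frequency(cut_sets, probs, initiating_event_freq=1e-2):.3e} per year")
    ```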

  15. Human error probability quantification using fuzzy methodology in nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, Claudio Souza do

    2010-01-01

    This work obtains Human Error Probability (HEP) estimates for operator actions in response to postulated emergency situations at the IEA-R1 Research Reactor at IPEN. An evaluation of Performance Shaping Factors (PSFs) was also carried out, in order to classify them according to their level of influence on the operators' actions and to determine the actual states of these PSFs in the plant. Both the HEP estimation and the PSF evaluation were based on expert judgment elicited through interviews and questionnaires. The group of specialists was composed of selected IEA-R1 operators. The representation of the specialists' knowledge in linguistic variables and the aggregation of the group evaluations were obtained through Fuzzy Logic and Fuzzy Set Theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative for use in Human Reliability Analysis (HRA). (author)

  16. Comparison of Greenhouse Gas Offset Quantification Protocols for Nitrogen Management in Dryland Wheat Cropping Systems of the Pacific Northwest

    Directory of Open Access Journals (Sweden)

    Tabitha T. Brown

    2017-11-01

    Full Text Available In the carbon market, greenhouse gas (GHG) offset protocols need to ensure that emission reductions are of high quality, quantifiable, and real. Lack of consistency across protocols for quantifying emission reductions compromises the credibility of the offsets generated. Thus, protocol quantification methodologies need to be periodically reviewed to ensure emission offsets are credited accurately and updated to support practical climate policy solutions. Current GHG emission offset credits generated by agricultural nitrogen (N) management activities are based on reducing the annual N fertilizer application rate for a given crop without reducing yield. We performed a “road test” of agricultural N management protocols to evaluate differences among protocol components and quantify nitrous oxide (N2O) emission reductions under sample projects relevant to N management in dryland, wheat-based cropping systems of the inland Pacific Northwest (iPNW). We evaluated five agricultural N management offset protocols applicable to North America: two methodologies of the American Carbon Registry (ACR1 and ACR2), the Verified Carbon Standard (VCS), the Climate Action Reserve (CAR), and the Alberta Offset Credit System (Alberta). We found that only two protocols, ACR2 and VCS, were suitable for this study, in which four sample projects were developed representing feasible N fertilizer rate reduction activities. The ACR2 and VCS protocols had identical baseline and project emission quantification methodologies, resulting in identical emission reduction values. Reducing the N fertilizer application rate by switching to variable rate N (sample projects 1–3) or split N application (sample project 4) management resulted in N2O emission reductions ranging from 0.07 to 0.16, and 0.26 Mg CO2e ha−1, respectively. Across the range of C prices considered ($5, $10, and $50 per metric ton of CO2 equivalent), we concluded that the N2O emission offset payment alone ($0.35–$13.0 ha−1) was unlikely to
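
    The per-hectare offset payment range quoted above follows from a single multiplication of emission reduction by carbon price; the sketch below reproduces that arithmetic under the stated assumptions (0.07-0.26 Mg CO2e ha-1 and $5-$50 per Mg CO2e).

```python
# Reproducing the per-hectare offset payment range quoted above:
# payment ($/ha) = emission reduction (Mg CO2e/ha) x carbon price ($/Mg CO2e)

reductions = [0.07, 0.16, 0.26]   # Mg CO2e per ha (sample projects 1-4)
prices = [5, 10, 50]              # $ per Mg CO2e

payments = [r * p for r in reductions for p in prices]
print(f"Offset payment range: ${min(payments):.2f}-${max(payments):.2f} per ha")
# -> roughly $0.35 to $13.00 per ha, matching the $0.35-$13.0 ha-1 figure above
```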

  17. Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility

    International Nuclear Information System (INIS)

    Elicson, Tom; Harwood, Bentley; Yorg, Richard; Lucek, Heather; Bouchard, Jim; Jukkola, Ray; Phan, Duan

    2011-01-01

    The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided, with particular focus on the following: Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project, using screening values and simplified modeling if necessary. Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in two phases, with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less focus on scoping fire modeling. This was the approach taken for the fire PRA. Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario. Therefore, dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges, which will be discussed in the full paper.

  18. Quantification of protein thiols and dithiols in the picomolar range using sodium borohydride and 4,4'-dithiodipyridine

    DEFF Research Database (Denmark)

    Hansen, Rosa E; Østergaard, Henrik; Nørgaard, Per

    2007-01-01

    Experimental determination of the number of thiols in a protein requires methodology that combines high sensitivity and reproducibility with low intrinsic thiol oxidation disposition. In detection of disulfide bonds, it is also necessary to efficiently reduce disulfides and to quantify...... the liberated thiols. Ellman's reagent (5,5'-dithiobis-[2-nitrobenzoic acid], DTNB) is the most widely used reagent for quantification of protein thiols, whereas dithiothreitol (DTT) is commonly used for disulfide reduction. DTNB suffers from a relatively low sensitivity, whereas DTT reduction is inconvenient...... sodium borohydride and the thiol reagent 4,4'-dithiodipyridine (4-DPS). Because borohydride is efficiently destroyed by the addition of acid, the complete reduction and quantification can be performed conveniently in one tube without desalting steps. Furthermore, the use of reverse-phase high...

  19. Development of a simplified methodology for the isotopic determination of fuel spent in Light Water Reactors

    International Nuclear Information System (INIS)

    Hernandez N, H.; Francois L, J.L.

    2005-01-01

    The present work presents a simplified methodology to quantify the isotopic content of the spent fuel of light water reactors; its application is specific to the Laguna Verde nuclear power plant, by means of an 18-month equilibrium cycle. The methodology is divided in two parts: the first consists of the development of a simplified cell model for the isotopic quantification of the irradiated fuel. With this model, a burnup of 48,000 MWD/TU of the fuel in the reactor core is simulated, taking as a basis a fuel assembly of the 10x10 type and using a two-dimensional simulator for a fuel cell of a light water reactor (CPM-3). The second part of the methodology is based on the creation of an isotopic decay model, through an algorithm in C++ (decay), to evaluate the amounts of the radionuclides remaining, by decay, after the fuel has been irradiated, up to the time at which reprocessing is carried out. Finally, the method used for the quantification of the kilograms of uranium and plutonium obtained from a normalized quantity (1,000 kg) of fuel irradiated in a reactor is presented. These results will later allow analyses of the final disposal of the irradiated fuel to be made. (Author)
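
    The C++ 'decay' algorithm itself is not reproduced in this record; the sketch below only illustrates the underlying relation, first-order radioactive decay N(t) = N0*exp(-ln2*t/T_half), applied to hypothetical nuclides and cooling times (the half-lives and inventories are invented, not taken from the study).

```python
import math

# Hypothetical illustration of post-irradiation decay of an inventory;
# half-lives in years, initial amounts in grams per tonne of fuel (invented).
inventory = {"Pu-241": (14.4, 1200.0), "Cs-137": (30.1, 1500.0)}

def decayed_amount(n0, half_life, t):
    """First-order decay: N(t) = N0 * exp(-ln2 * t / T_half)."""
    return n0 * math.exp(-math.log(2.0) * t / half_life)

for cooling_time in (5.0, 10.0, 30.0):  # years between discharge and reprocessing
    remaining = {nuc: decayed_amount(n0, t_half, cooling_time)
                 for nuc, (t_half, n0) in inventory.items()}
    print(cooling_time, {k: round(v, 1) for k, v in remaining.items()})
```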

  20. New approach for the quantification of processed animal proteins in feed using light microscopy.

    Science.gov (United States)

    Veys, P; Baeten, V

    2010-07-01

    A revision of the European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions of correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  1. Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.

    Science.gov (United States)

    1979-09-01

    This last volume, includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...

  2. Methodologies of health impact assessment as part of an integrated approach to reduce effects of air pollution

    Energy Technology Data Exchange (ETDEWEB)

    Aunan, K; Seip, H M

    1995-12-01

    Quantification of average frequencies of health effects on a population level is an essential part of an integrated assessment of pollution effects. Epidemiological studies seem to provide the best basis for such estimates. This paper gives an introduction to a methodology for health impact assessment and also the results from selected parts of a case study in Hungary. This case study is aimed at testing and improving the methodology for integrated assessment and focuses on energy production and consumption and the implications for air pollution. Using monitoring data from Budapest, the paper gives estimates of excess frequencies of respiratory illness, mortality and other health end-points. For a number of health end-points, particles may serve as a good indicator component. Stochastic simulation is used to illustrate the uncertainties embedded in the exposure-response functions applied. The paper uses the "bottom-up approach" to find cost-effective abatement strategies against pollution damages, where specific abatement measures such as emission standards for vehicles are explored in detail. It is concluded that, in spite of large uncertainties in every step of the analysis, an integrated assessment of the costs and benefits of different abatement measures is valuable, as it clarifies the main objectives of an abatement policy and explicitly describes the adverse impacts of different activities and their relative importance. 46 refs., 11 figs., 2 tabs.
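
    As a hedged illustration of the quantification step described (excess frequency of a health end-point from an exposure-response function, with stochastic simulation of its uncertainty), the sketch below uses invented numbers for the exposed population, the particle concentration and the exposure-response slope; it is not the Budapest case study.

```python
import random

# Invented illustration of excess-case estimation with an uncertain
# linear exposure-response coefficient (cases per person-year per ug/m3).
random.seed(1)

population = 2_000_000          # exposed population (hypothetical)
concentration = 45.0            # annual mean particle concentration, ug/m3 (hypothetical)
beta_mean, beta_sd = 4e-5, 1e-5 # exposure-response slope and its uncertainty (hypothetical)

samples = []
for _ in range(10_000):
    beta = max(0.0, random.gauss(beta_mean, beta_sd))
    samples.append(beta * concentration * population)  # excess cases per year

samples.sort()
lo, med, hi = samples[250], samples[5000], samples[9750]
print(f"Excess cases/yr: median ~ {med:.0f}, 95% interval ~ [{lo:.0f}, {hi:.0f}]")
```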

  3. Development of risk assessment methodology against natural external hazards for sodium-cooled fast reactors: project overview and strong Wind PRA methodology - 15031

    International Nuclear Information System (INIS)

    Yamano, H.; Nishino, H.; Kurisaka, K.; Okano, Y.; Sakai, T.; Yamamoto, T.; Ishizuka, Y.; Geshi, N.; Furukawa, R.; Nanayama, F.; Takata, T.; Azuma, E.

    2015-01-01

    This paper mainly describes the strong wind probabilistic risk assessment (PRA) methodology development, in addition to the project overview. In this project, to date, PRA methodologies against snow, tornado and strong wind were developed, as well as the hazard evaluation methodologies. For the volcanic eruption hazard, an ash fallout simulation was carried out to contribute to the development of the hazard evaluation methodology. For the forest fire hazard, the concept of the hazard evaluation methodology was developed based on fire simulation. An event sequence assessment methodology was also developed, based on plant dynamics analysis coupled with a continuous Markov chain Monte Carlo method, in order to apply it to the event sequence against snow. In developing the strong wind PRA methodology, hazard curves were estimated by using Weibull and Gumbel distributions based on weather data recorded in Japan. The obtained hazard curves were divided into five discrete categories for event tree quantification. Next, failure probabilities for decay heat removal related components were calculated as a product of two probabilities: i.e., the probability for the missiles to enter the intake or out-take in the decay heat removal system, and the fragility caused by the missile impacts. Finally, based on the event tree, the core damage frequency was estimated at about 6×10^-9 per year by multiplying the discrete hazard probabilities in the Gumbel distribution by the conditional decay heat removal failure probabilities. A dominant sequence was led by the assumption that the operators could not extinguish a fuel tank fire caused by the missile impacts and that the fire induced the loss of the decay heat removal system. (authors)
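
    The final multiplication described above (discrete hazard-category probabilities times conditional decay heat removal failure probabilities, summed over categories) can be written compactly; the sketch below uses invented hazard frequencies and fragilities, not the values of the study.

```python
# Invented illustration of the hazard-category x fragility summation used in
# external-hazard PRA: CDF = sum_i f_i * P(DHR failure | hazard category i).

hazard_freq = [1e-2, 1e-3, 1e-4, 1e-5, 1e-6]   # /yr, five discrete wind categories (invented)
cond_failure = [1e-7, 1e-6, 1e-5, 1e-4, 1e-3]  # conditional failure probabilities (invented)

cdf = sum(f * p for f, p in zip(hazard_freq, cond_failure))
print(f"Core damage frequency ~ {cdf:.1e} per year")
```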

  4. Cost/benefit analyses of reactor safety systems

    International Nuclear Information System (INIS)

    1988-01-01

    The study presents a methodology for quantitative assessment of the benefit yielded by the various engineered safety systems of a nuclear reactor containment, from the standpoint of their capacity to protect the environment compared to their construction costs. The benefit is derived from an estimate of the possible damage from which the environment is protected, taking account of the probabilities of occurrence of malfunctions and accidents. For demonstration purposes, the methodology was applied to a 1,300-MWe PWR nuclear power station. The accident sequence considered was that of a major loss-of-coolant accident, as investigated in detail in the German risk study. After determination of the benefits and the cost/benefit ratio for the power plant and the containment systems as designed, the performance characteristics of three subsystems, the leakoff system, the annulus exhaust air handling system and the spray system, were varied. For this purpose, the parameters which describe these systems in the activity release programme were altered. The costs were simultaneously altered in order to take account of the performance divergences. By varying the performance of the individual sub-systems, an optimized design of these systems can be arrived at.

  5. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems

  6. Outline of cost-benefit analysis and a case study

    Science.gov (United States)

    Kellizy, A.

    1978-01-01

    The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
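
    The computational core of such a cost-benefit comparison is a discounted cash-flow calculation; the sketch below (cash flows, discount rate and alternative names are hypothetical, not from the cited case study) compares two alternatives by net present value and benefit-cost ratio.

```python
# Hypothetical benefit-cost comparison of two alternatives via discounting.

def npv(cash_flows, rate):
    """Net present value of a list of (year, amount) cash flows."""
    return sum(amount / (1.0 + rate) ** year for year, amount in cash_flows)

rate = 0.07  # discount rate (assumed)

alternatives = {
    # name: (benefit cash flows, cost cash flows); all values invented
    "solar-cell option A": ([(t, 40.0) for t in range(1, 21)], [(0, 300.0)]),
    "solar-cell option B": ([(t, 55.0) for t in range(1, 21)], [(0, 450.0)]),
}

for name, (benefits, costs) in alternatives.items():
    b, c = npv(benefits, rate), npv(costs, rate)
    print(f"{name}: NPV = {b - c:8.1f}, benefit-cost ratio = {b / c:.2f}")
```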

  7. On uncertainty quantification in hydrogeology and hydrogeophysics

    Science.gov (United States)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.

  8. 42 CFR 493.649 - Methodology for determining fee amount.

    Science.gov (United States)

    2010-10-01

    ... fringe benefit costs to support the required number of State inspectors, management and direct support... full time equivalent employee. Included in this cost are salary and fringe benefit costs, necessary... 42 Public Health 5 2010-10-01 2010-10-01 false Methodology for determining fee amount. 493.649...

  9. Spanish methodological approach for biosphere assessment of radioactive waste disposal

    International Nuclear Information System (INIS)

    Agueero, A.; Pinedo, P.; Cancio, D.; Simon, I.; Moraleda, M.; Perez-Sanchez, D.; Trueba, C.

    2007-01-01

    The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS 'Reference Biospheres Methodology' and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates

  10. Spanish methodological approach for biosphere assessment of radioactive waste disposal.

    Science.gov (United States)

    Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C

    2007-10-01

    The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates.

  11. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris.

    Science.gov (United States)

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures may present a lack of reproducibility, which introduces significant errors in the quantification of intracellular components. In this work, an approach consisting in the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow the reliable quantification of a wide range of intracellular components. This innovative KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents an important widening of the cell wall, needing more severe methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. From the outcome of the reporting indicators, the cell disruption efficiency achieved using HPH was about fourfold higher than that of other standard lab-scale cell disruption methodologies, such as bead milling and cell permeabilization. This approach was also applied to a pilot-plant-scale HPH, validating the methodology in a scale-up of the disruption process. This innovative, non-complex approach developed to evaluate the efficacy of a disruption procedure or equipment can be easily applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest.

  12. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than...... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA...... standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use...

  13. Flood Protection Through Landscape Scale Ecosystem Restoration- Quantifying the Benefits

    Science.gov (United States)

    Pinero, E.

    2017-12-01

    Hurricane Harvey illustrated the risks associated with storm surges in coastal areas, especially during severe storms. One way to address storm surges is to utilize the natural ability of offshore coastal land to dampen their severity. In addition to helping reduce storm surge intensity and related damage, restoring the land will generate numerous co-benefits such as carbon sequestration and water quality improvement. The session will discuss the analytical methodology that helps define which species are the most resilient to take root, and how to calculate quantified benefits. It will also address the quantification and monetization of benefits to make the business case for restoration. In 2005, Hurricanes Katrina and Rita damaged levees along the Gulf of Mexico, leading to major forest degradation, habitat deterioration and reduced wildlife use. As a result, this area lost an extensive amount of land, with contiguous sections of wetlands being converted to open water. The Restore the Earth Foundation's North American Amazon project intends to restore one million acres of forests and forested wetlands in the lower Mississippi River Valley. The proposed area for the first phase of this project was once a historic bald cypress forested wetland, which was degraded due to increased salinity levels and extreme fluctuations in hydrology. The Terrebonne and Lafourche Parishes, the "bayou parishes", communities with a combined population of over 200,000, sit on thin fingers of land that are protected by surrounding swamps and wetlands, beyond which is the Gulf of Mexico. The Parishes depend on fishing, hunting, trapping, boat building, off-shore oil and gas production and support activities. Yet these communities are highly vulnerable to risks from natural hazards and future land loss. The ground is at or near sea level and therefore easily inundated by storm surges if not protected by wetlands. While some communities are protected by a levee system, the Terrebonne and

  14. Regional issue identification and assessment: study methodology. First annual report

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    The overall assessment methodologies and models utilized for the first project under the Regional Issue Identification and Assessment (RIIA) program are described. Detailed descriptions are given of the methodologies used by lead laboratories for the quantification of the impacts of an energy scenario on one or more media (e.g., air, water, land, human and ecology), and by all laboratories to assess the regional impacts on all media. The research and assessments reflected in this document were performed by the following national laboratories: Argonne National Laboratory; Brookhaven National Laboratory; Lawrence Berkeley Laboratory; Los Alamos Scientific Laboratory; Oak Ridge National Laboratory; and Pacific Northwest Laboratory. This report contains five chapters. Chapter 1 briefly describes the overall study methodology and introduces the technical participants. Chapter 2 is a summary of the energy policy scenario selected for the RIIA I study and Chapter 3 describes how this scenario was translated into a county-level siting pattern of energy development. The fourth chapter is a detailed description of the individual methodologies used to quantify the environmental and socioeconomic impacts of the scenario while Chapter 5 describes how these impacts were translated into comprehensive regional assessments for each Federal Region.

  15. Adaptive polynomial chaos techniques for uncertainty quantification of a gas cooled fast reactor transient

    International Nuclear Information System (INIS)

    Perko, Z.; Gilli, L.; Lathouwers, D.; Kloosterman, J. L.

    2013-01-01

    Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years however polynomial chaos expansion has become a popular alternative providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a Gas Cooled Fast Reactor transient. Comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is proved to be advantageous both from an accuracy and a computational point of view. As a demonstration the uncertainty quantification of a 50% loss of flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large scale problems. (authors)
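
    As a minimal, stand-alone illustration of non-intrusive polynomial chaos (unrelated to the CATHARE application above; the model function and truncation order are arbitrary choices), the sketch below projects a scalar output with one standard-normal input onto probabilists' Hermite polynomials by Gauss-Hermite quadrature and recovers the output mean and variance from the coefficients.

```python
import math

import numpy as np
from numpy.polynomial import hermite_e as He  # probabilists' Hermite polynomials

# Non-intrusive PCE of a scalar output y = f(xi) with xi ~ N(0, 1). The model
# f below stands in for an expensive simulation; order is an arbitrary truncation.
def model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi ** 2

order = 6
nodes, weights = He.hermegauss(order + 1)      # Gauss-Hermite_e quadrature rule
norm = math.sqrt(2.0 * math.pi)                # weights integrate against exp(-x^2/2)

coeffs = []
for n in range(order + 1):
    basis = He.hermeval(nodes, [0.0] * n + [1.0])          # He_n at the nodes
    cn = float(np.sum(weights * model(nodes) * basis)) / (norm * math.factorial(n))
    coeffs.append(cn)                                       # c_n = E[f * He_n] / n!

mean = coeffs[0]
variance = sum(c ** 2 * math.factorial(n) for n, c in enumerate(coeffs) if n > 0)
print(f"PCE mean ~ {mean:.4f}, PCE variance ~ {variance:.4f}")
```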

  16. An LWR design decision Methodology

    International Nuclear Information System (INIS)

    Leahy, T.J.; Rees, D.C.; Young, J.

    1982-01-01

    While all parties involved in nuclear plant regulation endeavor to make decisions which optimize the considerations of plant safety and financial impacts, these decisions are generally made without the benefit of a systematic and rigorous approach to the questions confronting the decision makers. A Design Decision Methodology has been developed which provides such a systematic approach. By employing this methodology, which makes use of currently accepted probabilistic risk assessment techniques and cost estimation, informed decisions may be made against a background of comparisons between the relative levels of safety and costs associated with various design alternatives

  17. Methods for modeling and quantification in functional imaging by positron emissions tomography and magnetic resonance imaging

    International Nuclear Information System (INIS)

    Costes, Nicolas

    2017-01-01

    This report presents experiences and researches in the field of in vivo medical imaging by positron emission tomography (PET) and magnetic resonance imaging (MRI). In particular, advances in terms of reconstruction, quantification and modeling in PET are described. The validation of processing and analysis methods is supported by the creation of data by simulation of the imaging process in PET. The recent advances of combined PET/MRI clinical cameras, allowing simultaneous acquisition of molecular/metabolic PET information, and functional/structural MRI information opens the door to unique methodological innovations, exploiting spatial alignment and simultaneity of the PET and MRI signals. It will lead to an increase in accuracy and sensitivity in the measurement of biological phenomena. In this context, the developed projects address new methodological issues related to quantification, and to the respective contributions of MRI or PET information for a reciprocal improvement of the signals of the two modalities. They open perspectives for combined analysis of the two imaging techniques, allowing optimal use of synchronous, anatomical, molecular and functional information for brain imaging. These innovative concepts, as well as data correction and analysis methods, will be easily translated into other areas of investigation using combined PET/MRI. (author) [fr

  18. Quantification and isotope ratio measurement of boron in U3Si2 by inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Saha, Abhijit; Deb, S.B.; Nagar, B.K.; Saxena, M.K.; Samanta, Papu

    2014-01-01

    An analytical methodology was developed for precise quantification and isotope ratio measurement of boron in a U3Si2 matrix by using ICP-MS after matrix separation. The analytical technique was validated by recovery studies employing the standard addition method, and the accuracy of the isotope ratio measurement was improved by correcting for the bias factor after analyzing NIST SRM 951. The quantification of B in the three U3Si2 samples was found to be in the range of 2.32-3.90 μg g-1, with a maximum standard deviation of 3%. The 10B/11B value in the three samples was found to be 0.2455±0.0042, 0.2451±0.0036 and 0.2452±0.0041. (author)
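
    The bias correction mentioned above is typically applied as a multiplicative factor derived from the certified reference material; the sketch below shows that arithmetic, where the SRM 951 value is approximate (check the certificate) and all measured ratios are invented.

```python
# Bias correction of a measured isotope ratio against a certified standard.
# The certified value below is approximate and the measured values are invented.

certified_srm951 = 0.2473          # approximate certified 10B/11B of NIST SRM 951
measured_srm951 = 0.2410           # hypothetical measured ratio of the standard

k = certified_srm951 / measured_srm951   # mass-bias correction factor

sample_measured = [0.2393, 0.2389, 0.2390]    # hypothetical sample measurements
sample_corrected = [r * k for r in sample_measured]
print("corrected 10B/11B:", [round(r, 4) for r in sample_corrected])
```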

  19. Quantification of severe accident source terms of a Westinghouse 3-loop plant

    International Nuclear Information System (INIS)

    Lee Min; Ko, Y.-C.

    2008-01-01

    Integrated severe accident analysis codes are used to quantify the source terms of the representative sequences identified in PSA study. The characteristics of these source terms depend on the detail design of the plant and the accident scenario. A historical perspective of radioactive source term is provided. The grouping of radionuclides in different source terms or source term quantification tools based on TID-14844, NUREG-1465, and WASH-1400 is compared. The radionuclides release phenomena and models adopted in the integrated severe accident analysis codes of STCP and MAAP4 are described. In the present study, the severe accident source terms for risk quantification of Maanshan Nuclear Power Plant of Taiwan Power Company are quantified using MAAP 4.0.4 code. A methodology is developed to quantify the source terms of each source term category (STC) identified in the Level II PSA analysis of the plant. The characteristics of source terms obtained are compared with other source terms. The plant analyzed employs a Westinghouse designed 3-loop pressurized water reactor (PWR) with large dry containment

  20. Feasibility Study for Applicability of the Wavelet Transform to Code Accuracy Quantification

    International Nuclear Information System (INIS)

    Kim, Jong Rok; Choi, Ki Yong

    2012-01-01

    One purpose of the assessment process for large thermal-hydraulic system codes is to verify their quality by comparing code predictions against experimental data. This process is essential for reliable safety analysis of nuclear power plants. Extensive experimental programs have been conducted in order to support the development and validation activities of best estimate thermal-hydraulic codes. So far, the Fast Fourier Transform Based Method (FFTBM) has been used widely for quantification of the prediction accuracy, despite its limitation that it does not provide any time resolution for a local event. As alternative options, several time windowing methods (running average, short time Fourier transform, etc.) can be utilized, but such time windowing methods also have the limitation of a fixed resolution. This limitation can be overcome by a wavelet transform, because the resolution of the wavelet transform effectively varies in the time-frequency plane depending on the choice of basis functions, which are not necessarily sinusoidal. In this study, the feasibility of a new code accuracy quantification methodology using the wavelet transform is pursued.
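
    To illustrate the time localization that motivates a wavelet-based accuracy measure (a sketch only, not the methodology of the study; the signals are synthetic), the code below applies one level of a Haar wavelet transform to the error between a 'prediction' and an 'experiment', so the detail coefficients show where in time the discrepancy is concentrated, which a single FFT-based figure of merit cannot.

```python
import numpy as np

# Illustrative one-level Haar transform of a code-vs-experiment error signal.
# The signals are synthetic; a real assessment would use code and test data.
t = np.linspace(0.0, 10.0, 256)
experiment = np.exp(-0.3 * t)
prediction = np.exp(-0.3 * t) + 0.05 * np.sin(4.0 * t) * (t > 5.0)  # late-time error
error = prediction - experiment

# One level of the Haar discrete wavelet transform (orthonormal form).
pairs = error.reshape(-1, 2)
approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)   # coarse content
detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)   # localized discrepancy

# The detail energy is concentrated in the second half of the transient,
# i.e. the transform indicates *when* the code deviates.
half = len(detail) // 2
print("detail energy, first half :", float(np.sum(detail[:half] ** 2)))
print("detail energy, second half:", float(np.sum(detail[half:] ** 2)))
```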

  1. Study on dynamic team performance evaluation methodology based on team situation awareness model

    International Nuclear Information System (INIS)

    Kim, Suk Chul

    2005-02-01

    The purpose of this thesis is to provide a theoretical framework, and its evaluation methodology, for the dynamic task performance of an operating team at a nuclear power plant under a dynamic and tactical environment such as a radiological accident. This thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation, together with quantification methods using a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000-MWe pressurized water reactors, with four on-the-job operating groups and one expert group who knows the accident sequences. The simulated team dynamic task performance, with reference to key plant parameter behavior, the team-specific organizational center of gravity and the cue-and-response matrix, showed good agreement with the observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-beneficial manner. Also, this model can be utilized as a systematic analysis tool for

  2. Study on dynamic team performance evaluation methodology based on team situation awareness model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Chul

    2005-02-15

    The purpose of this thesis is to provide a theoretical framework, and its evaluation methodology, for the dynamic task performance of an operating team at a nuclear power plant under a dynamic and tactical environment such as a radiological accident. This thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation, together with quantification methods using a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000-MWe pressurized water reactors, with four on-the-job operating groups and one expert group who knows the accident sequences. The simulated team dynamic task performance, with reference to key plant parameter behavior, the team-specific organizational center of gravity and the cue-and-response matrix, showed good agreement with the observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-beneficial manner. Also, this model can be utilized as a systematic analysis tool for

  3. Economic benefits of metrology in manufacturing

    DEFF Research Database (Denmark)

    Savio, Enrico; De Chiffre, Leonardo; Carmignato, S.

    2016-01-01

    examples from industrial production, in which the added value of metrology in manufacturing is discussed and quantified. Case studies include: general manufacturing, forging, machining, and related metrology. The focus of the paper is on the improved effectiveness of metrology when used at product...... and process design stages, as well as on the improved accuracy and efficiency of manufacturing through better measuring equipment and process chains with integrated metrology for process control.......In streamlined manufacturing systems, the added value of inspection activities is often questioned, and metrology in particular is sometimes considered only as an avoidable expense. Documented quantification of economic benefits of metrology is generally not available. This work presents concrete...

  4. Geostatistical methodology for waste optimization of contaminated premises - 59344

    International Nuclear Information System (INIS)

    Desnoyers, Yvon; Dubot, Didier

    2012-01-01

    The presented methodological study illustrates a geostatistical approach suitable for radiological evaluation in nuclear premises. The waste characterization is mainly focused on floor concrete surfaces. By modeling the spatial continuity of activities, geostatistics provides sound methods to estimate and map radiological activities, together with their uncertainty. The multivariate approach allows the integration of numerous surface radiation measurements in order to improve the estimation of activity levels from concrete samples. In this way, a sequential and iterative investigation strategy proves to be relevant to fulfill the different evaluation objectives. Waste characterization is performed on risk maps rather than on direct interpolation maps (due to the bias introduced by selection on kriging results). The use of several estimation supports (punctual, 1 m2, room) allows a relevant radiological waste categorization thanks to cost-benefit analysis according to the risk of exceeding a given activity threshold. Global results, mainly the total activity, are similarly quantified to guide waste management early in the dismantling and decommissioning project. This paper recalls the principles of geostatistics and demonstrates how this methodology provides innovative tools for the radiological evaluation of contaminated premises. The relevance of this approach relies on the presence of a spatial continuity of the radiological contamination. In this case, geostatistics provides reliable activity estimates, uncertainty quantification and risk analysis, which are essential decision-making tools for decommissioning and dismantling projects of nuclear installations. Waste characterization is then performed taking all relevant information into account: historical knowledge, surface measurements and samples. Thanks to the multivariate processing, the different investigation stages can be rationalized as regards quantity and positioning. Waste characterization is finally
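
    The categorization step described above hinges on the probability that the unknown activity exceeds a threshold, given the kriging estimate and kriging variance; under a Gaussian assumption this probability has a closed form, sketched below with invented estimates, standard deviations and threshold.

```python
import math

# Probability of exceeding an activity threshold from a kriging estimate and
# kriging standard deviation, assuming a Gaussian estimation error.
# All numerical values below are invented for illustration.
def prob_exceed(estimate, krig_sd, threshold):
    z = (threshold - estimate) / krig_sd
    return 0.5 * math.erfc(z / math.sqrt(2.0))   # 1 - Phi(z)

for est, sd in [(0.8, 0.3), (1.1, 0.3), (0.5, 0.1)]:   # Bq/g, hypothetical
    p = prob_exceed(est, sd, threshold=1.0)
    category = "suspect waste" if p > 0.05 else "conventional waste"
    print(f"estimate={est}, sd={sd}: P(activity > 1.0 Bq/g) = {p:.3f} -> {category}")
```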

  5. Improved best estimate plus uncertainty methodology, including advanced validation concepts, to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, C.; Williams, B.; Hemez, F.; Atamturktur, S.H.; McClure, P.

    2011-01-01

    Research highlights: → The best estimate plus uncertainty methodology (BEPU) is one option in the licensing of nuclear reactors. → The challenges for extending the BEPU method for fuel qualification for an advanced reactor fuel are primarily driven by schedule, the need for data, and the sufficiency of the data. → In this paper we develop an extended BEPU methodology that can potentially be used to address these new challenges in the design and licensing of advanced nuclear reactors. → The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. → The methodology includes a formalism to quantify an adequate level of validation (predictive maturity) with respect to existing data, so that required new testing can be minimized, saving cost by demonstrating that further testing will not enhance the quality of the predictive tools. - Abstract: Many evolving nuclear energy technologies use advanced predictive multiscale, multiphysics modeling and simulation (M and S) capabilities to reduce the cost and schedule of design and licensing. Historically, the role of experiments has been as a primary tool for the design and understanding of nuclear system behavior, while M and S played the subordinate role of supporting experiments. In the new era of multiscale, multiphysics computational-based technology development, this role has been reversed. The experiments will still be needed, but they will be performed at different scales to calibrate and validate the models leading to predictive simulations for design and licensing. Minimizing the required number of validation experiments produces cost and time savings. The use of multiscale, multiphysics models introduces challenges in validating these predictive tools - traditional methodologies will have to be modified to address these challenges. This paper gives the basic aspects of a methodology that can potentially be used to address these new challenges in

  6. Safety in relation to risk and benefit

    International Nuclear Information System (INIS)

    Siddall, E.

    1985-01-01

    The proper definition and quantification of human safety is discussed and from this basis the historical development of our present very high standard of safety is traced. It is shown that increased safety is closely associated with increased wealth, and the quantitative relationship between then is derived from different sources of evidence. When this factor is applied to the production of wealth by industry, a safety benefit is indicated which exceeds the asserted risks by orders of magnitude. It is concluded that present policies and attitudes in respect to the safety of industry may be diametrically wrong. (orig.) [de

  7. The economic costs and benefits of potassium iodide prophylaxis for a reference LWR facility in the United States

    International Nuclear Information System (INIS)

    Behling, U.H.; Behling, K.

    1995-01-01

    Policy decisions relating to radiation protection are commonly based on an evaluation in which the benefits of exposure reduction are compared to the economic costs of the protective measure. A generic difficulty encountered in cost-benefit analyses, however, is the quantification of the major elements that define the costs and the benefits in commensurate units. In this study, the costs of making KI (potassium iodide) available for public use and the avoidance of thyroidal health effects (i.e., the benefit) in the event of a nuclear emergency are defined in the commensurate units of dollars. (Authors). 11 refs., 15 tabs

  8. Pesticides residues in water treatment plant sludge: validation of analytical methodology using liquid chromatography coupled to Tandem mass spectrometry (LC-MS/MS)

    International Nuclear Information System (INIS)

    Moracci, Luiz Fernando Soares

    2008-01-01

    The evolving scenario of Brazilian agriculture brings benefits to the population and demands technological advances in this field. New pesticides are constantly being introduced, encouraging scientific studies with the aim of determining and evaluating their impacts on the population and on the environment. In this work, the evaluated sample was the sludge resulting from a water treatment plant located in the Vale do Ribeira, Sao Paulo, Brazil. The technique used was reversed-phase liquid chromatography coupled to electrospray ionization tandem mass spectrometry. The compounds were previously liquid-extracted from the matrix. The development of the methodology demanded data processing in order to transform the data into reliable information. The processes involved concepts of validation of chemical analysis. The evaluated parameters were selectivity, linearity, range, sensitivity, accuracy, precision, limit of detection, limit of quantification and robustness. The qualitative and quantitative results obtained were statistically treated and presented. The developed and validated methodology is simple. As a result, even exploiting the sensitivity of the analytical technique, the studied compounds were not detected in the sludge of the WTP. This can be explained by the compounds being present at very low concentrations, being degraded under the conditions of the water treatment process, or not being completely retained by the WTP. (author)

  9. Biosensor for label-free DNA quantification based on functionalized LPGs.

    Science.gov (United States)

    Gonçalves, Helena M R; Moreira, Luis; Pereira, Leonor; Jorge, Pedro; Gouveia, Carlos; Martins-Lopes, Paula; Fernandes, José R A

    2016-10-15

    A label-free fiber optic biosensor based on a long period grating (LPG) and a basic optical interrogation scheme using off-the-shelf components is used for the detection of in-situ DNA hybridization. A new methodology is proposed for the determination of the spectral position of the LPG mode resonance. The experimental limit of detection obtained for the DNA was 62±2 nM and the limit of quantification was 209±7 nM. The sample specificity was experimentally demonstrated using DNA targets with different base mismatches relative to the probe, and it was found that the system has single-base-mismatch selectivity. Copyright © 2015 Elsevier B.V. All rights reserved.
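
    For context, detection and quantification limits of this kind are commonly estimated from a calibration slope and a residual (or blank) standard deviation as LOD ≈ 3.3σ/slope and LOQ ≈ 10σ/slope; the sketch below applies these textbook formulas to invented calibration data, not to the sensor of this study.

```python
import numpy as np

# Textbook LOD/LOQ estimation from a linear calibration; data are invented.
conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])   # nM
signal = np.array([0.03, 0.12, 0.25, 0.46, 0.93])   # sensor response (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

lod = 3.3 * residual_sd / slope    # limit of detection, nM
loq = 10.0 * residual_sd / slope   # limit of quantification, nM
print(f"slope = {slope:.5f} a.u./nM, LOD ~ {lod:.0f} nM, LOQ ~ {loq:.0f} nM")
```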

  10. Plasticity models of material variability based on uncertainty quantification techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-11-01

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.

  11. Wavelets in quantification of liver tumors in contrasted computed tomography images

    International Nuclear Information System (INIS)

    Rodrigues, Bruna T.; Alvarez, Matheus; Souza, Rafael T.F.; Miranda, Jose R.A.; Romeiro, Fernando G.; Pina, Diana R. de; Trindade, Andre Petean

    2012-01-01

    This paper presents an original methodology for liver tumor segmentation, based on the wavelet transform. A virtual phantom was constructed with the same mean and standard deviation of gray-level intensity as presented by the measured liver tissue. The optimized algorithm had a sensitivity ranging from 0.81 to 0.83, with a specificity of 0.95, for the differentiation of hepatic tumors from normal tissues. We obtained a 96% agreement between the pixels segmented by an experienced radiologist and by the algorithm presented here. According to the results shown in this work, the algorithm is suitable for starting tests of liver tumor quantification in retrospective surveys. (author)

  12. Multiplex electrochemical DNA platform for femtomolar-level quantification of genetically modified soybean.

    Science.gov (United States)

    Manzanares-Palenzuela, C Lorena; de-los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2015-06-15

    Current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) with a minimum content of 0.9% would benefit from the availability of reliable and rapid methods to detect and quantify DNA sequences specific for GMOs. Different genosensors have been developed to this aim, mainly intended for GMO screening. A remaining challenge, however, is the development of genosensing platforms for GMO quantification, which should be expressed as the number of event-specific DNA sequences per taxon-specific sequences. Here we report a simple and sensitive multiplexed electrochemical approach for the quantification of Roundup-Ready Soybean (RRS). Two DNA sequences, taxon (lectin) and event-specific (RR), are targeted via hybridization onto magnetic beads. Both sequences are simultaneously detected by performing the immobilization, hybridization and labeling steps in a single tube and parallel electrochemical readout. Hybridization is performed in a sandwich format using signaling probes labeled with fluorescein isothiocyanate (FITC) or digoxigenin (Dig), followed by dual enzymatic labeling using Fab fragments of anti-Dig and anti-FITC conjugated to peroxidase or alkaline phosphatase, respectively. Electrochemical measurement of the enzyme activity is finally performed on screen-printed carbon electrodes. The assay gave a linear range of 2-250 pM for both targets, with LOD values of 650 fM (160 amol) and 190 fM (50 amol) for the event-specific and the taxon-specific targets, respectively. Results indicate that the method could be applied for GMO quantification below the European labeling threshold level (0.9%), offering a general approach for the rapid quantification of specific GMO events in foods. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Short-term Consumer Benefits of Dynamic Pricing

    OpenAIRE

    Dupont, Benjamin; De Jonghe, Cedric; Kessels, Kris; Belmans, Ronnie

    2011-01-01

    Consumer benefits of dynamic pricing depend on a variety of factors. Consumer characteristics and climatic circumstances differ widely, which forces a regional comparison. This paper presents a general overview of demand response programs and focuses on the short-term benefits of dynamic pricing for an average Flemish residential consumer. It presents a methodology to develop a cost-reflective dynamic pricing program and to estimate short-term bill savings. Participating in a dynamic pricing p...

  15. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    Full Text Available In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structures (reflecting the categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases also be interpreted as lexical quantifiers. Thus Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that this scope remains constant in all derived alternations. The paper addresses the basic issues of these complex problems, while the detailed description of the conditions that license a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  16. [Real-time quantification to analyze historical Colombian samples detecting a short fragment of hypervariable region II of mitochondrial DNA].

    Science.gov (United States)

    Pérez, Luz Adriana; Rodríguez, Freddy; Langebaek, Carl Henrik; Groot, Helena

    2016-09-01

    Unlike other molecular biology studies, the analysis of ancient DNA (aDNA) requires special infrastructure and methodological conditions to guarantee the quality of the results. One of the main authenticity criteria is DNA quantification, where quantitative real-time PCR is often used given its sensitivity and specificity. Nevertheless, the implementation of these conditions and methodologies to fulfill authenticity criteria implies higher costs. Objective: To develop a simple and less costly method for mitochondrial DNA quantification suitable for highly degraded samples. Materials and methods: The proposed method is based on the use of mini-primers for the specific amplification of short fragments of mitochondrial DNA. The subsequent purification of these amplified fragments allows a standard curve to be constructed with concentrations in accordance with the state of degradation of the samples. Results: The proposed method successfully detected DNA from ancient samples including bone remains and mummified tissue. DNA inhibitory substances were also detected. Conclusion: The proposed method represents a simpler, cost-effective way to detect low amounts of aDNA, and a tool to differentiate DNA-free samples from samples with inhibitory substances.
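
    As a rough illustration of the standard-curve quantification step described above (the copy numbers and Cq values below are hypothetical, not data from the study), a log-linear fit of quantification cycle against input copy number can be used to estimate both the amplification efficiency and the concentration of an unknown extract:

```python
# Hypothetical standard-curve qPCR quantification sketch: fit Cq versus
# log10(copies) for a dilution series of the purified short mtDNA fragment,
# then invert the fit for an unknown ancient-DNA extract.
import numpy as np

standard_copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])      # copies/reaction
standard_cq     = np.array([18.2, 21.6, 25.1, 28.4, 31.9])  # measured Cq

slope, intercept = np.polyfit(np.log10(standard_copies), standard_cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0      # ~1.0 means 100% efficient PCR

def copies_from_cq(cq):
    """Estimated template copies for a measured Cq of an unknown sample."""
    return 10 ** ((cq - intercept) / slope)

print(f"Efficiency: {efficiency:.0%}")
print(f"Unknown at Cq 29.5: ~{copies_from_cq(29.5):.0f} copies/reaction")
```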

  17. Cost-Benefit Analysis of Financial Regulation: Case Studies and Implications

    OpenAIRE

    Coates, John

    2015-01-01

    Some members of Congress, the D.C. Circuit, and legal academia are promoting a particular, abstract form of cost-benefit analysis for financial regulation: judicially enforced quantification. How would CBA work in practice, if applied to specific, important, representative rules, and what is the alternative? Detailed case studies of six rules – (1) disclosure rules under Sarbanes-Oxley Section 404, (2) the SEC’s mutual fund governance reforms, (3) Basel III’s heightened capital requirements f...

  18. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
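
    As a minimal illustration of the quantitative step (step 3), the sketch below estimates first-order variance-based sensitivity indices for a two-parameter toy model using Saltelli-style sample matrices; the model, parameter ranges and sample size are hypothetical, and this is not the PSUADE implementation:

```python
# Toy global sensitivity analysis: first-order Sobol' indices estimated from
# Monte Carlo sample matrices (hypothetical model and input ranges).
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    # Stand-in simulation: output driven mainly by x1, with a smaller x2 term.
    return np.sin(x[:, 0]) + 0.05 * x[:, 1] ** 2

n, d = 20_000, 2
A = rng.uniform(0.0, np.pi, size=(n, d))   # step 1: credible input ranges
B = rng.uniform(0.0, np.pi, size=(n, d))
yA, yB = model(A), model(B)
var_y = np.concatenate([yA, yB]).var()

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                    # swap parameter i only
    # Fraction of output variance explained by parameter i alone.
    S_i = np.mean(yB * (model(ABi) - yA)) / var_y
    print(f"S_{i + 1} = {S_i:.2f}")
```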

  19. A risk-based sensor placement methodology

    International Nuclear Information System (INIS)

    Lee, Ronald W.; Kulesz, James J.

    2008-01-01

    A risk-based sensor placement methodology is proposed to solve the problem of optimal location of sensors to protect population against the exposure to, and effects of, known and/or postulated chemical, biological, and/or radiological threats. Risk is calculated as a quantitative value representing population at risk from exposure at standard exposure levels. Historical meteorological data are used to characterize weather conditions as the frequency of wind speed and direction pairs. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate risk values. Sensor locations are determined via an iterative dynamic programming algorithm whereby threats detected by sensors placed in prior iterations are removed from consideration in subsequent iterations. In addition to the risk-based placement algorithm, the proposed methodology provides a quantification of the marginal utility of each additional sensor. This is the fraction of the total risk accounted for by placement of the sensor. Thus, the criteria for halting the iterative process can be the number of sensors available, a threshold marginal utility value, and/or a minimum cumulative utility achieved with all sensors
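
    A greedy sketch of the iterative placement idea is given below; it stands in for the dynamic programming algorithm of the paper, and the candidate locations, detection sets and risk values are purely hypothetical:

```python
# Hedged sketch of the iterative, risk-based placement idea described above.
# At each iteration the candidate location covering the most remaining risk
# is chosen, and the threats it detects are removed from further consideration.

def place_sensors(candidate_coverage, risk, max_sensors, min_marginal_utility=0.0):
    """candidate_coverage: dict location -> set of threat ids it detects.
    risk: dict threat id -> population-at-risk value.
    Returns a list of (location, marginal_utility) tuples."""
    total_risk = sum(risk.values())
    remaining = set(risk)
    placements = []
    for _ in range(max_sensors):
        # Marginal utility = fraction of total risk newly covered by this sensor.
        best_loc, best_gain = None, 0.0
        for loc, detected in candidate_coverage.items():
            gain = sum(risk[t] for t in detected & remaining) / total_risk
            if gain > best_gain:
                best_loc, best_gain = loc, gain
        if best_loc is None or best_gain <= min_marginal_utility:
            break  # halting criterion: marginal utility threshold reached
        placements.append((best_loc, best_gain))
        remaining -= candidate_coverage[best_loc]
    return placements

# Toy example with three candidate locations and four postulated threats.
coverage = {"A": {1, 2}, "B": {2, 3}, "C": {3, 4}}
risk = {1: 10.0, 2: 40.0, 3: 30.0, 4: 20.0}
print(place_sensors(coverage, risk, max_sensors=3))
```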

  20. Loss-of-benefits analysis for nuclear power plant shutdowns: methodology and illustrative case study

    International Nuclear Information System (INIS)

    Peerenboom, J.P.; Buehring, W.A.; Guziel, K.A.

    1983-11-01

    A framework for loss-of-benefits analysis and a taxonomy for identifying and categorizing the effects of nuclear power plant shutdowns or accidents are presented. The framework consists of three fundamental steps: (1) characterizing the shutdown; (2) identifying benefits lost as a result of the shutdown; and (3) quantifying effects. A decision analysis approach to regulatory decision making is presented that explicitly considers the loss of benefits. A case study of a hypothetical reactor shutdown illustrates one key loss of benefits: net replacement energy costs (i.e., change in production costs). Sensitivity studies investigate the responsiveness of case study results to changes in nuclear capacity factor, load growth, fuel price escalation, and discount rate. The effects of multiple reactor shutdowns on production costs are also described

  1. A new methodology for non-contact accurate crack width measurement through photogrammetry for automated structural safety evaluation

    International Nuclear Information System (INIS)

    Jahanshahi, Mohammad R; Masri, Sami F

    2013-01-01

    In mechanical, aerospace and civil structures, cracks are important defects that can cause catastrophes if neglected. Visual inspection is currently the predominant method for crack assessment. This approach is tedious, labor-intensive, subjective and highly qualitative. An inexpensive alternative to current monitoring methods is to use a robotic system that could perform autonomous crack detection and quantification. To reach this goal, several image-based crack detection approaches have been developed; however, the crack thickness quantification, which is an essential element for a reliable structural condition assessment, has not been sufficiently investigated. In this paper, a new contact-less crack quantification methodology, based on computer vision and image processing concepts, is introduced and evaluated against a crack quantification approach which was previously developed by the authors. The proposed approach in this study utilizes depth perception to quantify crack thickness and, as opposed to most previous studies, needs no scale attachment to the region under inspection, which makes this approach ideal for incorporation with autonomous or semi-autonomous mobile inspection systems. Validation tests are performed to evaluate the performance of the proposed approach, and the results show that the new proposed approach outperforms the previously developed one. (paper)
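
    The following pinhole-camera sketch illustrates the general depth-based scaling idea, not the authors' algorithm: once the camera-to-surface distance is perceived, a crack thickness measured in pixels can be converted to physical units without attaching a scale to the inspected region. All numeric values are hypothetical.

```python
# Minimal pinhole-camera sketch of the depth-based scaling idea: a crack
# thickness measured in pixels is converted to physical units using the
# camera-to-surface distance, so no scale object is attached to the scene.
# Values below (focal length, pixel pitch, distance) are hypothetical.

def crack_width_mm(width_px, distance_mm, focal_length_mm, pixel_pitch_mm):
    """Physical width of a feature of width_px pixels viewed fronto-parallel
    at distance_mm, for a camera with the given focal length and pixel size."""
    return width_px * pixel_pitch_mm * distance_mm / focal_length_mm

# Example: a 6-pixel-wide crack seen from 500 mm with a 16 mm lens and
# 0.005 mm (5 micron) pixels maps to roughly 0.94 mm.
print(crack_width_mm(width_px=6, distance_mm=500, focal_length_mm=16,
                     pixel_pitch_mm=0.005))
```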

  2. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    Science.gov (United States)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
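
    A hedged sketch of a signal-response ("a-hat versus a") POD calculation is shown below; the crack sizes, estimates and detection threshold are invented for illustration and do not reproduce the lap-joint data:

```python
# Illustrative "a-hat versus a" POD sketch (hypothetical data): crack sizes
# estimated by a sizing method are regressed on true sizes, and the
# probability of detection at a given crack size follows from the regression
# scatter and a decision threshold, in the spirit of MIL-HDBK-1823
# signal-response POD analysis.
import numpy as np
from scipy.stats import norm

true_size = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])   # mm
estimated = np.array([0.4, 0.9, 1.7, 1.9, 2.7, 2.9, 3.6, 4.2])   # mm

beta1, beta0 = np.polyfit(true_size, estimated, 1)
sigma = np.std(estimated - (beta0 + beta1 * true_size), ddof=2)
threshold = 1.0  # mm: smallest estimated size accepted as a "detection"

def pod(a):
    # Probability that the estimated size exceeds the decision threshold.
    return norm.cdf((beta0 + beta1 * a - threshold) / sigma)

for a in (0.5, 1.0, 2.0, 3.0):
    print(f"POD at {a:.1f} mm crack: {pod(a):.2f}")
```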

  3. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  4. Risk-based Regulatory Evaluation Program methodology

    International Nuclear Information System (INIS)

    DuCharme, A.R.; Sanders, G.A.; Carlson, D.D.; Asselin, S.V.

    1987-01-01

    The objectives of this DOE-supported Regulatory Evaluation Program are to analyze and evaluate the safety importance and economic significance of existing regulatory guidance in order to assist in the improvement of the regulatory process for current generation and future design reactors. A risk-based cost-benefit methodology was developed to evaluate the safety benefit and cost of specific regulations or Standard Review Plan sections. Risk-based methods can be used in lieu of or in combination with deterministic methods in developing regulatory requirements and reaching regulatory decisions

  5. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, two kinds of uncertainties generally need to be identified and quantified: model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best estimate LOCA code, such as RELAP5 and TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Generally, the DRHM methodology can generate about 80-100 K margin on PCT as compared to Appendix K bounding state LOCA analysis.

  6. Studies on methodology for vegetal bio indicators in bioremediation areas contaminated with petroleum wastes; Estudos sobre metodologia para bioindicadores vegetais em areas de biorremediacao contaminadas com residuos de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento Neto, Durval; Castro, Rodrigo Azevedo; Krenczynski, Michele Cristine; Goncalves, Claudia Martins; Souza, Sergio Luiz de [Universidade Federal do Parana (UFPR), Curitiba (Brazil). Curso de Pos-Graduacao em Ciencia do Solo; Carvalho, Francisco Jose Pereira de Campos [Universidade Federal do Parana (UFPR), Curitiba (Brazil). Dept. de Solos; Grube, Karl; Coelho, Jorge Ibirajara Evangelista [PETROBRAS, PR (Brazil). REPAR

    1998-07-01

    The main objective of the present work is the development of a plant bioindicator methodology to serve as a criterion for soil bioremediation and for environmental assessment based on quantification of the actual bioremediation status of the soil. To this end, morphophysiological parameters of Avena sativa and Barbarea verna were determined under greenhouse conditions for a dilution series of contaminated soil mixed with uncontaminated soil. A quantification scale was proposed based on the statistical analysis of the defined morphophysiological parameters. It was therefore possible to quantify phytotoxicity and to construct phytotoxicity curves for the contaminated soil dilution series. It was concluded that the developed methodology can be used as a criterion for the actual bioremediation status of a soil. (author)

  7. WE-G-17A-03: MRIgRT: Quantification of Organ Motion

    International Nuclear Information System (INIS)

    Stanescu, T; Tadic, T; Jaffray, D

    2014-01-01

    Purpose: To develop an MRI-based methodology and tools required for the quantification of organ motion on a dedicated MRI-guided radiotherapy system. A three-room facility, consisting of a TrueBeam 6X linac vault, a 1.5T MR suite and a brachytherapy interventional room, is currently under commissioning at our institution. The MR scanner can move and image in either room for diagnostic and treatment guidance purposes. Methods: A multi-imaging modality (MR, kV) phantom, featuring programmable 3D simple and complex motion trajectories, was used for the validation of several image sorting algorithms. The testing was performed on MRI (e.g. TrueFISP, TurboFLASH), 4D CT and 4D CBCT. The image sorting techniques were based on a) direct image pixel manipulation into columns or rows, b) single and aggregated pixel data tracking and c) computer vision techniques for global pixel analysis. Subsequently, the motion phantom and sorting algorithms were utilized for commissioning of MR fast imaging techniques for 2D-cine and 4D data rendering. MR imaging protocols were optimized (e.g. readout gradient strength vs. SNR) to minimize the presence of susceptibility-induced distortions, which were reported through phantom experiments and numerical simulations. The system-related distortions were also quantified (dedicated field phantom) and treated as systematic shifts where relevant. Results: Image sorting algorithms were validated for specific MR-based applications such as quantification of organ motion, local data sampling, and 4D MRI for pre-RT delivery with accuracy better than the raw image pixel size (e.g. 1 mm). MR fast imaging sequences were commissioned and imaging strategies were developed to mitigate spatial artifacts with minimal penalty on the image spatial and temporal sampling. Workflows (e.g. liver) were optimized to include the new motion quantification tools for RT planning and daily patient setup verification. Conclusion: Comprehensive methods were developed

  8. Ideas underlying the Quantification of Margins and Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Pilch, Martin, E-mail: mpilch@sandia.gov [Department 1514, Sandia National Laboratories, Albuquerque, NM 87185-0828 (United States); Trucano, Timothy G. [Department 1411, Sandia National Laboratories, Albuquerque, NM 87185-0370 (United States); Helton, Jon C. [Department of Mathematics and Statistics, Arizona State University, Tempe, AZ 85287-1804 (United States)

    2011-09-15

    Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.

  9. The application of cost-benefit analysis to the radiological protection of the public

    International Nuclear Information System (INIS)

    1980-03-01

    The subject of this document is the quantification of the potential harm caused to the general public by ionising radiation in normal operating circumstances. The object is to enable the health detriment from a practice involving exposure to ionising radiation to be directly compared with the costs of keeping the ensuing doses as low as reasonably achievable. Chapter headings include: development of radiological protection criteria; principles underlying the valuation of harm from radiation exposure; risk evaluation approach to costing of detriment; monetary valuations; distribution of costs and risk in time. Appendices cover the following: cost benefit analysis (principles); recommendations of ICRP on the use of cost benefit analysis; life valuation studies (review); application of cost benefit analysis to the value of the man sievert. (U.K.)

  10. Can the CFO Trust the FX Exposure Quantification from a Stock Market Approach?

    DEFF Research Database (Denmark)

    Aabo, Tom; Brodin, Danielle

    This study examines the sensitivity of detected exchange rate exposures at the firm specific level to changes in methodological choices using a traditional two factor stock market approach for exposure quantification. We primarily focus on two methodological choices: the choice of market index...... and the choice of observation frequency. We investigate to which extent the detected exchange rate exposures for a given firm can be confirmed when the choice of market index and/or the choice of observation frequency are changed. Applying our sensitivity analysis to Scandinavian non-financial firms, we...... thirds of the number of detected exposures using weekly data and 2) there is no economic rationale that the detected exposures at the firm-specific level should change when going from the use of weekly data to the use of monthly data. In relation to a change in the choice of market index, we find...

  11. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  12. 76 FR 34270 - Federal-State Extended Benefits Program-Methodology for Calculating “on” or “off” Total...

    Science.gov (United States)

    2011-06-13

    ... requirement. The Department plans to promulgate regulations about this methodology in the near future. In the...--Methodology for Calculating ``on'' or ``off'' Total Unemployment Rate Indicators for Purposes of Determining..., Labor. ACTION: Notice. SUMMARY: UIPL 16-11 informs states of the methodology used to calculate the ``on...

  13. The methodological defense of realism scrutinized.

    Science.gov (United States)

    Wray, K Brad

    2015-12-01

    I revisit an older defense of scientific realism, the methodological defense, a defense developed by both Popper and Feyerabend. The methodological defense of realism concerns the attitude of scientists, not philosophers of science. The methodological defense is as follows: a commitment to realism leads scientists to pursue the truth, which in turn is apt to put them in a better position to get at the truth. In contrast, anti-realists lack the tenacity required to develop a theory to its fullest. As a consequence, they are less likely to get at the truth. My aim is to show that the methodological defense is flawed. I argue that a commitment to realism does not always benefit science, and that there is reason to believe that a research community with both realists and anti-realists in it may be better suited to advancing science. A case study of the Copernican Revolution in astronomy supports this claim. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. The Service Learning Projects: Stakeholder Benefits and Potential Class Topics

    Science.gov (United States)

    Rutti, Raina M.; LaBonte, Joanne; Helms, Marilyn Michelle; Hervani, Aref Agahei; Sarkarat, Sy

    2016-01-01

    Purpose: The purpose of this paper is to summarize the benefits of including a service learning project in college classes and focusses on benefits to all stakeholders, including students, community, and faculty. Design/methodology/approach: Using a snowball approach in academic databases as well as a nominal group technique to poll faculty, key…

  15. Effects of humic acid on DNA quantification with Quantifiler® Human DNA Quantification kit and short tandem repeat amplification efficiency.

    Science.gov (United States)

    Seo, Seung Bum; Lee, Hye Young; Zhang, Ai Hua; Kim, Hye Yeon; Shin, Dong Hoon; Lee, Soong Deok

    2012-11-01

    Correct DNA quantification is essential to obtaining reliable STR typing results. Forensic DNA analysts often use commercial kits for DNA quantification; among them, real-time-based DNA quantification kits are most frequently used. Incorrect DNA quantification due to the presence of PCR inhibitors may affect experiment results. In this study, we examined the degree of alteration of DNA quantification results in DNA samples containing a PCR inhibitor by using a Quantifiler® Human DNA Quantification kit. For experiments, we prepared approximately 0.25 ng/μl DNA samples containing various concentrations of humic acid (HA). The quantification results were 0.194-0.303 ng/μl at 0-1.6 ng/μl HA (final concentration in the Quantifiler reaction) and 0.003-0.168 ng/μl at 2.4-4.0 ng/μl HA. Most DNA quantities were undetermined when the HA concentration was higher than 4.8 ng/μl. The CT values of an internal PCR control (IPC) were 28.0-31.0, 36.5-37.1, and undetermined at 0-1.6, 2.4, and 3.2 ng/μl HA, respectively. These results indicate that underestimated DNA quantification results may be obtained in DNA samples with high IPC CT values. Thus, researchers should carefully interpret the DNA quantification results. We additionally examined the effects of HA on STR amplification by using an Identifiler® kit and a MiniFiler™ kit. Based on the results of this study, it is thought that a better understanding of the various effects of HA would help researchers recognize and manipulate samples containing HA.

  16. Quantification of human error and common-mode failures in man-machine systems

    International Nuclear Information System (INIS)

    Lisboa, J.J.

    1988-01-01

    Quantification of human performance, particularly the determination of human error, is essential for realistic assessment of overall system performance of man-machine systems. This paper presents an analysis of human errors in nuclear power plant systems when measured against common-mode failures (CMF). Human errors evaluated are improper testing, inadequate maintenance strategy, and miscalibration. The methodology presented in the paper represents a positive contribution to power plant systems availability by identifying sources of common-mode failure when operational functions are involved. It is also applicable to other complex systems such as chemical plants, aircraft and motor industries; in fact, any large man-created, man-machine system could be included

  17. Socio-economic research on fusion. SERF 1997-98. Macro Task E2: External costs and benefits. Task 2: Comparison of external costs

    International Nuclear Information System (INIS)

    Schleisner, Lotte; Korhonen, Riitta

    1998-12-01

    This report is part of the SERF (Socio-Economic Research on Fusion) project, Macro Task E2, which covers External Costs and Benefits. The report is the documentation of Task 2, Comparison of External Costs. The aim of Task 2, Comparison of External Costs, has been to compare the external costs of fusion energy with those of other alternative energy generation technologies. In this task, identification and quantification of the external costs for wind energy and photovoltaics have been performed by Risoe, while identification and quantification of the external costs for nuclear fission and fossil fuels have been discussed by VTT. The methodology used for the assessment of the externalities of the selected fuel cycles has been the one developed within the ExternE Project. First estimates of the externalities of fusion energy have been under examination in Macro Task E2. Externalities of fossil fuels and nuclear fission have already been evaluated in the ExternE project, and a vast amount of material for different sites in various countries is available. This material is used in the comparison. Among the renewables, wind energy and photovoltaics are assessed separately. External costs of the various alternatives may change as new technologies are developed, and costs can to a large extent be avoided (e.g. acidifying impacts, but also global warming due to carbon dioxide emissions). Fusion technology can also experience major progress, and some important cost components can probably be avoided already by 2050. (EG)

  18. Socio-economic research on fusion. SERF 1997-98. Macro Task E2: External costs and benefits. Task 2: Comparison of external costs

    Energy Technology Data Exchange (ETDEWEB)

    Schleisner, Lotte; Korhonen, Riitta

    1998-12-01

    This report is part of the SERF (Socio-Economic Research on Fusion) project, Macro Task E2, which covers External Costs and Benefits. The report is the documentation of Task 2, Comparison of External Costs. The aim of Task 2, Comparison of External Costs, has been to compare the external costs of fusion energy with those of other alternative energy generation technologies. In this task, identification and quantification of the external costs for wind energy and photovoltaics have been performed by Risoe, while identification and quantification of the external costs for nuclear fission and fossil fuels have been discussed by VTT. The methodology used for the assessment of the externalities of the selected fuel cycles has been the one developed within the ExternE Project. First estimates of the externalities of fusion energy have been under examination in Macro Task E2. Externalities of fossil fuels and nuclear fission have already been evaluated in the ExternE project, and a vast amount of material for different sites in various countries is available. This material is used in the comparison. Among the renewables, wind energy and photovoltaics are assessed separately. External costs of the various alternatives may change as new technologies are developed, and costs can to a large extent be avoided (e.g. acidifying impacts, but also global warming due to carbon dioxide emissions). Fusion technology can also experience major progress, and some important cost components can probably be avoided already by 2050. (EG) 36 refs.

  19. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    Science.gov (United States)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into territorial management in a practical and effective way. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in this whole process, but still few researchers are investing in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims at the development of an effective methodology for the assessment of the maximum possible number of geodiversity elements (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. It is expected to attain the proper procedures in order to assess geodiversity

  20. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium...... is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding...

  1. A systematic framework for effective uncertainty assessment of severe accident calculations; Hybrid qualitative and quantitative methodology

    International Nuclear Information System (INIS)

    Hoseyni, Seyed Mohsen; Pourgol-Mohammad, Mohammad; Tehranifard, Ali Abbaspour; Yousefpour, Faramarz

    2014-01-01

    This paper describes a systematic framework for characterizing important phenomena and quantifying the degree of contribution of each parameter to the output in severe accident uncertainty assessment. The proposed methodology comprises qualitative as well as quantitative phases. The qualitative part, the so-called Modified PIRT, a more robust PIRT process for more precise quantification of uncertainties, is a two-step process for identifying severe accident phenomena and ranking them by uncertainty importance. In this process, the identified severe accident phenomena are ranked according to their effect on the figure of merit and their level of knowledge. The Analytic Hierarchy Process (AHP) serves here as a systematic approach for ranking severe accident phenomena. A formal uncertainty importance technique is used to estimate the degree of credibility of the severe accident model(s) used to represent the important phenomena. For this step, the methodology uses subjective justification based on the evaluation of available information, experimental data and code predictions. The quantitative part utilizes uncertainty importance measures to quantify the effect of each input parameter on the output uncertainty. A response surface fitting approach is proposed for estimating the associated uncertainties at lower computational cost. The quantitative results are used to plan the reduction of epistemic uncertainty in the output variable(s). The application of the proposed methodology is demonstrated for the ACRR MP-2 severe accident test facility. - Highlights: • A two-stage framework for severe accident uncertainty analysis is proposed. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • An uncertainty importance measure quantitatively calculates the effect of each uncertainty source. • The methodology is applied successfully to the ACRR MP-2 severe accident test facility
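
    The sketch below illustrates the quantitative stage under stated assumptions: a cheap response surface is fitted to a small number of code runs, and an uncertainty importance measure is derived from the fitted coefficients. The two inputs, their ranges and the stand-in "code" are hypothetical:

```python
# Hedged illustration of response-surface-based uncertainty importance:
# a linear surrogate is fitted to a handful of (expensive) code runs, and
# each input's importance is estimated from standardized regression
# coefficients (assuming independent inputs).
import numpy as np

rng = np.random.default_rng(1)

def expensive_code(x1, x2):
    # Stand-in for a severe accident simulation (a figure of merit).
    return 300.0 + 40.0 * x1 - 15.0 * x2 + 5.0 * x1 * x2

# A small design of code runs over the credible input ranges.
x1 = rng.uniform(0.5, 1.5, 30)
x2 = rng.uniform(0.0, 2.0, 30)
y = expensive_code(x1, x2)

# Linear response surface y ~ b0 + b1*x1 + b2*x2 fitted by least squares.
X = np.column_stack([np.ones_like(x1), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Uncertainty importance: share of output variance attributed to each input.
var_contrib = np.array([coef[1] ** 2 * x1.var(), coef[2] ** 2 * x2.var()])
importance = var_contrib / var_contrib.sum()
print(dict(zip(["x1", "x2"], np.round(importance, 2))))
```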

  2. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

    Optical molecular luminescence imaging is widely used for detection and imaging of bio-photons emitted by luminescent luciferase activation. The photons measured with this method indicate the degree of molecular alteration or the number of cells, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method presenting a linear response of the measured light signal to measurement time. We detected the luminescence signal using lab-made optical imaging equipment, an animal light imaging system (ALIS), and two kinds of light sources. One is a set of three bacterial light-emitting sources containing different numbers of bacteria. The other is a set of three different non-bacterial light sources emitting very weak light. By using the concepts of the candela and the flux, we could derive a simplified linear quantification formula. After experimentally measuring light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to measurement time presents a constant value even though different light sources were applied. The quantification function for linear response could be applicable to the standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment, as constant light-emitting sources present a linear response to measurement time

  3. [Electronic versus paper-based patient records: a cost-benefit analysis].

    Science.gov (United States)

    Neubauer, A S; Priglinger, S; Ehrt, O

    2001-11-01

    The aim of this study is to compare the costs and benefits of electronic, paperless patient records with the conventional paper-based charts. Costs and benefits of planned electronic patient records are calculated for a University eye hospital with 140 beds. Benefit is determined by direct costs saved by electronic records. In the example shown, the additional benefits of electronic patient records, as far as they can be quantified, total 192,000 DM per year. The costs of the necessary investments are 234,000 DM per year when using a linear depreciation over 4 years. In total, there are additional annual costs for electronic patient records of 42,000 DM. Different scenarios were analyzed. By increasing the time of depreciation to 6 years, the cost deficit reduces to only approximately 9,000 DM. Increased wages reduce the deficit further, while the deficit increases with a loss of functions of the electronic patient record. However, several benefits of electronic records regarding research, teaching, quality control and better data access cannot be easily quantified and would greatly increase the benefit-to-cost ratio. Only part of the advantages of electronic patient records can easily be quantified in terms of directly saved costs. The small cost deficit calculated in this example is overcompensated by several benefits, which can only be enumerated qualitatively due to problems in quantification.

  4. ARAMIS project: A comprehensive methodology for the identification of reference accident scenarios in process industries

    International Nuclear Information System (INIS)

    Delvosalle, Christian; Fievez, Cecile; Pipart, Aurore; Debray, Bruno

    2006-01-01

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper aims at presenting the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key-point in risk assessment and serves as basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term 'major accidents' must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called methodology for the identification of reference accident scenarios (MIRAS) takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to identify more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called 'risk matrix', crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application on an ethylene oxide storage
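
    A minimal sketch of the risk matrix idea is shown below; the frequency class boundaries, consequence classes and risk levels are hypothetical placeholders rather than the ARAMIS calibration:

```python
# Simple illustration of the "risk matrix" used to select reference accident
# scenarios: each scenario's frequency class and consequence class place it
# in a matrix cell that maps to a risk level. Class boundaries and labels
# below are hypothetical.

FREQ_CLASSES = [1e-3, 1e-5, 1e-7]          # events/year thresholds -> classes
CONS_CLASSES = ["minor", "serious", "major", "catastrophic"]
RISK_LEVEL = [                              # rows: frequency class, cols: consequence class
    ["medium", "high",   "high",   "high"],
    ["low",    "medium", "high",   "high"],
    ["low",    "low",    "medium", "high"],
    ["low",    "low",    "low",    "medium"],
]

def frequency_class(freq_per_year):
    for idx, threshold in enumerate(FREQ_CLASSES):
        if freq_per_year >= threshold:
            return idx
    return len(FREQ_CLASSES)

def risk_level(freq_per_year, consequence):
    return RISK_LEVEL[frequency_class(freq_per_year)][CONS_CLASSES.index(consequence)]

# A scenario whose safety systems reduce its frequency may drop a risk class.
print(risk_level(1e-4, "major"))   # without safety barriers -> 'high'
print(risk_level(1e-6, "major"))   # with barriers credited   -> 'medium'
```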

  5. Benefit-cost assessment programs: Costa Rica case study

    International Nuclear Information System (INIS)

    Clark, A.L.; Trocki, L.K.

    1991-01-01

    An assessment of mineral potential, in terms of types and numbers of deposits, approximate location and associated tonnage and grades, is a valuable input to a nation's economic planning and mineral policy development. This study provides a methodology for applying benefit-cost analysis to mineral resource assessment programs, both to determine the cost effectiveness of resource assessments and to ascertain future benefits to the nation. In a case study of Costa Rica, the benefit-cost ratio of a resource assessment program was computed to be a minimum of 4:1 ($10.6 million to $2.5 million), not including the economic benefits accruing from the creation of 800 mining sector and 1,200 support services jobs. The benefit-cost ratio would be considerably higher if presently proposed revisions of mineral policy were implemented and benefits could be defined for Costa Rica

  6. Cost benefit analysis of power plant database integration

    International Nuclear Information System (INIS)

    Wilber, B.E.; Cimento, A.; Stuart, R.

    1988-01-01

    A cost benefit analysis of plant wide data integration allows utility management to evaluate integration and automation benefits from an economic perspective. With this evaluation, the utility can determine both the quantitative and qualitative savings that can be expected from data integration. The cost benefit analysis is then a planning tool which helps the utility to develop a focused long term implementation strategy that will yield significant near term benefits. This paper presents a flexible cost benefit analysis methodology which is both simple to use and yields accurate, verifiable results. Included in this paper is a list of parameters to consider, a procedure for performing the cost savings analysis, and samples of this procedure when applied to a utility. A case study is presented involving a specific utility where this procedure was applied. Their uses of the cost-benefit analysis are also described

  7. Impact of knowledge and misconceptions on benefit and risk perception of CCS.

    Science.gov (United States)

    Wallquist, Lasse; Visschers, Vivianne H M; Siegrist, Michael

    2010-09-01

    Carbon Dioxide Capture and Storage (CCS) is assumed to be one of the key technologies in the mitigation of climate change. Public acceptance may have a strong impact on the progress of this technology. Benefit perception and risk perception are known to be important determinants of public acceptance of CCS. In this study, the prevalence and effect of cognitive concepts underlying laypeople's risk perception and benefit perception of CCS were examined in a representative survey (N=654) in Switzerland. Results confirm findings from previous qualitative studies and show a quantification of a variety of widespread intuitive concepts that laypeople hold about storage mechanisms as well as about leakage and socioeconomic issues, which all appeared to influence risk perception and benefit perception. The perception of an overpressurized reservoir and concerns about diffuse impacts furthermore amplified risk perception. Appropriate images about storage mechanisms and climate change awareness were increasing the perception of benefits. Knowledge about CO2 seemed to lower both perceived benefits and perceived risks. Implications for risk communication and management are discussed.

  8. Development of Total Reflection X-ray fluorescence spectrometry quantitative methodologies for elemental characterization of building materials and their degradation products

    Science.gov (United States)

    García-Florentino, Cristina; Maguregui, Maite; Marguí, Eva; Torrent, Laura; Queralt, Ignasi; Madariaga, Juan Manuel

    2018-05-01

    In this work, a Total Reflection X-ray fluorescence (TXRF) spectrometry based quantitative methodology is proposed for the elemental characterization of liquid extracts and solids belonging to old building materials and their degradation products from an early 20th century building of high historic and cultural value in Getxo (Basque Country, North of Spain). This quantification strategy can be considered faster compared to traditional Energy or Wavelength Dispersive X-ray fluorescence (ED-XRF and WD-XRF) spectrometry based methodologies or other techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS). In particular, two kinds of liquid extracts were analysed: (i) water soluble extracts from different mortars and (ii) acid extracts from mortars, black crusts, and calcium carbonate formations. In order to avoid the acid extraction step for the materials and their degradation products, direct TXRF measurement of powdered solid suspensions in water was also studied. With this aim, different parameters such as the deposition volume and the measuring time were studied for each kind of sample. Depending on the quantified element, the limits of detection achieved with the TXRF quantitative methodologies for liquid extracts and solids were around 0.01-1.2 and 2-200 mg/L, respectively. The quantification of K, Ca, Ti, Mn, Fe, Zn, Rb, Sr, Sn and Pb in the liquid extracts proved to be a faster alternative to more classic quantification techniques (i.e. ICP-MS), accurate enough to obtain information about the composition of the acid-soluble part of the materials and their degradation products. Regarding the solid samples measured as suspensions, it was quite difficult to obtain stable and reproducible suspensions, which affected the accuracy of the results. To cope with this problem, correction factors based on the quantitative results obtained using ED-XRF were calculated to improve the accuracy of

  9. Real-time PCR quantification of human complement C4A and C4B genes

    Directory of Open Access Journals (Sweden)

    Fust George

    2006-01-01

    Full Text Available Abstract Background The fourth component of human complement (C4), an essential factor of innate immunity, is represented by two isoforms (C4A and C4B) in the genome. Although these genes differ in only 5 nucleotides, the encoded C4A and C4B proteins are functionally different. Based on phenotypic determination, unbalanced production of C4A and C4B is associated with several diseases, such as systemic lupus erythematosus, type 1 diabetes and several autoimmune diseases, as well as with higher morbidity and mortality of myocardial infarction and increased susceptibility to bacterial infections. Despite this major clinical relevance, only low-throughput, time- and labor-intensive methods have been used so far for the quantification of C4A and C4B genes. Results A novel quantitative real-time PCR (qPCR) technique was developed for rapid and accurate quantification of the C4A and C4B genes applying a duplex, TaqMan based methodology. The reliable, single-step analysis provides the determination of the copy number of the C4A and C4B genes over a wide range of DNA template concentrations (0.3–300 ng genomic DNA). The developed qPCR was applied to determine C4A and C4B gene dosages in a healthy Hungarian population (N = 118). The obtained data were compared to the results of an earlier study of the same population. Moreover, a set of 33 samples was analyzed by two independent methods. No significant difference was observed between the gene dosages determined by the employed techniques, demonstrating the reliability of the novel qPCR methodology. A Microsoft Excel worksheet and a DOS executable are also provided for simple and automated evaluation of the measured data. Conclusion This report describes a novel real-time PCR method for single-step quantification of C4A and C4B genes. The developed technique could facilitate studies investigating disease association of different C4 isotypes.

  10. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images. It is also to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: Quantification in emission tomography - definition and challenges; quantification biasing phenomena; 2 - quantification in SPECT, problems and correction methods: Attenuation, scattering, non-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement

  11. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against...... human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  12. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures that are currently available and of the kinds of experimental situations and analytical problems they address. The last point is extended by the description of the author's own development of the fundamental parameter method, which makes the inclusion of nonparallel beam geometries possible. Finally, open problems of the quantification procedures are discussed

  13. Health economic evaluation: important principles and methodology.

    Science.gov (United States)

    Rudmik, Luke; Drummond, Michael

    2013-06-01

    To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
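
    As a minimal worked illustration of two of the listed concepts (discounting and cost-effectiveness), the sketch below discounts yearly costs and QALYs over a five-year horizon and forms an incremental cost-effectiveness ratio; all figures are hypothetical:

```python
# Hedged sketch: discount a stream of costs and QALYs, then compute an
# incremental cost-effectiveness ratio (ICER) between a new intervention and
# a comparator. All numbers are hypothetical.

def discounted_total(values_per_year, rate=0.03):
    """Present value of yearly amounts (costs or QALYs) at a constant rate."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(values_per_year))

# Five-year horizon: yearly costs (in currency units) and QALYs per strategy.
new_costs, new_qalys = [12_000, 2_000, 2_000, 2_000, 2_000], [0.85] * 5
old_costs, old_qalys = [5_000, 1_500, 1_500, 1_500, 1_500], [0.78] * 5

delta_cost = discounted_total(new_costs) - discounted_total(old_costs)
delta_qaly = discounted_total(new_qalys) - discounted_total(old_qalys)
icer = delta_cost / delta_qaly
print(f"ICER = {icer:,.0f} per QALY gained")
```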

  14. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.

  15. Exploring the Central Nervous System: methodological state of the art

    International Nuclear Information System (INIS)

    Darcourt, Jacques; Koulibaly, Pierre-Malick; Migneco, Octave

    2005-01-01

    The analysis of the clinical use of brain SPECT reveals a gap between recently published methodological developments and their current use in clinical practice. We review recent methodological developments that could be useful in three classical clinical applications: the diagnosis of Alzheimer's disease, the evaluation of dopaminergic neurotransmission in Parkinson's disease and the study of epilepsy. In Alzheimer's disease, the methods of spatial standardization and comparison to a normative database are most useful to the least experienced observers, and for this purpose methodological approaches oriented to routine practice work better and are simpler than SPM. Quantification is essential in the study of dopaminergic neurotransmission, where the measurement of binding potential is biased by septal penetration, attenuation, scatter and the partial volume effect. The partial volume effect introduces the largest error, and its correction is difficult because of the co-registration precision required with magnetic resonance images. The study of epilepsy by subtraction of ictal and interictal SPECT has demonstrated its clinical value; it is an image fusion operation with now well-defined methods (au)

  16. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples.

    Science.gov (United States)

    Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S

    2016-03-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).

  17. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples

    Science.gov (United States)

    Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David

    2016-01-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer–based qPCR test for P. destructans to refine quantification capabilities of this assay.

  18. Sustainable Facility Development: Perceived Benefits and Challenges

    Science.gov (United States)

    Stinnett, Brad; Gibson, Fred

    2016-01-01

    Purpose: The purpose of this paper is to assess the perceived benefits and challenges of implementing sustainable initiatives in collegiate recreational sports facilities. Additionally, this paper intends to contribute to the evolving field of facility sustainability in higher education. Design/methodology/approach: The design included qualitative…

  19. Use of the SSHAC methodology within regulated environments: Cost-effective application for seismic characterization at multiple sites

    International Nuclear Information System (INIS)

    Coppersmith, Kevin J.; Bommer, Julian J.

    2012-01-01

    Highlights: ► SSHAC processes provide high levels of regulatory assurance in hazard assessments for purposes of licensing and safety review. ► SSHAC projects provide structure to the evaluation of available data, models, and methods for building hazard input models. ► Experience on several nuclear projects in the past 15 years leads to the identification of key essential procedural steps. ► Conducting a regional SSHAC Level 3 study, followed by Level 2 site-specific studies can be time and cost effective. - Abstract: Essential elements of license applications and safety reviews for nuclear facilities are quantifications of earthquake and other natural hazards. A Senior Seismic Hazard Analysis Committee (SSHAC) Level 3 or 4 process provides regulatory assurance that the hazard assessment considers all data and models proposed by members of the technical community and the associated uncertainties have been properly quantified. The SSHAC process has been endorsed as an acceptable hazard assessment methodology in US NRC regulatory guidance. Where hazard studies are required for multiple sites, regional SSHAC Level 3 or 4 studies followed by site-specific Level 2 refinements can provide major benefits in cost and duration.

  20. [Quantifying the additional clinical benefit of new medicines: little - considerable - significant - 6 remarks from a biometrician's point of view].

    Science.gov (United States)

    Vach, Werner

    2014-11-01

    According to the German Pharmaceutical Market Reorganisation Act [Arzneimittelmarktneuordnungsgesetz (AMNOG)] of 22.12.2010, the benefit assessment of a new drug should include an evaluation of the "degree of additional benefit". A corresponding regulation of the German Ministry of Health states that the quantification of the degree of additional benefit should be made using the terms "major additional benefit", "considerable additional benefit" and "little additional benefit". In September 2011 the IQWiG presented and explained, in Appendix A of its dossier assessment of ticagrelor, an "operationalisation of the extent of additional benefit according to the AM-NutzenV". Therein a distinction was made between the target categories "survival time (mortality)", "serious (or, respectively, severe) symptoms", "quality of life", and "not serious (or, respectively, not severe) symptoms". In the IQWiG's operationalisation, the categorisation of the additional benefit with regard to mortality was addressed by defining threshold values for the upper limit of the 95% confidence interval for the relative risk (RR). The statutory regulations and the operationalisation of the IQWiG will have direct long-term effects on the provision of medical care since they have a say as to which drugs are to be available at which prices. By introducing terms such as "major additional benefit", "considerable additional benefit" or "desired effects" and linking them to statistical parameters and algorithms, they also open a series of further fundamental questions as to whether and how we should handle these terms in the future and what consequences are inherent to the use of statistical criteria in their "definition". In the present article, six of the questions that arise in this context are discussed: Can a "considerable additional benefit" be defined with statistical methods? Can a classification of the additional benefit on the basis of an estimated RR be reliable? What are the fundamental

  1. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
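
    To make the margin-and-uncertainty bookkeeping concrete, the short Python sketch below computes a margin-to-uncertainty ratio when the performance variable is represented only by an interval, the simplest of the alternative (non-probabilistic) representations mentioned above. The interval bounds and the requirement threshold are hypothetical numbers, not values from the presentation.

      # Interval (epistemic) representation of a performance quantity; values hypothetical.
      def interval_margin_uncertainty(perf_interval, threshold):
          """Margin M, uncertainty U and the M/U ratio for a quantity known
          only to lie within perf_interval, against an upper requirement."""
          low, high = perf_interval
          nominal = 0.5 * (low + high)        # midpoint as the nominal estimate
          margin = threshold - nominal        # distance from nominal to the requirement
          uncertainty = high - nominal        # half-width toward the requirement
          return margin, uncertainty, margin / uncertainty

      M, U, ratio = interval_margin_uncertainty((0.60, 0.80), threshold=0.95)
      print(f"M = {M:.2f}, U = {U:.2f}, M/U = {ratio:.1f}")  # M/U > 1: margin exceeds uncertainty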

  2. Cost-benefit

    International Nuclear Information System (INIS)

    1975-01-01

    A critical review is given of the cost-benefit analysis for the LMFBR-type reactor development program presented in an environmental impact statement of the AEC. Several methodological shortcomings are identified. As compared with an HTGR/LWR mix of reactors, the LMFBR-type reactor will not be competitive until U3O8 prices reach a level of $50/lb, which is not likely to happen before the year 2020. It is recommended to review the draft of the AEC document and to include timing as one of the issues. Deferral of the LMFBR-type reactor development program, if necessary, will not be intolerably costly

  3. Processing and quantification of x-ray energy dispersive spectra in the Analytical Electron Microscope

    International Nuclear Information System (INIS)

    Zaluzec, N.J.

    1988-08-01

    Spectral processing in x-ray energy dispersive spectroscopy deals with the extraction of characteristic signals from experimental data. In this text, the four basic procedures for this methodology are reviewed and their limitations outlined. Quantification, on the other hand, deals with the interpretation of the information obtained from spectral processing. Here the limitations are for the most part instrumental in nature. The prospect of higher-voltage operation does not, in theory, present any new problems and may in fact prove to be more desirable, assuming that electron damage effects do not preclude analysis. 28 refs., 6 figs

  4. An approach for quantification of platinum distribution in tissues by LA-ICP-MS imaging using isotope dilution analysis.

    Science.gov (United States)

    Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D

    2018-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has proven to be a convenient technique for trace elemental imaging in tissue sections, providing elemental 2D distribution at a quantitative level. For quantification purposes, several approaches have been proposed in the literature in recent years, such as the use of CRMs or matrix-matched standards. The use of Isotope Dilution (ID) for quantification by LA-ICP-MS has also been described, being mainly useful for bulk analysis but not feasible for spatial measurements so far. In this work, a quantification method based on ID analysis was developed by printing isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs using a commercial ink-jet device, in order to perform an elemental quantification in different areas from bio-images. For the ID experiments, 194Pt-enriched platinum was used. The methodology was validated by deposition of natural Pt standard droplets with a known amount of Pt onto the surface of a control tissue, on which as little as 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in the whole kidney slices was quantified for cisplatin, carboplatin and oxaliplatin-treated rats. The results obtained were in accordance with those previously reported. The amount of Pt distributed between the medullary and cortical areas was also quantified, observing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
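
    The isotope-dilution calculation at the heart of such an approach reduces to a two-isotope mass balance. The Python sketch below illustrates it for a 194Pt-enriched spike; the abundance values are approximate natural and hypothetical spike compositions used for illustration only, not the calibration data of the study.

      def natural_pt_moles(ratio_194_195, n_spike,
                           a194_nat=0.329, a195_nat=0.338,   # approximate natural Pt abundances
                           a194_spk=0.95, a195_spk=0.02):    # hypothetical enriched-spike abundances
          """Moles of natural Pt in an ablated spot from the measured 194Pt/195Pt ratio.

          Derived from the isotope mass balance of the sample/spike mixture:
          R = (n_x*a194_nat + n_s*a194_spk) / (n_x*a195_nat + n_s*a195_spk).
          """
          num = n_spike * (a194_spk - ratio_194_195 * a195_spk)
          den = ratio_194_195 * a195_nat - a194_nat
          return num / den

      # Example: 1e-12 mol of spike printed on the spot, measured ratio of 1.8.
      print(natural_pt_moles(ratio_194_195=1.8, n_spike=1e-12))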

  5. A risk assessment methodology for incorporating uncertainties using fuzzy concepts

    International Nuclear Information System (INIS)

    Cho, Hyo-Nam; Choi, Hyun-Ho; Kim, Yoon-Bae

    2002-01-01

    This paper proposes a new methodology for incorporating uncertainties using fuzzy concepts into conventional risk assessment frameworks. This paper also introduces new forms of fuzzy membership curves, designed to consider the uncertainty range that represents the degree of uncertainty involved in both probabilistic parameter estimates and subjective judgments, since it is often difficult or even impossible to precisely estimate the occurrence rate of an event in terms of one single crisp probability. It is to be noted that simple linguistic variables such as 'High/Low' and 'Good/Bad' have limitations in quantifying the various risks inherent in construction projects, as they adequately represent only subjective mental cognition. Therefore, in this paper, statements that include some quantification by giving a specific value or scale, such as 'close to a given value' or 'higher/lower than the analyzed value', are used to overcome these limitations. It may be stated that the proposed methodology will be very useful for the systematic and rational risk assessment of construction projects
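
    The kind of membership curve the abstract refers to can be illustrated with a few lines of Python; the triangular form and the spread parameter below are illustrative assumptions, not the specific forms proposed in the paper.

      def close_to(value, analyzed, spread):
          """Triangular membership for the statement 'close to the analyzed value':
          1.0 at the analyzed value, decaying linearly to 0 at +/- spread, so the
          spread encodes the uncertainty range attached to the estimate."""
          return max(0.0, 1.0 - abs(value - analyzed) / spread)

      # Occurrence rate judged 'close to 1e-3 per year' with an uncertainty range of 5e-4.
      for rate in (5e-4, 8e-4, 1e-3, 1.4e-3):
          print(rate, round(close_to(rate, analyzed=1e-3, spread=5e-4), 2))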

  6. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    , multiple development goals can be reinforced by specific climate funding granted on the basis of multiple benefits and synergies, for instance through currently negotiated mechanisms such as Nationally Appropriate Mitigation Actions (NAMAs) (REDD+, Kissinger et al 2012). 3. Challenges to quantifying GHG information for the agricultural sector: The quantification of GHG emissions from agriculture is fundamental to identifying mitigation solutions that are consistent with the goals of achieving greater resilience in production systems, food security, and rural welfare. GHG emissions data are already needed for such varied purposes as guiding national planning for low-emissions development, generating and trading carbon credits, certifying sustainable agriculture practices, informing consumers' choices with regard to reducing their carbon footprints, assessing product supply chains, and supporting farmers in adopting less carbon-intensive farming practices. Demonstrating the robustness, feasibility, and cost effectiveness of agricultural GHG inventories and monitoring is a necessary technical foundation for including agriculture in the international negotiations under the United Nations Framework Convention on Climate Change (UNFCCC), and is needed to provide robust data and methodology platforms for global corporate supply-chain initiatives (e.g., SAFA, FAO 2012). Given such varied drivers for GHG reductions, there are a number of uses for agricultural GHG information, including (1) reporting and accounting at the national or company level, (2) land-use planning and management to achieve specific objectives, (3) monitoring and evaluating impact of management, (4) developing a credible and thus tradable offset credit, and (5) research and capacity development. The information needs for these uses are likely to differ in the required level of certainty, scale of analysis, and need for comparability across systems or repeatability over time, and they may depend on whether

  7. COMPARING AND CONTRASTING THE ALTERNATIVE METHODOLOGIES AVAILABLE FOR EVALUATING THE IMPACT OF TOURISM

    Directory of Open Access Journals (Sweden)

    Silvana DJURASEVIC

    2007-06-01

    Full Text Available Tourism has impacts upon a destination country. The aim of this work is to compare and contrast the alternative methodologies available for evaluating the impact of tourism. Tourism can be one of the alternatives for the development of a destination, and sometimes the only possibility. For that reason it is very important to compare the benefits and costs that tourism brings with the corresponding values of alternative investments. The results obtained represent a very important input for planning and also for policy decision making. Different methodologies bring different results, and different techniques have their own strengths and weaknesses. For that reason, depending on the need, it is important to combine methodologies in order to achieve the maximal benefit and minimal costs from the economic, socio-cultural and environmental perspectives.

  8. Theoretical and methodological aspects of assessing economic effectiveness of nuclear power plant construction using cost-benefit analysis

    International Nuclear Information System (INIS)

    Moravcik, A.

    1984-01-01

    The cost benefit of investments is divided into social and economic benefits. The postulates are discussed for the assessment of the cost benefit of the capital costs of nuclear power plants. Relations are given for the total cost benefit of capital costs, expressed by the total profit rate of capital costs, and for the absolute effectiveness, expressed by the socio-economic benefit of capital costs. The absolute cost benefit of capital costs is characterized by several complex indexes. Comparable capital cost benefit is used for assessing the effectiveness of interchangeable solution variants. The minimum calculated costs serve as the criterion for selecting the optimal variant. (E.S.)

  9. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    Science.gov (United States)

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
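
    For readers unfamiliar with the QI toolkit, the Python sketch below shows the individuals (XmR) control-chart calculation that underlies this type of analysis: limits are built from a baseline period and later points are checked against them in time order. The length-of-stay values are synthetic, not the trial's data.

      import numpy as np

      def xmr_limits(values):
          """Individuals (XmR) chart: centre line and control limits from the mean
          moving range, using the standard XmR constant 2.66."""
          x = np.asarray(values, dtype=float)
          moving_range = np.abs(np.diff(x))
          centre = x.mean()
          half_width = 2.66 * moving_range.mean()
          return centre, centre - half_width, centre + half_width

      # Synthetic monthly mean ED length of stay (minutes); change introduced after month 5.
      los = [182, 175, 190, 178, 185, 160, 152, 149, 155, 147, 150]
      cl, lcl, ucl = xmr_limits(los[:5])            # limits from the baseline period
      signals = [m for m in los[5:] if m < lcl]     # points below the lower limit signal improvement
      print(f"CL={cl:.0f}, LCL={lcl:.0f}, UCL={ucl:.0f}, signals={signals}")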

  10. Behavioural Insights into Benefits Claimants' Training

    Science.gov (United States)

    Gloster, Rosie; Buzzeo, Jonathan; Cox, Annette; Bertram, Christine; Tassinari, Arianna; Schmidtke, Kelly Ann; Vlaev, Ivo

    2018-01-01

    Purpose: The purpose of this paper is to explore the behavioural determinants of work-related benefits claimants' training behaviours and to suggest ways to improve claimants' compliance with training referrals. Design/methodology/approach: Qualitative interviews were conducted with 20 Jobcentre Plus staff and training providers, and 60 claimants.…

  11. Module-based Hybrid Uncertainty Quantification for Multi-physics Applications: Theory and Software

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chen, Xiao [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Iaccarino, Gianluca [Stanford Univ., CA (United States); Mittal, Akshay [Stanford Univ., CA (United States)

    2013-10-08

    In this project we proposed to develop an innovative uncertainty quantification methodology that captures the best of the two competing approaches in UQ, namely, intrusive and non-intrusive approaches. The idea is to develop the mathematics and the associated computational framework and algorithms to facilitate the use of intrusive or non-intrusive UQ methods in different modules of a multi-physics multi-module simulation model in a way that physics code developers for different modules are shielded (as much as possible) from the chores of accounting for the uncertainties introduced by the other modules. As a result of our research and development, we have produced a number of publications, conference presentations, and a software product.

  12. Methodological Challenges to Economic Evaluations of Vaccines: Is a Common Approach Still Possible?

    Science.gov (United States)

    Jit, Mark; Hutubessy, Raymond

    2016-06-01

    Economic evaluation of vaccination is a key tool to inform effective spending on vaccines. However, many evaluations have been criticised for failing to capture features of vaccines which are relevant to decision makers. These include broader societal benefits (such as improved educational achievement, economic growth and political stability), reduced health disparities, medical innovation, reduced pressure on hospital beds, greater peace of mind and synergies in economic benefits with non-vaccine interventions. Also, the fiscal implications of vaccination programmes are not always made explicit. Alternative methodological frameworks have been proposed to better capture these benefits. However, any broadening of the methodology for economic evaluation must also involve evaluations of non-vaccine interventions, and hence may not always benefit vaccines given a fixed health-care budget. The scope of an economic evaluation must consider the budget from which vaccines are funded, and the aims that the decision-maker intends that spending to achieve.

  13. A Study on an Accident Diagnosis Methodology Using Influence Diagrams

    International Nuclear Information System (INIS)

    Kang, Kyungmin; Jae, Moosung

    2006-01-01

    For nuclear power plants, EOPs help operators to diagnose, control and mitigate accidents. However, it is very difficult for operators to follow the appropriate EOPs for accidents with similar symptoms within a short period of time. EOPs are also complicated to follow and involve many procedural steps. If operators cannot diagnose correctly, the accident may become severe. Correct diagnostic action depends on the decision-making ability of operators. Therefore, a methodology that can diagnose accidents quickly and help operators follow the appropriate procedures should be developed. Due to the complexity of the tasks, it is very important to reduce human errors during diagnostic actions. In this study, to minimize human errors, an accident diagnosis model has been constructed based on EOPs, accident symptoms and component reliabilities. Influence diagrams were applied to construct the model. This decision-making tool consists of nodes and arcs and is applicable to complicated situations, such as those encountered in developing strategies for managing severe accidents in nuclear power plants. The model was quantified using the total probability theorem and Bayes' theorem. Through this quantification, the results should help operators diagnose complex situations.
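
    The quantification step mentioned above (total probability plus Bayes' theorem) can be illustrated with a toy symptom table in Python; the initiating events, symptoms and probabilities below are invented for illustration and are not taken from the study.

      # Posterior probability of each accident class given observed symptoms,
      # treating symptoms as conditionally independent given the accident.
      priors = {"LOCA": 0.4, "SGTR": 0.3, "MSLB": 0.3}              # hypothetical initiating events
      p_symptom = {                                                 # P(symptom | accident), illustrative
          "LOCA": {"low_pzr_pressure": 0.95, "high_sg_radiation": 0.05},
          "SGTR": {"low_pzr_pressure": 0.70, "high_sg_radiation": 0.90},
          "MSLB": {"low_pzr_pressure": 0.60, "high_sg_radiation": 0.10},
      }
      observed = ["low_pzr_pressure", "high_sg_radiation"]

      joint = {a: priors[a] for a in priors}
      for a in joint:
          for s in observed:
              joint[a] *= p_symptom[a][s]
      evidence = sum(joint.values())                                # total probability theorem
      posterior = {a: joint[a] / evidence for a in joint}
      print(max(posterior, key=posterior.get), posterior)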

  14. Uncertainty Quantification given Discontinuous Climate Model Response and a Limited Number of Model Runs

    Science.gov (United States)

    Sargsyan, K.; Safta, C.; Debusschere, B.; Najm, H.

    2010-12-01

    Uncertainty quantification in complex climate models is challenged by the sparsity of available climate model predictions due to the high computational cost of model runs. Another feature that prevents classical uncertainty analysis from being readily applicable is bifurcative behavior in climate model response with respect to certain input parameters. A typical example is the Atlantic Meridional Overturning Circulation. The predicted maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We outline a methodology for uncertainty quantification given discontinuous model response and a limited number of model runs. Our approach is two-fold. First we detect the discontinuity with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve shape and location for arbitrarily distributed input parameter values. Then, we construct spectral representations of uncertainty, using Polynomial Chaos (PC) expansions on either side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification. The approach is enabled by a Rosenblatt transformation that maps each side of the discontinuity to regular domains where desirable orthogonality properties for the spectral bases hold. We obtain PC modes by either orthogonal projection or Bayesian inference, and argue for a hybrid approach that targets a balance between the accuracy provided by the orthogonal projection and the flexibility provided by the Bayesian inference - where the latter allows obtaining reasonable expansions without extra forward model runs. The model output, and its associated uncertainty at specific design points, are then computed by taking an ensemble average over PC expansions corresponding to possible realizations of the discontinuity curve. The methodology is tested on synthetic examples of
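
    A drastically simplified, one-parameter version of the idea, detecting the jump from a handful of runs and fitting separate low-order surrogates on each side, can be written in a few lines of Python. It is only a stand-in for, and should not be confused with, the Bayesian discontinuity inference and Polynomial Chaos machinery described above.

      import numpy as np

      def expensive_model(c):           # stand-in model with a jump at c = 0.6
          return (1.0 + 2.0 * c) + (8.0 if c > 0.6 else 0.0)

      c_runs = np.linspace(0.0, 1.0, 12)                   # limited number of model runs
      y_runs = np.array([expensive_model(c) for c in c_runs])

      # Locate the discontinuity at the largest jump between neighbouring runs.
      j = int(np.argmax(np.abs(np.diff(y_runs))))
      c_split = 0.5 * (c_runs[j] + c_runs[j + 1])

      # Fit separate low-order surrogates on each side of the detected discontinuity.
      left = np.polyfit(c_runs[:j + 1], y_runs[:j + 1], 1)
      right = np.polyfit(c_runs[j + 1:], y_runs[j + 1:], 1)

      def surrogate(c):
          return np.polyval(left if c <= c_split else right, c)

      print(round(c_split, 2), surrogate(0.3), surrogate(0.9))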

  15. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    Science.gov (United States)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
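
    The Monte Carlo idea is easy to prototype. The Python sketch below uses a toy linear forward model and a least-squares "retrieval"; the matrix, true state and noise level are invented for illustration and bear no relation to the OCO-2 algorithms.

      import numpy as np

      rng = np.random.default_rng(42)

      # Toy forward model: radiance = A @ state + instrument noise.
      A = np.array([[1.0, 0.4], [0.2, 1.3], [0.7, 0.9]])
      true_state = np.array([2.0, 1.0])
      noise_sd = 0.05

      def retrieve(radiance):
          """Stand-in retrieval: least squares against the linear forward model."""
          return np.linalg.lstsq(A, radiance, rcond=None)[0]

      # Monte Carlo approximation of the sampling distribution of the retrieval.
      estimates = np.array([
          retrieve(A @ true_state + rng.normal(0.0, noise_sd, size=3))
          for _ in range(5000)
      ])
      print("bias:", np.round(estimates.mean(axis=0) - true_state, 4),
            "1-sigma spread:", np.round(estimates.std(axis=0), 4))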

  16. Methodology Declassification of Impacted Buildings. Application of Technology MARSSIM

    International Nuclear Information System (INIS)

    Vico, A.M.; Álvarez, A.; Gómez, J.M.; Quiñones, J.

    2015-01-01

    This work describes the material measurement methodology used to assure the absence of contamination in impacted buildings due to processes related to the first part of the nuclear fuel cycle performed at the former Junta de Energía Nuclear, JEN, currently Centro de Investigaciones Energéticas Medioambientales y Tecnológicas, CIEMAT. The first part of the work covers the identification and quantification of natural isotopes and their proportions on the studied surfaces through different analytical techniques. The experimental study involved the selection of suitable equipment to carry out the field measurements and the characterization of uranium isotopes and their immediate descendants. According to European Union recommendations and specifications established for CIEMAT by the CSN (Consejo de Seguridad Nuclear), the Spanish regulatory authority, surface activity reference levels have been established which allow one to decide whether a surface can be classified as a conventional surface. In order to make decisions about compliance with the established clearance criteria, the MARSSIM methodology is applied using the results obtained from field measurements (impacted and non-impacted surfaces).

  17. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, Cetin; Williams, Brian; McClure, Patrick; Nelson, Ralph A.

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. Cost-saving goals of programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M and S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for

  18. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory; Williams, Brian [Los Alamos National Laboratory; Mc Clure, Patrick [Los Alamos National Laboratory; Nelson, Ralph A [IDAHO NATIONAL LAB

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. Cost-saving goals of programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost

  19. Risk-benefit evaluation for large technological systems

    International Nuclear Information System (INIS)

    Okrent, D.

    1979-01-01

    The related topics of risk-benefit analysis, risk analysis, and risk-acceptance criteria (How safe is safe enough?) are of growing importance. An interdisciplinary study on various aspects of these topics, including applications to nuclear power, was recently completed at the University of California, Los Angeles (UCLA), with the support of the National Science Foundation. In addition to more than 30 topical reports and various open-literature publications, a final report (UCLA-ENG-7777) to the study, titled "A Generalized Evaluation Approach to Risk-Benefit for Large Technological Systems and Its Application to Nuclear Power", was issued in early 1978. This article briefly summarizes portions of the final report dealing with general aspects of risk-benefit methodology, societal knowledge and perception of risk, and risk-acceptance criteria

  20. Improving perfusion quantification in arterial spin labeling for delayed arrival times by using optimized acquisition schemes

    International Nuclear Information System (INIS)

    Kramme, Johanna; Diehl, Volker; Madai, Vince I.; Sobesky, Jan; Guenther, Matthias

    2015-01-01

    The improvement in Arterial Spin Labeling (ASL) perfusion quantification, especially for delayed bolus arrival times (BAT), with an acquisition redistribution scheme mitigating the T1 decay of the label in multi-TI ASL measurements is investigated. A multi-inflow-time (TI) 3D-GRASE sequence is presented which adapts the distribution of acquisitions accordingly while keeping the scan time constant. The MR sequence increases the number of averages at long TIs and decreases their number at short TIs, thus compensating for the T1 decay of the label. The improvement of perfusion quantification is evaluated in simulations as well as in vivo in healthy volunteers and patients with prolonged BATs due to age or steno-occlusive disease. The improvement in perfusion quantification depends on BAT. At healthy BATs the differences are small, but become larger for longer BATs typically found in certain diseases. The relative error of perfusion is improved by up to 30% at BATs > 1500 ms in comparison to the standard acquisition scheme. This adapted acquisition scheme improves the perfusion measurement in comparison to standard multi-TI ASL implementations. It provides a relevant benefit in clinical conditions that cause prolonged BATs and is therefore of high clinical relevance for neuroimaging of steno-occlusive diseases.
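
    The redistribution principle can be sketched in a few lines of Python: the number of averages per inflow time grows roughly as exp(TI/T1) so that the label's T1 decay is offset while the total number of acquisitions, and hence the scan time, stays fixed. The TI schedule and blood T1 value below are illustrative assumptions, not the sequence parameters of the study.

      import numpy as np

      def redistribute_averages(ti_ms, total_acquisitions, t1_blood_ms=1650.0):
          """Allocate averages across inflow times in proportion to exp(TI/T1):
          later TIs, where the label has decayed most, receive more averages while
          the total number of acquisitions (scan time) is held constant."""
          ti = np.asarray(ti_ms, dtype=float)
          weights = np.exp(ti / t1_blood_ms)
          return np.round(total_acquisitions * weights / weights.sum()).astype(int)

      ti_schedule = [300, 800, 1300, 1800, 2300, 2800]     # ms, illustrative multi-TI schedule
      print(redistribute_averages(ti_schedule, total_acquisitions=60))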

  1. An Alternative to the Carlson-Parkin Method for the Quantification of Qualitative Inflation Expectations: Evidence from the Ifo World Economic Survey

    OpenAIRE

    Henzel, Steffen; Wollmershäuser, Timo

    2005-01-01

    This paper presents a new methodology for the quantification of qualitative survey data. Traditional conversion methods, such as the probability approach of Carlson and Parkin (1975) or the time-varying parameters model of Seitz (1988), require very restrictive assumptions concerning the expectations formation process of survey respondents. Above all, the unbiasedness of expectations, which is a necessary condition for rationality, is imposed. Our approach avoids these assumptions. The novelt...

  2. A Novel Water Supply Network Sectorization Methodology Based on a Complete Economic Analysis, Including Uncertainties

    Directory of Open Access Journals (Sweden)

    Enrique Campbell

    2016-04-01

    Full Text Available The core idea behind sectorization of Water Supply Networks (WSNs) is to establish areas partially isolated from the rest of the network to improve operational control. Besides the benefits associated with sectorization, some drawbacks must be taken into consideration by water operators: the economic investment associated with both boundary valves and flowmeters and the reduction of both pressure and system resilience. The target of sectorization is to properly balance these negative and positive aspects. Sectorization methodologies addressing the economic aspects mainly consider costs of valves and flowmeters and of energy, and the benefits in terms of water saving linked to pressure reduction. However, sectorization entails other benefits, such as the reduction of domestic consumption, the reduction of burst frequency and the enhanced capacity to detect and intervene in future leakage events. We implement a development proposed by the International Water Association (IWA) to estimate the aforementioned benefits. Such a development is integrated into a novel sectorization methodology based on a social network community detection algorithm, combined with a genetic algorithm optimization method and Monte Carlo simulation. The methodology is implemented over a fraction of the WSN of Managua city, capital of Nicaragua, generating a net benefit of 25,572 $/year.

  3. Radiation protection optimization using a knowledge based methodology

    International Nuclear Information System (INIS)

    Reyes-Jimenez, J.; Tsoukalas, L.H.

    1991-01-01

    This paper presents a knowledge based methodology for radiological planning and radiation protection optimization. The cost-benefit methodology described in International Commission on Radiological Protection (ICRP) Report No. 37 is employed within a knowledge based framework for the purpose of planning maintenance activities while optimizing radiation protection. The methodology is demonstrated through an application to a heating, ventilation and air conditioning (HVAC) system. HVAC is used to reduce radioactivity concentration levels in selected contaminated multi-compartment models at nuclear power plants when higher than normal radiation levels are detected. The overall objective is to reduce personnel exposure resulting from airborne radioactivity when routine or maintenance access is required in contaminated areas. 2 figs, 15 refs

  4. Wider benefits of adult education

    DEFF Research Database (Denmark)

    Schuller, Tom; Desjardins, Richard

    2010-01-01

    This article discusses the measurement of the social outcomes of learning. It extends the discussion beyond employment and labor market outcomes to consider the impact of adult learning on social domains, with particular focus on health and civic engagement. It emphasizes the distinction between public and private, and monetary and nonmonetary benefits. It reviews methodological issues on measuring outcomes, and identifies a number of channels through which adult learning has its effects.

  5. Application of decision-making methodology to certificate-of-need applications for CT scanners

    International Nuclear Information System (INIS)

    Gottinger, H.W.; Shapiro, P.

    1985-01-01

    This paper describes a case study and application of decision-making methodology to two competing Certificate of Need (CON) applications for CT body scanners. We demonstrate the use of decision-making methodology by evaluating the CON applications. Explicit value judgements reflecting the monetary equivalent of the different categories of benefit are introduced to facilitate this comparison. The difference between the benefits (measured in monetary terms) and costs is called the net social value. Any alternative with positive net social value is judged economically justifiable, and the alternative with the greatest net social value is judged the most attractive. (orig.)

  6. Strategy study of quantification harmonization of SUV in PET/CT images; Estudo da estrategia de harmonizacao da quantificacao do SUV em imagens de PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-07-01

    and quantitative assessments in different scopes. We concluded that the harmonization strategy for SUV quantification presented in this paper was effective in reducing the variability of the quantification of small structures. However, for the comparison of SUV quantification between different scanners and institutions, it is essential that, in addition to the harmonization of quantification, the standardization of the patient preparation methodology be maintained, in order to minimize the SUV variability due to biological factors. (author)

  7. Probabilistic risk assessment modeling of digital instrumentation and control systems using two dynamic methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, T., E-mail: aldemir.1@osu.ed [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Guarro, S. [ASCA, Inc., 1720 S. Catalina Avenue, Suite 220, Redondo Beach, CA 90277-5501 (United States); Mandelli, D. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Kirschenbaum, J. [Ohio State University, Department of Computer Science and Engineering, Columbus, OH 43210 (United States); Mangan, L.A. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Bucci, P. [Ohio State University, Department of Computer Science and Engineering, Columbus, OH 43210 (United States); Yau, M. [ASCA, Inc., 1720 S. Catalina Avenue, Suite 220, Redondo Beach, CA 90277-5501 (United States); Ekici, E. [Ohio State University, Department of Electrical and Computer Engineering, Columbus, OH 43210 (United States); Miller, D.W.; Sun, X. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Arndt, S.A. [U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001 (United States)

    2010-10-15

    The Markov/cell-to-cell mapping technique (CCMT) and the dynamic flowgraph methodology (DFM) are two system logic modeling methodologies that have been proposed to address the dynamic characteristics of digital instrumentation and control (I and C) systems and provide risk-analytical capabilities that supplement those provided by traditional probabilistic risk assessment (PRA) techniques for nuclear power plants. Both methodologies utilize a discrete state, multi-valued logic representation of the digital I and C system. For probabilistic quantification purposes, both techniques require the estimation of the probabilities of basic system failure modes, including digital I and C software failure modes, that appear in the prime implicants identified as contributors to a given system event of interest. As in any other system modeling process, the accuracy and predictive value of the models produced by the two techniques depend not only on the intrinsic features of the modeling paradigm, but also, to a considerable extent, on the information and knowledge available to the analyst concerning the system behavior and operation rules under normal and off-nominal conditions, and the associated controlled/monitored process dynamics. The application of the two methodologies is illustrated using a digital feedwater control system (DFWCS) similar to that of an operating pressurized water reactor. This application was carried out to demonstrate how the use of either technique, or both, can facilitate the updating of an existing nuclear power plant PRA model following an upgrade of the instrumentation and control system from analog to digital. Because of scope limitations, the focus of the demonstration of the methodologies was intentionally limited to aspects of digital I and C system behavior for which probabilistic data was on hand or could be generated within the existing project bounds of time and resources. The data used in the probabilistic quantification portion of the

  8. Development of a methodology for conducting an integrated HRA/PRA --

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. (Brookhaven National Lab., Upton, NY (United States)); Wreathall, J. (Wreathall (John) and Co., Dublin, OH (United States)); Cooper, S.E. (Science Applications International Corp., McLean, VA (United States))

    1993-01-01

    During Low Power and Shutdown (LP S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10--15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP S, (2) identification of potentially important LP S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP S conditions for a pressurized water reactor (PWR).

  9. Methodology to estimate the cost of the severe accidents risk / maximum benefit; Metodologia para estimar el costo del riesgo de accidentes severos / beneficio maximo

    Energy Technology Data Exchange (ETDEWEB)

    Mendoza, G.; Flores, R. M.; Vega, E., E-mail: gozalo.mendoza@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2016-09-15

    For programs and activities to manage aging effects, any changes to plant operations, inspections, maintenance activities, systems and administrative control procedures during the renewal period that could impact the environment should be characterized and designed to manage the effects of aging as required by 10 CFR Part 54. Environmental impacts significantly different from those described in the final environmental statement for the current operating license should be described in detail. When complying with the requirements of a license renewal application, the Severe Accident Mitigation Alternatives (SAMA) analysis is contained in a supplement to the environmental report of the plant that meets the requirements of 10 CFR Part 51. In this paper, the methodology for estimating the cost of severe accident risk is established and discussed; it is then used to identify and select alternatives for severe accident mitigation, which are analyzed to estimate the maximum benefit that an alternative could achieve if it eliminated all risk. The cost of severe accident risk is estimated using the regulatory analysis techniques of the US Nuclear Regulatory Commission (NRC). The ultimate goal of implementing the methodology is to identify SAMA candidates that have the potential to reduce the severe accident risk and to determine whether the implementation of each candidate is cost-effective. (Author)
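
    A simplified version of the "maximum benefit" calculation can be written down directly: if a candidate eliminated all severe-accident risk, its benefit could at most equal the present value of the annual accident cost-risk over the remaining licence term. The Python sketch below follows the general structure of NRC-style regulatory cost-risk analysis; the discount rate, term and annual cost-risk are hypothetical numbers, not values from the paper.

      import math

      def max_averted_cost_risk(annual_cost_risk_usd, years_remaining, discount_rate=0.07):
          """Maximum attainable benefit of a mitigation alternative: the discounted
          value of removing the entire annual severe-accident cost-risk over the
          remaining licence term (continuous discounting)."""
          r, t = discount_rate, years_remaining
          present_value_factor = (1.0 - math.exp(-r * t)) / r
          return annual_cost_risk_usd * present_value_factor

      # Hypothetical: 12,000 $/yr of cost-risk (offsite dose, cleanup, etc.), 20 years left.
      print(f"maximum benefit = {max_averted_cost_risk(12000, 20):,.0f} $")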

  10. Validation of an HPLC method for quantification of total quercetin in Calendula officinalis extracts

    International Nuclear Information System (INIS)

    Muñoz Muñoz, John Alexander; Morgan Machado, Jorge Enrique; Trujillo González, Mary

    2015-01-01

    Introduction: Calendula officinalis extracts are used as natural raw material in a wide range of pharmaceutical and cosmetic preparations; however, there are no official methods for quality control of these extracts. Objective: to validate an HPLC-based analytical method for the quantification of total quercetin in glycolic and hydroalcoholic extracts of Calendula officinalis. Methods: to quantify total quercetin content in the matrices, it was necessary to hydrolyze flavonoid glycosides under optimal conditions. The chromatographic separation was performed on a C-18 SiliaChrom 4.6x150 mm 5 µm column, fitted with a SiliaChrom 5 µm C-18 4.6x10 mm precolumn, with UV detection at 370 nm. The gradient elution was performed with a mobile phase consisting of methanol (MeOH) and phosphoric acid (H3PO4) (0.08 % w/v). The quantification was performed through the external standard method and comparison with a quercetin reference standard. Results: the selectivity study of the method against extract components and degradation products under acid/basic hydrolysis, oxidation and light exposure conditions showed no signals interfering with the quercetin quantification. It was statistically proved that the method is linear from 1.0 to 5.0 mg/mL. Intermediate precision expressed as a coefficient of variation was 1.8 and 1.74 %, and the recovery percentage was 102.15 and 101.32 %, for glycolic and hydroalcoholic extracts, respectively. Conclusions: the suggested methodology meets the quality parameters required for quantifying total quercetin, which makes it a useful tool for quality control of C. officinalis extracts. (author)
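
    The external-standard calculation behind such a method is a linear calibration followed by interpolation. The Python sketch below illustrates it; the standard concentrations, peak areas and sample area are invented numbers, not the validation data reported here.

      import numpy as np

      # Calibration with a quercetin reference standard (external standard method);
      # concentrations in mg/mL and peak areas are illustrative values only.
      std_conc = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
      std_area = np.array([152.0, 305.0, 451.0, 607.0, 753.0])

      slope, intercept = np.polyfit(std_conc, std_area, 1)
      r2 = np.corrcoef(std_conc, std_area)[0, 1] ** 2           # linearity check

      def total_quercetin(sample_area, dilution_factor=1.0):
          """Total quercetin concentration in the hydrolysed extract (mg/mL)."""
          return (sample_area - intercept) / slope * dilution_factor

      print(f"R^2 = {r2:.4f}, sample = {total_quercetin(389.0):.2f} mg/mL")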

  11. Overview of hybrid subspace methods for uncertainty quantification, sensitivity analysis

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Bang, Youngsuk; Wang, Congjian

    2013-01-01

    Highlights: ► We overview the state-of-the-art in uncertainty quantification and sensitivity analysis. ► We overview new developments in above areas using hybrid methods. ► We give a tutorial introduction to above areas and the new developments. ► Hybrid methods address the explosion in dimensionality in nonlinear models. ► Representative numerical experiments are given. -- Abstract: The role of modeling and simulation has been heavily promoted in recent years to improve understanding of complex engineering systems. To realize the benefits of modeling and simulation, concerted efforts in the areas of uncertainty quantification and sensitivity analysis are required. The manuscript intends to serve as a pedagogical presentation of the material to young researchers and practitioners with little background on the subjects. We believe this is important as the role of these subjects is expected to be integral to the design, safety, and operation of existing as well as next generation reactors. In addition to covering the basics, an overview of the current state-of-the-art will be given with particular emphasis on the challenges pertaining to nuclear reactor modeling. The second objective will focus on presenting our own development of hybrid subspace methods intended to address the explosion in the computational overhead required when handling real-world complex engineering systems.

  12. Benefit-analysis of accomplishments from the magnetic fusion energy (MFE) research program

    International Nuclear Information System (INIS)

    Lago, A.M.; Weinblatt, H.; Hamilton, E.E.

    1987-01-01

    This report presents the results of a study commissioned by the US Department of Energy's (DOE) Office of Program Analysis to examine benefits from selected accomplishments of DOE's Magnetic Fusion Energy (MFE) Research Program. The study objectives are presented. The MFE-induced innovation and accomplishments which were studied are listed. Finally, the benefit estimation methodology used is described in detail. The next seven chapters document the results of benefit estimation for the MFE accomplishments studied

  13. Validation of a spectrophotometric methodology for the quantification of polysaccharides from roots of Operculina macrocarpa (jalapa

    Directory of Open Access Journals (Sweden)

    Marcos A.M. Galvão

    Full Text Available The roots from Operculina macrocarpa (L.) Urb., Convolvulaceae, are widely used in Brazilian traditional medicine as a laxative and purgative. The biological properties of this drug material have been attributed to its polysaccharide content. Thus, the aim of this study was to evaluate the polysaccharide content in drug material from O. macrocarpa by spectrophotometric quantitative analysis. The root was used as plant material and the botanical identification was performed by macroscopic and microscopic analysis. The plant material was used to validate the spectrophotometric procedure at 490 nm for the quantification of the reaction product of drug polysaccharides and phenol-sulfuric acid solution. The analytical procedure was evaluated in order to comply with the necessary legal requirements by the determination of the following parameters: specificity, linearity, selectivity, precision, accuracy and robustness. This study provides a simple and valid analytical procedure (linear, precise, accurate and reproducible), which can be satisfactorily used for quality control and standardization of the herbal drug from O. macrocarpa.

  14. A multifractal approach to space-filling recovery for PET quantification

    Energy Technology Data Exchange (ETDEWEB)

    Willaime, Julien M. Y., E-mail: julien.willaime@siemens.com; Aboagye, Eric O. [Comprehensive Cancer Imaging Centre, Imperial College London, Hammersmith Hospital, London W12 0NN (United Kingdom); Tsoumpas, Charalampos [Division of Medical Physics, University of Leeds, LS2 9JT (United Kingdom); Turkheimer, Federico E. [Department of Neuroimaging, Institute of Psychiatry, King’s College London, London SE5 8AF (United Kingdom)

    2014-11-01

    Purpose: A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). Methods: A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUVmean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic 18F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical 18F-fluorothymidine PET test–retest dataset. Results: TLA estimates were stable for a range of resolutions typical in PET oncology (4–6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUVmean or TV measurements across imaging protocols. Conclusions: The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
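
    The space-filling index itself can be estimated with a standard box-counting procedure, shown below in Python for a 2D binary lesion mask; this is a schematic stand-in for the multifractal implementation described in the record, and the synthetic disc is used only to check that a filled object yields a dimension close to 2.

      import numpy as np

      def box_counting_dimension(mask):
          """Box-counting (fractal) dimension of a 2D binary mask: count occupied
          boxes N(s) at dyadic box sizes s and fit log N(s) against log(1/s)."""
          mask = np.asarray(mask, dtype=bool)
          scales, counts = [], []
          s = min(mask.shape) // 2
          while s >= 1:
              n = sum(
                  mask[i:i + s, j:j + s].any()
                  for i in range(0, mask.shape[0], s)
                  for j in range(0, mask.shape[1], s)
              )
              scales.append(s)
              counts.append(n)
              s //= 2
          slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
          return slope

      # Synthetic filled disc: the estimated dimension should be close to 2.
      y, x = np.ogrid[:64, :64]
      disc = (x - 32) ** 2 + (y - 32) ** 2 <= 20 ** 2
      print(round(box_counting_dimension(disc), 2))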

  15. The benefits of customer profitability analysis in the hospitality industry

    Directory of Open Access Journals (Sweden)

    Dragan Georgiev

    2017-03-01

    Full Text Available The article reveals the benefits of implementing customer profitability analysis given the specifics of the hotel product and the state of management accounting in hotels. On this basis, the article substantiates the need for management accounting and information systems in hotels to be adapted in advance and developed in line with the objectives and methodological tools of customer profitability analysis, while keeping their function of collecting information on operating revenues and costs by responsibility centers. A model for customer profitability analysis based on the ABC method is proposed in this connection, with an example provided to clarify its methodological aspects and benefits. The latter consist in providing information for a variety of management decisions regarding costs, product mix, pricing, performance measurement and the implementation of various marketing initiatives.
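
    The mechanics of such an ABC-based customer profitability model are simple to illustrate in Python: activity costs are traced to a guest segment through cost-driver rates and subtracted from the segment's revenue. The activities, rates and segment data below are invented for illustration and are not from the article's example.

      # Activity-based customer profitability for one guest segment (hypothetical data).
      activity_rates = {                      # cost per unit of cost driver
          "reservation_handling": 4.0,        # per booking
          "housekeeping": 18.0,               # per occupied room-night
          "front_desk_service": 2.5,          # per service request
      }
      segment = {
          "revenue": 52000.0,
          "drivers": {"reservation_handling": 310, "housekeeping": 940, "front_desk_service": 620},
      }

      activity_cost = sum(rate * segment["drivers"][act] for act, rate in activity_rates.items())
      profit = segment["revenue"] - activity_cost
      print(f"ABC cost = {activity_cost:.0f}, customer profit = {profit:.0f}")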

  16. Caffeine as an indicator for the quantification of untreated wastewater in karst systems.

    Science.gov (United States)

    Hillebrand, Olav; Nödler, Karsten; Licha, Tobias; Sauter, Martin; Geyer, Tobias

    2012-02-01

    Contamination from untreated wastewater leakage and related bacterial contamination poses a threat to drinking water quality. However, a quantification of the magnitude of leakage is difficult. The objective of this work is to provide a highly sensitive methodology for the estimation of the mass of untreated wastewater entering karst aquifers with rapid recharge. For this purpose a balance approach is adapted. It is based on the mass flow of caffeine in spring water, the load of caffeine in untreated wastewater and the daily water consumption per person in a spring catchment area. Caffeine is a source-specific indicator for wastewater, consumed and discharged in quantities allowing detection in a karst spring. The methodology was applied to estimate the amount of leaking and infiltrating wastewater to a well-investigated karst aquifer on a daily basis. The calculated mean volume of untreated wastewater entering the aquifer was found to be 2.2 ± 0.5 m³ d⁻¹ (undiluted wastewater). It corresponds to approximately 0.4% of the total amount of wastewater within the spring catchment. Copyright © 2011 Elsevier Ltd. All rights reserved.
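    The balance approach lends itself to a very short calculation: the caffeine mass flow at the spring divided by the caffeine concentration of undiluted wastewater (per-capita caffeine load over per-capita water use) gives the wastewater volume. The sketch below illustrates this arithmetic with made-up numbers; it does not reproduce the study's input values.

        # Hypothetical inputs -- not the figures used in the cited study.
        spring_discharge_m3_per_d = 8000.0       # spring discharge
        caffeine_spring_ug_per_L = 0.05          # caffeine concentration measured at the spring
        caffeine_load_mg_per_person_d = 16.0     # per-capita caffeine discharged to wastewater
        water_use_L_per_person_d = 120.0         # per-capita wastewater production

        # Caffeine mass flow at the spring (mg/d); the L/m3 and ug/mg factors cancel.
        caffeine_flux_mg_per_d = caffeine_spring_ug_per_L * spring_discharge_m3_per_d

        # Caffeine concentration of undiluted wastewater (mg/L).
        caffeine_wastewater_mg_per_L = caffeine_load_mg_per_person_d / water_use_L_per_person_d

        # Volume of undiluted wastewater required to carry that flux (m3/d).
        wastewater_m3_per_d = caffeine_flux_mg_per_d / caffeine_wastewater_mg_per_L / 1000.0
        print(f"estimated untreated wastewater input: {wastewater_m3_per_d:.1f} m3/d")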

  17. Sharing water and benefits in transboundary river basins

    Science.gov (United States)

    Arjoon, Diane; Tilmant, Amaury; Herrmann, Markus

    2016-06-01

    The equitable sharing of benefits in transboundary river basins is necessary to solve disputes among riparian countries and to reach a consensus on basin-wide development and management activities. Benefit-sharing arrangements must be collaboratively developed to be perceived not only as efficient, but also as equitable in order to be considered acceptable to all riparian countries. The current literature mainly describes what is meant by the term benefit sharing in the context of transboundary river basins and discusses this from a conceptual point of view, but falls short of providing practical, institutional arrangements that ensure maximum economic welfare as well as collaboratively developed methods for encouraging the equitable sharing of benefits. In this study, we define an institutional arrangement that distributes welfare in a river basin by maximizing the economic benefits of water use and then sharing these benefits in an equitable manner using a method developed through stakeholder involvement. We describe a methodology in which (i) a hydrological model is used to allocate scarce water resources, in an economically efficient manner, to water users in a transboundary basin, (ii) water users are obliged to pay for water, and (iii) the total of these water charges is equitably redistributed as monetary compensation to users in an amount determined through the application of a sharing method developed by stakeholder input, thus based on a stakeholder vision of fairness, using an axiomatic approach. With the proposed benefit-sharing mechanism, the efficiency-equity trade-off still exists, but the extent of the imbalance is reduced because benefits are maximized and redistributed according to a key that has been collectively agreed upon by the participants. The whole system is overseen by a river basin authority. The methodology is applied to the Eastern Nile River basin as a case study. The described technique not only ensures economic efficiency, but may
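    The charge-and-redistribute step of such an arrangement can be illustrated with a toy calculation: efficient allocations generate benefits, each user pays a volumetric charge, and the pooled charges are redistributed according to a collectively agreed key. The sketch below uses invented users, charges and key values; the actual study derives allocations from a hydro-economic optimization model and the sharing key from an axiomatic, stakeholder-driven procedure.

        # Toy illustration of charging for water and redistributing the pool (hypothetical numbers).
        users = {
            # name: (water allocated in Mm3, economic benefit in M$)
            "upstream_irrigation": (40.0, 30.0),
            "hydropower":          (60.0, 90.0),
            "downstream_cities":   (50.0, 80.0),
        }
        charge_per_Mm3 = 0.5                       # volumetric water charge (M$/Mm3)
        sharing_key = {                            # agreed redistribution key (sums to 1)
            "upstream_irrigation": 0.40,
            "hydropower":          0.25,
            "downstream_cities":   0.35,
        }

        pool = sum(alloc * charge_per_Mm3 for alloc, _ in users.values())
        for name, (alloc, benefit) in users.items():
            paid = alloc * charge_per_Mm3
            received = sharing_key[name] * pool
            net = benefit - paid + received
            print(f"{name:>20}: gross {benefit:5.1f} M$, net after transfers {net:5.1f} M$")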

  18. The measurement of employment benefits

    Energy Technology Data Exchange (ETDEWEB)

    Burtraw, D

    1994-07-01

    The consideration of employment effects and so-called 'hidden employment benefits' is one of the most confused and contentious issues in benefit-cost analysis and applied welfare economics generally. New investments create new employment opportunities, and often advocates for specific investments cite these employment opportunities as alleged benefits associated with the project. Indeed, from the local perspective, such employment opportunities may appear to be beneficial because they appear to come for free. If there is unemployment in the local area, then new investments create valuable employment opportunities for those in the local community. Even if there is full employment in the local area, new investments create incentives for immigration from other locations, which may have pecuniary benefits locally through increased property values, business revenues, etc. The focus in this study is on net economic benefits from a broad national perspective. From this perspective, many of the alleged employment benefits at the local level are offset by lost benefits at other locales, and do not count as benefits according to economic theory. This paper outlines a methodology for testing this rebuttable presumption with empirical data pertaining to labor markets that would be affected by a specific new investment. The theoretical question that is relevant is whether the social opportunity cost of new employment is less than the market wage. This would be the case, for example, if one expects unemployment or underemployment to persist in a specific region of the economy or occupational category affected by the new investment. In this case, new employment opportunities produce a net increase in social wealth rather than just a transfer of income.

  19. The measurement of employment benefits

    International Nuclear Information System (INIS)

    Burtraw, D.

    1994-01-01

    The consideration of employment effects and so-called 'hidden employment benefits' is one of the most confused and contentious issues in benefit-cost analysis and applied welfare economics generally. New investments create new employment opportunities, and often advocates for specific investments cite these employment opportunities as alleged benefits associated with the project. Indeed, from the local perspective, such employment opportunities may appear to be beneficial because they appear to come for free. If there is unemployment in the local area, then new investments create valuable employment opportunities for those in the local community. Even if there is full employment in the local area, new investments create incentives for immigration from other locations, which may have pecuniary benefits locally through increased property values, business revenues, etc. The focus in this study is on net economic benefits from a broad national perspective. From this perspective, many of the alleged employment benefits at the local level are offset by lost benefits at other locales, and do not count as benefits according to economic theory. This paper outlines a methodology for testing this rebuttable presumption with empirical data pertaining to labor markets that would be affected by a specific new investment. The theoretical question that is relevant is whether the social opportunity cost of new employment is less than the market wage. This would be the case, for example, if one expects unemployment or underemployment to persist in a specific region of the economy or occupational category affected by the new investment. In this case, new employment opportunities produce a net increase in social wealth rather than just a transfer of income.

  20. Ideas underlying quantification of margins and uncertainties(QMU): a white paper.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.

    2006-09-01

    This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.

  1. Quantification of heterogeneity as a biomarker in tumor imaging: a systematic review.

    Directory of Open Access Journals (Sweden)

    Lejla Alic

    Full Text Available BACKGROUND: Many techniques are proposed for the quantification of tumor heterogeneity as an imaging biomarker for differentiation between tumor types, tumor grading, response monitoring and outcome prediction. However, in clinical practice these methods are barely used. This study evaluates the reported performance of the described methods and identifies barriers to their implementation in clinical practice. METHODOLOGY: The Ovid, Embase, and Cochrane Central databases were searched up to 20 September 2013. Heterogeneity analysis methods were classified into four categories, i.e., non-spatial methods (NSM), spatial grey level methods (SGLM), fractal analysis (FA) methods, and filters and transforms (F&T). The performance of the different methods was compared. PRINCIPAL FINDINGS: Of the 7351 potentially relevant publications, 209 were included. Of these studies, 58% reported the use of NSM, 49% SGLM, 10% FA, and 28% F&T. Differentiation between tumor types, tumor grading and/or outcome prediction was the goal in 87% of the studies. Overall, the reported area under the curve (AUC) ranged from 0.5 to 1 (median 0.87). No relation was found between the performance and the quantification methods used, or between the performance and the imaging modality. A negative correlation was found between the tumor-feature ratio and the AUC, which is presumably caused by overfitting in small datasets. Cross-validation was reported in 63% of the classification studies. Retrospective analyses were conducted in 57% of the studies without a clear description. CONCLUSIONS: In a research setting, heterogeneity quantification methods can differentiate between tumor types, grade tumors, and predict outcome and monitor treatment effects. To translate these methods to clinical practice, more prospective studies are required that use external datasets for validation: these datasets should be made available to the community to facilitate the development of new and improved

  2. Benefit/cost comparisons for utility SMES applications

    International Nuclear Information System (INIS)

    De Steese, J.G.; Dagle, J.E.

    1991-01-01

    This paper summarizes eight case studies that account for the benefits and costs of superconducting magnetic energy storage (SMES) in system-specific utility applications. Four of these scenarios are hypothetical SMES applications in the Pacific Northwest, where relatively low energy costs impose a stringent test on the viability of the concept. The other four scenarios address SMES applications on high-voltage, direct-current (HVDC) transmission lines. While estimated SMES benefits are based on a previously reported methodology, this paper presents results of an improved cost-estimating approach that includes an assumed reduction in the cost of the power conditioning system (PCS) from approximately $160/kW to $80/kW. The revised approach results in all the SMES scenarios showing higher benefit/cost ratios than those reported earlier. However, in all but two cases, the value of any single benefit is still less than the unit's levelized cost. This suggests, as a general principle, that the total value of multiple benefits should always be considered if SMES is to appear cost effective in many utility applications. These results should offer utilities further encouragement to conduct more detailed analyses of SMES benefits in scenarios that apply to individual systems.

  3. Benefit/cost comparison for utility SMES applications

    Science.gov (United States)

    Desteese, J. G.; Dagle, J. E.

    1991-08-01

    This paper summarizes eight case studies that account for the benefits and costs of superconducting magnetic energy storage (SMES) in system-specific utility applications. Four of these scenarios are hypothetical SMES applications in the Pacific Northwest, where relatively low energy costs impose a stringent test on the viability of the concept. The other four scenarios address SMES applications on high-voltage, direct-current (HVDC) transmission lines. While estimated SMES benefits are based on a previously reported methodology, this paper presents results of an improved cost-estimating approach that includes an assumed reduction in the cost of the power conditioning system (PCS) from approximately $160/kW to $80/kW. The revised approach results in all the SMES scenarios showing higher benefit/cost ratios than those reported earlier. However, in all but two cases, the value of any single benefit is still less than the unit's levelized cost. This suggests, as a general principle, that the total value of multiple benefits should always be considered if SMES is to appear cost effective in many utility applications. These results should offer utilities further encouragement to conduct more detailed analyses of SMES benefits in scenarios that apply to individual systems.

  4. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    Science.gov (United States)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for analysis of natural science phenomena. This article discusses the applicability of systems dynamics - a qualitative based modelling approach, as a possible analysis and simulation tool that bridges the gap between social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as limiting factors of systems dynamics to the potential applications in the field of social sciences and human interactions are discussed. The issues arise with regards to operationalization and quantification of latent constructs at the simulation building stage of the systems dynamics methodology and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view of improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  5. An Innovative Fuzzy-Logic-Based Methodology for Trend Identification

    International Nuclear Information System (INIS)

    Wang Xin; Tsoukalas, Lefteri H.; Wei, Thomas Y.C.; Reifman, Jaques

    2001-01-01

    A new fuzzy-logic-based methodology for on-line signal trend identification is introduced. The methodology may be used for detecting the onset of nuclear power plant (NPP) transients at the earliest possible time and could be of great benefit to diagnostic, maintenance, and performance-monitoring programs. Although signal trend identification is complicated by the presence of noise, fuzzy methods can help capture important features of on-line signals, integrate the information included in these features, and classify incoming NPP signals into increasing, decreasing, and steady-state trend categories. A computer program named PROTREN is developed and tested for the purpose of verifying this methodology using NPP and simulation data. The results indicate that the new fuzzy-logic-based methodology is capable of detecting transients accurately, it identifies trends reliably and does not misinterpret a steady-state signal as a transient one
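    To make the idea of fuzzy trend categories concrete, the sketch below classifies a noisy signal window as increasing, decreasing or steady by passing a least-squares slope through simple piecewise-linear membership functions. It is a generic illustration with an arbitrary threshold, not a reconstruction of the PROTREN program.

        import numpy as np

        def classify_trend(signal, eps=0.05):
            """Fuzzy classification of a noisy signal window into trend categories (illustrative)."""
            y = np.asarray(signal, dtype=float)
            slope = np.polyfit(np.arange(len(y)), y, 1)[0]        # least-squares slope

            # Piecewise-linear memberships; eps is a made-up tuning threshold.
            increasing = float(np.clip((slope - eps) / eps, 0.0, 1.0))
            decreasing = float(np.clip((-slope - eps) / eps, 0.0, 1.0))
            steady = float(np.clip(1.0 - (abs(slope) - eps) / eps, 0.0, 1.0))

            memberships = {"increasing": increasing, "steady": steady, "decreasing": decreasing}
            return max(memberships, key=memberships.get), memberships

        rng = np.random.default_rng(0)
        noisy_rise = 0.2 * np.arange(30) + rng.normal(0.0, 0.5, 30)
        print(classify_trend(noisy_rise))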

  6. Risks and benefits of energy systems in Czechoslovakia

    International Nuclear Information System (INIS)

    Bohal, L.; Erban, P.; Kadlec, J.; Kraus, V.; Trcka, V.

    1984-01-01

    The paper describes the fundamental philosophy of an approach to risk and benefit assessment in the fuel and energy complex in Czechoslovakia. The first part analyses the need to solve the risk and benefit problems stemming from structural changes occurring in the Czechoslovakian fuel and energy complex. The second part describes main features of risk and benefit research with special respect to the fuel and energy complex defined within the framework of the national economy with interfaces to the relevant environment. Furthermore, a glimpse is given of how to assess, using the general philosophy, the risks and benefits of various developing variants of the fuel and energy complex. The third part deals with methodological aspects of such risk and benefit evaluation research with special consideration of the methods of long-term prediction in structural analysis and multi-measure assessment. Finally, further progress in solving these problems in VUPEK and some other Czechoslovakian scientific institutions is briefly noted. (author)

  7. A proposed quantitative credit-rating methodology for South African provincial departments

    OpenAIRE

    Erika Fourie; Tanja Verster; Gary Wayne van Vuuren

    2016-01-01

    The development of subnational credit-rating methodologies affords benefits for subnationals, the sovereign and its citizens. Trusted credit ratings facilitate access to financial markets and above-average ratings allow for the negotiation of better collateral and guarantee agreements, as well as for funding of, for example, infrastructure projects at superior (lower) interest rates. This paper develops the quantitative section of a credit-rating methodology for South African subnationals. Th...

  8. Methodology for economic evaluation of software development projects

    International Nuclear Information System (INIS)

    Witte, D.M.

    1990-01-01

    Many oil and gas exploration and production companies develop computer software in-house or with contract programmers to support their exploration activities. Software development projects compete for funding with exploration and development projects, though most companies lack valid comparison measures for the two types of projects. This paper presents a methodology of pro forma cash-flow analysis for software development proposals intended for internal use. This methodology, based on estimates of development and support costs, exploration benefits, and probability of successful development and implementation, can be used to compare proposed software development projects directly with competing exploration proposals.
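    A pro forma comparison of this kind reduces, in its simplest form, to an expected net present value: the up-front development cost against success-weighted annual benefits net of support costs. The sketch below is a minimal version with invented figures, not the paper's worked example.

        # Minimal expected-NPV sketch for an in-house software proposal (hypothetical figures).
        development_cost = 400_000        # year-0 outlay ($)
        annual_support_cost = 50_000      # recurring support cost ($/yr)
        annual_benefit = 220_000          # expected exploration benefit ($/yr)
        p_success = 0.7                   # probability of successful development and implementation
        discount_rate = 0.10
        horizon_years = 5

        expected_npv = -development_cost
        for year in range(1, horizon_years + 1):
            # Benefits and support costs accrue only if the project succeeds.
            net_cash = p_success * (annual_benefit - annual_support_cost)
            expected_npv += net_cash / (1.0 + discount_rate) ** year

        print(f"expected NPV over {horizon_years} years: ${expected_npv:,.0f}")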

  9. Expectations for methodology and translation of animal research: a survey of health care workers.

    Science.gov (United States)

    Joffe, Ari R; Bara, Meredith; Anton, Natalie; Nobis, Nathan

    2015-05-07

    Health care workers (HCW) often perform, promote, and advocate the use of public funds for animal research (AR); therefore, an awareness of the empirical costs and benefits of animal research is an important issue for HCW. We aim to determine what health care workers consider should be acceptable standards of AR methodology and translation rate to humans. After development and validation, an e-mail survey was sent to all pediatricians and pediatric intensive care unit nurses and respiratory therapists (RTs) affiliated with a Canadian university. We presented questions about demographics, methodology of AR, and expectations from AR. Responses of pediatricians and nurses/RTs were compared using Chi-square tests. Asked about methodological quality, most respondents expect that: AR is done to high quality; costs and difficulty are not acceptable justifications for low quality; findings should be reproducible between laboratories and strains of the same species; and guidelines for AR funded with public money should be consistent with these expectations. Asked about benefits of AR, most thought that there are sometimes/often large benefits to humans from AR, and disagreed that "AR rarely produces benefit to humans." Asked about expectations of translation to humans (of toxicity, carcinogenicity, teratogenicity, and treatment findings), most expect translation >40% of the time and thought that misleading AR results should occur only infrequently. Overall, respondents hold high expectations of the methodological quality of, and the translation rate to humans of, findings from AR. These expectations are higher than the empirical data show having been achieved. Unless these areas of AR significantly improve, HCW support of AR may be tenuous.

  10. eAMI: A Qualitative Quantification of Periodic Breathing Based on Amplitude of Oscillations

    Science.gov (United States)

    Fernandez Tellez, Helio; Pattyn, Nathalie; Mairesse, Olivier; Dolenc-Groselj, Leja; Eiken, Ola; Mekjavic, Igor B.; Migeotte, P. F.; Macdonald-Nethercott, Eoin; Meeusen, Romain; Neyt, Xavier

    2015-01-01

    Study Objectives: Periodic breathing is sleep disordered breathing characterized by instability in the respiratory pattern that exhibits an oscillatory behavior. Periodic breathing is associated with increased mortality, and it is observed in a variety of situations, such as acute hypoxia, chronic heart failure, and damage to respiratory centers. The standard quantification for the diagnosis of sleep related breathing disorders is the apnea-hypopnea index (AHI), which measures the proportion of apneic/hypopneic events during polysomnography. Determining the AHI is labor-intensive and requires the simultaneous recording of airflow and oxygen saturation. In this paper, we propose an automated, simple, and novel methodology for the detection and qualification of periodic breathing: the estimated amplitude modulation index (eAMI). Patients or Participants: Antarctic cohort (3,800 meters): 13 normal individuals. Clinical cohort: 39 different patients suffering from diverse sleep-related pathologies. Measurements and Results: When tested in a population with high levels of periodic breathing (Antarctic cohort), eAMI was closely correlated with AHI (r = 0.95). Citation: Fernandez Tellez H, Pattyn N, Mairesse O, Dolenc-Groselj L, Eiken O, Mekjavic IB, Migeotte PF, Macdonald-Nethercott E, Meeusen R, Neyt X. eAMI: a qualitative quantification of periodic breathing based on amplitude of oscillations. SLEEP 2015;38(3):381–389. PMID:25581914

  11. A Bayesian approach for quantification of model uncertainty

    International Nuclear Information System (INIS)

    Park, Inseok; Amarchinta, Hemanth K.; Grandhi, Ramana V.

    2010-01-01

    In most engineering problems, more than one model can be created to represent an engineering system's behavior. Uncertainty is inevitably involved in selecting the best model from among the models that are possible. Uncertainty in model selection cannot be ignored, especially when the differences between the predictions of competing models are significant. In this research, a methodology is proposed to quantify model uncertainty using measured differences between experimental data and model outcomes under a Bayesian statistical framework. The adjustment factor approach is used to propagate model uncertainty into prediction of a system response. A nonlinear vibration system is used to demonstrate the processes for implementing the adjustment factor approach. Finally, the methodology is applied to assess the engineering benefits of a laser peening process, and a confidence band for residual stresses is established to indicate the reliability of model prediction.
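    The flavour of such a Bayesian combination can be conveyed with a short sketch: posterior model probabilities are obtained from the mismatch between each model's prediction and an experimental observation, and the predictions are then combined with a variance term expressing model-form uncertainty. The numbers, the Gaussian likelihood and the exact combination rule below are simplifying assumptions, not the paper's adjustment-factor formulation.

        import numpy as np

        # Hypothetical predictions of the same response by three competing models,
        # plus a measurement with an assumed Gaussian error.
        model_predictions = {"model_A": 102.0, "model_B": 95.0, "model_C": 110.0}
        measurement, sigma = 100.0, 4.0
        prior = {m: 1.0 / len(model_predictions) for m in model_predictions}

        def gaussian_like(pred):
            return np.exp(-0.5 * ((measurement - pred) / sigma) ** 2)

        evidence = sum(prior[m] * gaussian_like(y) for m, y in model_predictions.items())
        posterior = {m: prior[m] * gaussian_like(y) / evidence for m, y in model_predictions.items()}

        # Probability-weighted prediction and a model-uncertainty variance (simplified).
        y_star = sum(posterior[m] * y for m, y in model_predictions.items())
        var_model = sum(posterior[m] * (y - y_star) ** 2 for m, y in model_predictions.items())
        print(f"combined prediction {y_star:.1f} +/- {var_model ** 0.5:.1f} (model uncertainty only)")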

  12. Optimization of SPECT calibration for quantification of images applied to dosimetry with iodine-131

    International Nuclear Information System (INIS)

    Carvalho, Samira Marques de

    2018-01-01

    SPECT systems calibration plays an essential role in the accuracy of the quantification of images. In the first stage of this work, an optimized SPECT calibration method was proposed for 131 I studies, considering the partial volume effect (PVE) and the position of the calibration source. In the second stage, the study investigated the impact of count density and reconstruction parameters on the determination of the calibration factor and the quantification of the image in dosimetry studies, considering the reality of clinical practice in Brazil. In the final stage, the study evaluated the influence of several factors in the calibration for absorbed dose calculation using Monte Carlo (MC) simulations with the GATE code. Calibration was performed by determining a calibration curve (sensitivity versus volume) obtained by applying different thresholds. Then, the calibration factors were determined with an exponential function fit. Images were acquired with high and low count densities for several source positions within the phantom. To validate the calibration method, the calibration factors were used for absolute quantification of the total reference activities. The images were reconstructed adopting two sets of parameters commonly used for patient images. The methodology developed for the calibration of the tomographic system was easier and faster to implement than other procedures suggested to improve the accuracy of the results. The study also revealed the influence of the location of the calibration source, demonstrating better precision in the absolute quantification when the location of the target region is considered during the calibration of the system. The study applied to the Brazilian thyroid protocol suggests the revision of the calibration of the SPECT system, including different positions for the reference source, besides acquisitions considering the Signal to Noise Ratio (SNR) of the images. Finally, the doses obtained with the
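    The step of fitting a calibration (recovery) curve and using it for absolute quantification can be sketched as follows. The saturating-exponential form, the sphere volumes and the sensitivities below are assumptions for illustration; the study fits its own exponential function to threshold-derived sensitivities.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical sensitivities (cps/MBq) measured for sources of known volume (mL).
        volumes_ml = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 30.0])
        sensitivity = np.array([3.1, 4.6, 6.0, 7.1, 7.8, 8.1])

        # Saturating exponential: small volumes lose counts to the partial volume effect.
        def cal_curve(v, s_inf, v0):
            return s_inf * (1.0 - np.exp(-v / v0))

        (s_inf, v0), _ = curve_fit(cal_curve, volumes_ml, sensitivity, p0=(8.0, 2.0))

        # Absolute quantification of a lesion of (approximately) known volume.
        lesion_counts_cps, lesion_volume_ml = 45.0, 5.0
        activity_MBq = lesion_counts_cps / cal_curve(lesion_volume_ml, s_inf, v0)
        print(f"fitted plateau {s_inf:.2f} cps/MBq; recovered activity {activity_MBq:.2f} MBq")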

  13. (1) H-MRS processing parameters affect metabolite quantification

    DEFF Research Database (Denmark)

    Bhogal, Alex A; Schür, Remmelt R; Houtepen, Lotte C

    2017-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) can be used to quantify in vivo metabolite levels, such as lactate, γ-aminobutyric acid (GABA) and glutamate (Glu). However, there are considerable analysis choices which can alter the accuracy or precision of (1)H-MRS metabolite quantification. We investigated the influence of model parameters and spectral quantification software on fitted metabolite concentration values. Sixty spectra in 30 individuals (repeated measures) were acquired using a 7-T MRI scanner. Data were processed by four independent research groups with the freedom to choose their own ... + NAAG/Cr + PCr and Glu/Cr + PCr, respectively. Metabolite quantification using identical (1)H-MRS data was influenced by processing parameters, basis sets and software choice. Locally preferred processing choices affected metabolite quantification, even when using identical software. Our results ...

  14. Development of a methodology for the application of the analysis of human reliability to individualized temporary storage facility; Desarrollo de una metodologia de aplicacion del Analisis de Fiabilidad Humana a una instalacion de Almacen Temporal Individualizado

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, P.; Dies, J.; Tapia, C.; Blas, A. de

    2014-07-01

    The paper presents a methodology developed for applying human reliability analysis (HRA) to an individualized temporary storage (ATI) facility without the need for experts during the modelling and quantification stages of the analysis. The developed methodology is based on ATHEANA and relies on the use of other methods for the analysis of human action and on in-depth analysis. (Author)

  15. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  16. Need for a marginal methodology in assessing natural gas system methane emissions in response to incremental consumption.

    Science.gov (United States)

    Mac Kinnon, Michael; Heydarzadeh, Zahra; Doan, Quy; Ngo, Cuong; Reed, Jeff; Brouwer, Jacob

    2018-05-17

    Accurate quantification of methane emissions from the natural gas system is important for establishing greenhouse gas inventories and understanding cause and effect for reducing emissions. Current carbon intensity methods generally assume methane emissions are proportional to gas throughput so that increases in gas consumption yield linear increases in emitted methane. However, emissions sources are diverse and many are not proportional to throughput. Insights into the causal drivers of system methane emissions, and how system-wide changes affect such drivers are required. The development of a novel cause-based methodology to assess marginal methane emissions per unit of fuel consumed is introduced. The carbon intensities of technologies consuming natural gas are critical metrics currently used in policy decisions for reaching environmental goals. For example, the low-carbon fuel standard in California uses carbon intensity to determine incentives provided. Current methods generally assume methane emissions from the natural gas system are completely proportional to throughput. The proposed cause-based marginal emissions method will provide a better understanding of the actual drivers of emissions to support development of more effective mitigation measures. Additionally, increasing the accuracy of carbon intensity calculations supports the development of policies that can maximize the environmental benefits of alternative fuels, including reducing greenhouse gas emissions.

  17. Validation of the methodology for quantitative determination of arsenic in drinking water by hydride generation

    International Nuclear Information System (INIS)

    Silva Trejos, Paulina

    2008-01-01

    An analytical methodology for the quantitative determination of arsenic in drinking water was validated. The hydride generation atomic absorption method was used. The recovery percentage for sample digestion in a microwave oven with batches of HNO3 was determined; the results indicated that the optimal volume of HNO3 for 5.00 mL of sample was 0.50 mL, with a recovery rate of 90.5 ± 0.5%. The optimal linear range was 0-30 ppb, with a correlation coefficient of 0.9994. The limits of detection and quantification according to Miller and Miller were 1.20 ± 0.02 and 4.01 ± 0.02 ppb, respectively. Precision was evaluated by determining repeatability and reproducibility; the results obtained were 0.34 and 0.30, respectively. The accuracy evaluation showed a bias of -1.1. The drinking water sample taken from the laboratory pipe showed As concentrations below the limit of quantification reported in this investigation. (author) [es]
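    The calibration-based limits attributed to Miller and Miller are commonly computed as LOD = 3·s(y/x)/slope and LOQ = 10·s(y/x)/slope, where s(y/x) is the residual standard deviation of the calibration line. The sketch below applies these formulas to invented hydride-generation calibration data; it is not the validation data of the study.

        import numpy as np

        # Hypothetical calibration data: As concentration (ppb) vs. absorbance.
        conc_ppb = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 30.0])
        absorbance = np.array([0.002, 0.051, 0.103, 0.149, 0.198, 0.301])

        slope, intercept = np.polyfit(conc_ppb, absorbance, 1)
        residuals = absorbance - (slope * conc_ppb + intercept)
        s_yx = np.sqrt(np.sum(residuals ** 2) / (len(conc_ppb) - 2))   # residual std. deviation

        lod_ppb = 3 * s_yx / slope      # Miller & Miller style detection limit
        loq_ppb = 10 * s_yx / slope     # quantification limit
        r = np.corrcoef(conc_ppb, absorbance)[0, 1]
        print(f"r = {r:.4f}, LOD = {lod_ppb:.2f} ppb, LOQ = {loq_ppb:.2f} ppb")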

  18. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
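    The band-selection step described above amounts to correlating reflectance at each waveband with the measured THC content, keeping the strongest band (reported as 695 nm), and regressing THC on that band. The sketch below runs this procedure on synthetic spectra in which a relationship is deliberately embedded near 695 nm, so the data and the fitted coefficients carry no real-world meaning.

        import numpy as np

        rng = np.random.default_rng(1)
        wavelengths = np.arange(400, 1001, 5)                     # nm
        n_samples = 40

        # Synthetic reflectance spectra and lab-measured THC content (%).
        spectra = rng.uniform(0.1, 0.6, size=(n_samples, wavelengths.size))
        thc = rng.uniform(0.2, 12.0, size=n_samples)
        band_695 = int(np.argmin(np.abs(wavelengths - 695)))
        spectra[:, band_695] = 0.55 - 0.02 * thc + rng.normal(0.0, 0.01, n_samples)

        # Correlate every band with THC and keep the strongest absolute correlation.
        corrs = np.array([np.corrcoef(spectra[:, j], thc)[0, 1] for j in range(wavelengths.size)])
        best = int(np.argmax(np.abs(corrs)))
        slope, intercept = np.polyfit(spectra[:, best], thc, 1)
        print(f"best band: {wavelengths[best]} nm (r = {corrs[best]:.2f}); THC ~ {slope:.1f}*R + {intercept:.1f}")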

  19. Dose assessment by quantification of chromosome aberrations and micronuclei in peripheral blood lymphocytes from patients exposed to gamma radiation

    Energy Technology Data Exchange (ETDEWEB)

    Silva-Barbosa, Isvania; Pereira-MagnataI, Simey; Amaral, Ademir [Pernambuco Univ., Recife, PE (Brazil). Dept. de Energia Nuclear. Grupo de Estudos em Radioprotecao e Radioecologia - GERAR; Sotero, Graca [Fundacao de Hematologia e Hemoterapia, Recife, PE (Brazil); Melo, Homero Cavalcanti [Hospital do Cancer, Recife, PE (Brazil). Centro de Radioterapia de Pernambuco]. E-mail: isvania@uol.com.br

    2005-07-15

    Scoring of unstable chromosome aberrations (dicentrics, rings and fragments) and micronuclei in circulating lymphocytes are the most extensively studied biological means for estimating individual exposure to ionizing radiation (IR), which can be used as complementary methods to physical dosimetry or when the latter cannot be performed. In this work, the quantification of the frequencies of chromosome aberrations and micronuclei were carried out based on cytogenetic analyses of peripheral blood samples from 5 patients with cervical uterine cancer following radiotherapy in order to evaluate the absorbed dose as a result of partial-body exposure to 60Co source. Blood samples were collected from each patient in three phases of the treatment: before irradiation, 24 h after receiving 0.08 Gy and 1.8 Gy, respectively. The results presented in this report emphasize biological dosimetry, employing the quantification of chromosome aberrations and micronuclei in lymphocytes from peripheral blood, as an important methodology of dose assessment for either whole or partial-body exposure to IR.

  20. On the Dichotomy of Qualitative and Quantitative Researches in Contemporary Scientific Methodology

    Directory of Open Access Journals (Sweden)

    U V Suvakovic

    2011-12-01

    Full Text Available The article presents an argument in favor of overcoming the long-established dichotomy of qualitative and quantitative scientific research. Proceeding from the view of materialist dialecticians that every scientific study must deal with a subject, the author assumes that it is impossible to conduct quantitative research without first establishing the quality to be studied. This also concerns measurement, which the literature usually assigns only to quantitative procedures. By way of illustration, the author designs two instruments for measuring the successfulness of political parties - the scale and the quotient of party successfulness. On the other hand, even qualitative analysis usually involves certain quantifications. The author concludes that, to achieve methodological correctness, the existing dichotomy of qualitative and quantitative research should be considered overcome, and a typology of scientific research comprising predominantly qualitative and predominantly quantitative studies, depending on the methodological components prevailing in them, should be used instead.

  1. Concept mapping methodology and community-engaged research: A perfect pairing.

    Science.gov (United States)

    Vaughn, Lisa M; Jones, Jennifer R; Booth, Emily; Burke, Jessica G

    2017-02-01

    Concept mapping methodology as refined by Trochim et al. is uniquely suited to engage communities in all aspects of research from project set-up to data collection to interpreting results to dissemination of results, and an increasing number of research studies have utilized the methodology for exploring complex health issues in communities. In the current manuscript, we present the results of a literature search of peer-reviewed articles in health-related research where concept mapping was used in collaboration with the community. A total of 103 articles met the inclusion criteria. We first address how community engagement was defined in the articles and then focus on the articles describing high community engagement and the associated community outcomes/benefits and methodological challenges. A majority (61%; n=63) of the articles were classified as low to moderate community engagement and participation while 38% (n=39) of the articles were classified as high community engagement and participation. The results of this literature review enhance our understanding of how concept mapping can be used in direct collaboration with communities and highlights the many potential benefits for both researchers and communities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Combined multi-criteria and cost-benefit analysis

    DEFF Research Database (Denmark)

    Moshøj, Claus Rehfeld

    1996-01-01

    The paper is an introduction to both theory and application of combined Cost-Benefit and Multi-Criteria Analysis. The first section is devoted to basic utility theory and its practical application in Cost-Benefit Analysis. Based on some of the problems encountered, arguments are provided in favour of applying utility-based Multi-Criteria Analysis methods as an extension and refinement of the traditional Cost-Benefit Analysis. The theory presented in this paper is closely related to the methods used in the WARP software (Leleur & Jensen, 1989). The presentation is, however, wider in scope. The second section introduces the stated preference methodology used in WARP to create weight profiles for project pool sensitivity analysis. This section includes a simple example. The third section discusses how decision makers can get a priori aid to make their pair-wise comparisons based on project pool...

  3. Quantification of localized vertebral deformities using a sparse wavelet-based shape model.

    Science.gov (United States)

    Zewail, R; Elsafi, A; Durdle, N

    2008-01-01

    Medical experts often examine hundreds of spine x-ray images to determine the existence of various pathologies. Common pathologies of interest are anterior osteophytes, disc space narrowing, and wedging. By careful inspection of the outline shapes of the vertebral bodies, experts are able to identify and assess vertebral abnormalities with respect to the pathology under investigation. In this paper, we present a novel method for quantification of vertebral deformation using a sparse shape model. Using wavelets and independent component analysis (ICA), we construct a sparse shape model that benefits from the approximation power of wavelets and the capability of ICA to capture higher order statistics in wavelet space. The new model is able to capture localized pathology-related shape deformations, hence it allows for quantification of vertebral shape variations. We investigate the capability of the model to predict localized pathology-related deformations. Next, using support-vector machines, we demonstrate the diagnostic capabilities of the method through the discrimination of anterior osteophytes in lumbar vertebrae. Experiments were conducted using a set of 150 contours from digital x-ray images of the lumbar spine. Each vertebra is labeled as normal or abnormal. Results reported in this work focus on anterior osteophytes as the pathology of interest.

  4. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.

  5. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on Slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis). (author)

  6. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  7. The Generalized Roy Model and the Cost-Benefit Analysis of Social Programs*

    Science.gov (United States)

    Eisenhauer, Philipp; Heckman, James J.; Vytlacil, Edward

    2015-01-01

    The literature on treatment effects focuses on gross benefits from program participation. We extend this literature by developing conditions under which it is possible to identify parameters measuring the cost and net surplus from program participation. Using the generalized Roy model, we nonparametrically identify the cost, benefit, and net surplus of selection into treatment without requiring the analyst to have direct information on the cost. We apply our methodology to estimate the gross benefit and net surplus of attending college. PMID:26709315

  8. The Generalized Roy Model and the Cost-Benefit Analysis of Social Programs.

    Science.gov (United States)

    Eisenhauer, Philipp; Heckman, James J; Vytlacil, Edward

    2015-04-01

    The literature on treatment effects focuses on gross benefits from program participation. We extend this literature by developing conditions under which it is possible to identify parameters measuring the cost and net surplus from program participation. Using the generalized Roy model, we nonparametrically identify the cost, benefit, and net surplus of selection into treatment without requiring the analyst to have direct information on the cost. We apply our methodology to estimate the gross benefit and net surplus of attending college.

  9. Performance of the Real-Q EBV Quantification Kit for Epstein-Barr Virus DNA Quantification in Whole Blood.

    Science.gov (United States)

    Huh, Hee Jae; Park, Jong Eun; Kim, Ji Youn; Yun, Sun Ae; Lee, Myoung Keun; Lee, Nam Yong; Kim, Jong Won; Ki, Chang Seok

    2017-03-01

    There has been increasing interest in standardized and quantitative Epstein-Barr virus (EBV) DNA testing for the management of EBV disease. We evaluated the performance of the Real-Q EBV Quantification Kit (BioSewoom, Korea) in whole blood (WB). Nucleic acid extraction and real-time PCR were performed by using the MagNA Pure 96 (Roche Diagnostics, Germany) and 7500 Fast real-time PCR system (Applied Biosystems, USA), respectively. Assay sensitivity, linearity, and conversion factor were determined by using the World Health Organization international standard diluted in EBV-negative WB. We used 81 WB clinical specimens to compare performance of the Real-Q EBV Quantification Kit and artus EBV RG PCR Kit (Qiagen, Germany). The limit of detection (LOD) and limit of quantification (LOQ) for the Real-Q kit were 453 and 750 IU/mL, respectively. The conversion factor from EBV genomic copies to IU was 0.62. The linear range of the assay was from 750 to 10⁶ IU/mL. Viral load values measured with the Real-Q assay were on average 0.54 log₁₀ copies/mL higher than those measured with the artus assay. The Real-Q assay offered good analytical performance for EBV DNA quantification in WB.

  10. Tannins quantification in barks of Mimosa tenuiflora and Acacia mearnsii

    Directory of Open Access Journals (Sweden)

    Leandro Calegari

    2016-03-01

    Full Text Available Due to their chemical complexity, there are several methodologies for the quantification of vegetable tannins. This work aims at quantifying both tannin and non-tannin substances present in the barks of Mimosa tenuiflora and Acacia mearnsii by two different methods. From bark particles of both species, analytical solutions were produced using a steam-jacketed extractor. The solutions were analyzed by the Stiasny and hide-powder (non-chromed) methods. For both species, tannin levels were higher when analyzed by the hide-powder method, reaching 47.8% and 24.1% for A. mearnsii and M. tenuiflora, respectively. By the Stiasny method, the tannin levels were 39.0% for A. mearnsii and 15.5% for M. tenuiflora. Although A. mearnsii presented the best results, the bark of M. tenuiflora also showed great potential due to its considerable tannin content and the availability of the species in the Caatinga biome.

  11. Real-time polymerase chain reaction-based approach for quantification of the pat gene in the T25 Zea mays event.

    Science.gov (United States)

    Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy

    2004-01-01

    In Europe, a growing interest for reliable techniques for the quantification of genetically modified component(s) of food matrixes is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is represented by the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy to use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are only available for a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. In fact, experimental evidences show that the use of reference genes adds variability to the measurement system because a second independent real-time PCR-based measurement
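    The plasmid-based calibration described above ultimately rests on a standard-curve calculation that is common to real-time PCR: Ct values from a plasmid dilution series are regressed against log10 copy number, and the fit is inverted to convert a sample Ct into copies. The sketch below shows that generic calculation with invented Ct values; it is not the paper's assay data or its exact statistical treatment.

        import numpy as np

        # Hypothetical plasmid dilution series: known copy numbers and measured Ct values.
        copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
        ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

        # Standard curve: Ct = slope * log10(copies) + intercept
        slope, intercept = np.polyfit(np.log10(copies), ct, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0      # amplification efficiency implied by the slope

        def copies_from_ct(sample_ct):
            """Invert the standard curve to estimate the copy number in a sample."""
            return 10 ** ((sample_ct - intercept) / slope)

        print(f"slope {slope:.2f}, efficiency {efficiency:.1%}, Ct 25.2 -> {copies_from_ct(25.2):.2e} copies")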

  12. The Role of Leisure Engagement for Health Benefits Among Korean Older Women.

    Science.gov (United States)

    Kim, Junhyoung; Irwin, Lori; Kim, May; Chin, Seungtae; Kim, Jun

    2015-01-01

    This qualitative study was designed to examine the benefits of leisure to older Korean women. Using a constructive grounded theory methodology, in this study we identified three categories of benefits from leisure activities: (a) developing social connections, (b) enhancing psychological well-being, and (c) improving physical health. The findings of this study demonstrate that involvement in leisure activities offers substantial physical, psychological, and social benefits for older Korean women. The results also suggest that these benefits can provide an opportunity for older Korean adults to improve their health and well-being, which, in turn, may help promote successful aging.

  13. MAXIMIZING THE BENEFITS OF ERP SYSTEMS

    Directory of Open Access Journals (Sweden)

    Paulo André da Conceição Menezes

    2010-04-01

    Full Text Available ERP (Enterprise Resource Planning) systems have been consolidated in companies of different sizes and sectors, allowing their real benefits to be definitively evaluated. In this study, several interactions have been studied in different phases: the strategic priorities and strategic planning defined as ERP strategy; business process review and ERP selection in the pre-implementation phase; project management and ERP adaptation in the implementation phase; and ERP revision and integration efforts in the post-implementation phase. Through rigorous use of case study methodology, this research led to the development and testing of a framework for maximizing the benefits of ERP systems, and seeks to contribute to the generation of ERP initiatives that optimize their performance.

  14. Identification and Quantification of the Major Constituents in Egyptian Carob Extract by Liquid Chromatography–Electrospray Ionization-Tandem Mass Spectrometry

    Science.gov (United States)

    Owis, Asmaa Ibrahim; El-Naggar, El-Motaz Bellah

    2016-01-01

    Background: Carob - Ceratonia siliqua L., commonly known as St John's-bread or locust bean, family Fabaceae - is one of the most useful native Mediterranean trees. There are no data on high performance liquid chromatography (HPLC) methods for determining polyphenols in Egyptian carob pods. Objective: To establish a sensitive and specific liquid chromatography–electrospray ionization (ESI)-tandem mass spectrometry (MSn) methodology for the identification of the major constituents in Egyptian carob extract. Materials and Methods: An HPLC method with diode array detection (DAD) and ESI-mass spectrometry (MS) was developed for the identification and quantification of phenolic acids, flavonoid glycosides, and aglycones in the methanolic extract of Egyptian C. siliqua. The MS and MSn data together with the HPLC retention times of phenolic components allowed structural characterization of these compounds. Peak integration of ions in the MS scans was used for quantification. Results: A total of 36 compounds were tentatively identified. Twenty-six compounds were identified in the negative mode corresponding to 85.4% of plant dry weight, while ten compounds were identified in the positive mode representing 16.1% of plant dry weight, with the prevalence of flavonoids (75.4% of plant dry weight) predominantly represented by two methylapigenin-O-pentoside isomers (20.9 and 13.7% of plant dry weight). Conclusion: The identification of various compounds present in carob pods opens a new door to an increased understanding of the different health benefits brought about by the consumption of carob and its products. SUMMARY This research provides a good example of the rapid identification of major constituents in complex systems such as herbs using a sensitive, accurate and specific method coupling HPLC with DAD and MS, which facilitates the clarification of the phytochemical composition of herbal medicines for better understanding of their nature and

  15. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    Science.gov (United States)

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K⁺ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K⁺ in the presence of Na⁺, Li⁺, and Mg²⁺ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K⁺ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K⁺. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K⁺ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.

  16. Competitive Reporter Monitored Amplification (CMA) - Quantification of Molecular Targets by Real Time Monitoring of Competitive Reporter Hybridization

    Science.gov (United States)

    Ullrich, Thomas; Ermantraut, Eugen; Schulz, Torsten; Steinmetzer, Katrin

    2012-01-01

    Background State of the art molecular diagnostic tests are based on the sensitive detection and quantification of nucleic acids. However, currently established diagnostic tests are characterized by elaborate and expensive technical solutions hindering the development of simple, affordable and compact point-of-care molecular tests. Methodology and Principal Findings The described competitive reporter monitored amplification allows the simultaneous amplification and quantification of multiple nucleic acid targets by polymerase chain reaction. Target quantification is accomplished by real-time detection of amplified nucleic acids utilizing a capture probe array and specific reporter probes. The reporter probes are fluorescently labeled oligonucleotides that are complementary to the respective capture probes on the array and to the respective sites of the target nucleic acids in solution. Capture probes and amplified target compete for reporter probes. Increasing amplicon concentration leads to decreased fluorescence signal at the respective capture probe position on the array which is measured after each cycle of amplification. In order to observe reporter probe hybridization in real-time without any additional washing steps, we have developed a mechanical fluorescence background displacement technique. Conclusions and Significance The system presented in this paper enables simultaneous detection and quantification of multiple targets. Moreover, the presented fluorescence background displacement technique provides a generic solution for real time monitoring of binding events of fluorescently labelled ligands to surface immobilized probes. With the model assay for the detection of human immunodeficiency virus type 1 and 2 (HIV 1/2), we have been able to observe the amplification kinetics of five targets simultaneously and accommodate two additional hybridization controls with a simple instrument set-up. The ability to accommodate multiple controls and targets into a

  17. Competitive reporter monitored amplification (CMA)--quantification of molecular targets by real time monitoring of competitive reporter hybridization.

    Directory of Open Access Journals (Sweden)

    Thomas Ullrich

    Full Text Available BACKGROUND: State of the art molecular diagnostic tests are based on the sensitive detection and quantification of nucleic acids. However, currently established diagnostic tests are characterized by elaborate and expensive technical solutions hindering the development of simple, affordable and compact point-of-care molecular tests. METHODOLOGY AND PRINCIPAL FINDINGS: The described competitive reporter monitored amplification allows the simultaneous amplification and quantification of multiple nucleic acid targets by polymerase chain reaction. Target quantification is accomplished by real-time detection of amplified nucleic acids utilizing a capture probe array and specific reporter probes. The reporter probes are fluorescently labeled oligonucleotides that are complementary to the respective capture probes on the array and to the respective sites of the target nucleic acids in solution. Capture probes and amplified target compete for reporter probes. Increasing amplicon concentration leads to decreased fluorescence signal at the respective capture probe position on the array which is measured after each cycle of amplification. In order to observe reporter probe hybridization in real-time without any additional washing steps, we have developed a mechanical fluorescence background displacement technique. CONCLUSIONS AND SIGNIFICANCE: The system presented in this paper enables simultaneous detection and quantification of multiple targets. Moreover, the presented fluorescence background displacement technique provides a generic solution for real time monitoring of binding events of fluorescently labelled ligands to surface immobilized probes. With the model assay for the detection of human immunodeficiency virus type 1 and 2 (HIV 1/2, we have been able to observe the amplification kinetics of five targets simultaneously and accommodate two additional hybridization controls with a simple instrument set-up. The ability to accommodate multiple controls

  18. Microwave-assisted extraction of green coffee oil and quantification of diterpenes by HPLC.

    Science.gov (United States)

    Tsukui, A; Santos Júnior, H M; Oigman, S S; de Souza, R O M A; Bizzo, H R; Rezende, C M

    2014-12-01

    The microwave-assisted extraction (MAE) of 13 different green coffee beans (Coffea arabica L.) was compared to Soxhlet extraction for oil obtention. The full factorial design applied to the MAE, related to time and temperature parameters, allowed the development of a fast and mild methodology (10 min at 45°C) compared to a 4 h Soxhlet extraction. The quantification of the cafestol and kahweol diterpenes present in the coffee oil was monitored by HPLC/UV and showed satisfactory linearity (R² = 0.9979), precision (CV 3.7%) and recovery; the yield calculated on the diterpenes content for sample AT1 (Arabica green coffee) was six times higher than that obtained with the traditional Soxhlet method. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. FACE Analysis as a Fast and Reliable Methodology to Monitor the Sulfation and Total Amount of Chondroitin Sulfate in Biological Samples of Clinical Importance

    Directory of Open Access Journals (Sweden)

    Evgenia Karousou

    2014-06-01

    Full Text Available Glycosaminoglycans (GAGs), due to their hydrophilic character and high anionic charge densities, play important roles in various (pathophysiological processes. The identification and quantification of GAGs in biological samples and tissues could be useful prognostic and diagnostic tools in pathological conditions. Despite the noteworthy progress in the development of sensitive and accurate methodologies for the determination of GAGs, there is a significant lack in methodologies regarding sample preparation and reliable fast analysis methods enabling the simultaneous analysis of several biological samples. In this report, developed protocols for the isolation of GAGs in biological samples were applied to analyze various sulfated chondroitin sulfate- and hyaluronan-derived disaccharides using fluorophore-assisted carbohydrate electrophoresis (FACE). Applications to biologic samples of clinical importance include blood serum, lens capsule tissue and urine. The sample preparation protocol followed by FACE analysis allows quantification with an optimal linearity over the concentration range 1.0–220.0 µg/mL, affording a limit of quantitation of 50 ng of disaccharides. Validation of FACE results was performed by capillary electrophoresis and high performance liquid chromatography techniques.

  20. Development of a methodology for conducting an integrated HRA/PRA --

    International Nuclear Information System (INIS)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S.; Wreathall, J.; Cooper, S.E.

    1993-01-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  1. Development of a qPCR Method for the Identification and Quantification of Two Closely Related Tuna Species, Bigeye Tuna (Thunnus obesus) and Yellowfin Tuna (Thunnus albacares), in Canned Tuna.

    Science.gov (United States)

    Bojolly, Daline; Doyen, Périne; Le Fur, Bruno; Christaki, Urania; Verrez-Bagnis, Véronique; Grard, Thierry

    2017-02-01

    Bigeye tuna (Thunnus obesus) and yellowfin tuna (Thunnus albacares) are among the most widely used tuna species for canning purposes. Not only substitution but also mixing of tuna species is prohibited by the European regulation for canned tuna products. However, as juveniles of bigeye and yellowfin tunas are very difficult to distinguish, unintentional substitutions may occur during the canning process. In this study, two mitochondrial markers from NADH dehydrogenase subunit 2 and cytochrome c oxidase subunit II genes were used to identify bigeye tuna and yellowfin tuna, respectively, utilizing TaqMan qPCR methodology. Two different qPCR-based methods were developed to quantify the percentage of flesh of each species used for can processing. The first one was based on absolute quantification using standard curves realized with these two markers; the second one was founded on relative quantification with the universal 12S rRNA gene as the endogenous gene. On the basis of our results, we conclude that our methodology could be applied to authenticate these two closely related tuna species when used in a binary mix in tuna cans.
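
    The relative-quantification route described above (normalizing each species marker to the universal 12S rRNA endogenous gene) can be illustrated with a ΔCq-style calculation. The sketch below is a simplified stand-in, not the authors' validated method: the Cq values are invented and a common amplification efficiency is assumed for all assays.

```python
import numpy as np

# Illustrative quantification cycles (Cq) measured on a canned-tuna DNA extract.
# ND2 marker ~ bigeye tuna, COII marker ~ yellowfin tuna, 12S rRNA = endogenous
# reference amplifying both species. Values are invented for illustration.
cq = {"bigeye_ND2": 26.4, "yellowfin_COII": 24.9, "endogenous_12S": 22.0}

def normalized_quantity(cq_target, cq_reference, efficiency=1.0):
    """Relative quantity of a target normalized to the endogenous gene,
    assuming a common amplification efficiency E (1.0 = doubling every cycle)."""
    return (1.0 + efficiency) ** (cq_reference - cq_target)

q_bigeye = normalized_quantity(cq["bigeye_ND2"], cq["endogenous_12S"])
q_yellow = normalized_quantity(cq["yellowfin_COII"], cq["endogenous_12S"])

# For a binary mix, express each species as a share of the summed signal.
total = q_bigeye + q_yellow
print(f"bigeye share   : {100 * q_bigeye / total:.1f} %")
print(f"yellowfin share: {100 * q_yellow / total:.1f} %")
```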

  2. Validação de metodologia analítica para doseamento de soluções de lapachol por CLAE Validation of the analytical methodology for evaluation of lapachol in solution by HPLC

    Directory of Open Access Journals (Sweden)

    Said G. C. Fonseca

    2004-02-01

    Full Text Available Lapachol is a naphthoquinone found in several species of the Bignoniaceae family possessing mainly anticancer activity. The present work consists of the development and validation of analytical methodology for lapachol and its preparations. The results here obtained show that lapachol has a low quantification limit, that the analytical methodology is accurate, reproducible, robust and linear over the concentration range 0.5-100 µg/mL of lapachol.

  3. A multi-model assessment of the co-benefits of climate mitigation for global air quality

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Shilpa; Klimont, Zbigniew; Leitao, Joana; Riahi, Keywan; van Dingenen, Rita; Reis, Lara Aleluia; Calvin, Katherine; Dentener, Frank; Drouet, Laurent; Fujimori, Shinichiro; Harmsen, Mathijs; Luderer, Gunnar; Heyes, Chris; Strefler, Jessica; Tavoni, Massimo; van Vuuren, Detlef P.

    2016-12-01

    sector and region level. A second methodological advancement is a quantification of the co-benefits in terms of the associated atmospheric concentrations of fine particulate matter (PM2.5) and consequent mortality related outcomes across different models. This is made possible by the use of a state-of-the-art simplified atmospheric model that allows, for the first time, a computationally feasible multi-model evaluation of such outcomes.

  4. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    International Nuclear Information System (INIS)

    Olander, Lydia P; Wollenberg, Eva; Tubiello, Francesco N; Herold, Martin

    2014-01-01

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term. (paper)

  5. METHODOLOGICAL APPROACHES IN REALIZING AND APPLYING COST-BENEFIT ANALYSIS FOR THE INVESTMENT PROJECTS

    Directory of Open Access Journals (Sweden)

    Pelin Andrei

    2009-05-01

    Full Text Available Cost-benefit analysis represents the most frequent technique used for a rational allocation of resources. This modality of evaluating the expenditure programs is an attempt to measure the costs and gains of a community as a result of running the evaluated

  6. Value of information: A roadmap to quantifying the benefit of structural health monitoring

    DEFF Research Database (Denmark)

    Straub, D.; Chatzi, E.; Bismut, E.

    2017-01-01

    The concept of value of information (VoI) enables quantification of the benefits provided by structural health monitoring (SHM) systems – in principle. Its implementation is challenging, as it requires an explicit modelling of the structural system’s life cycle, in particular of the decisions that are taken based on the SHM information. In this paper, we approach the VoI analysis through an influence diagram (ID), which supports the modelling process. We provide a simple example for illustration and discuss challenges associated with real-life implementation.
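
    The VoI idea itself can be shown with a toy pre-posterior decision analysis: compare the expected life-cycle cost of the best action chosen without monitoring against the expected cost when decisions can react to an imperfect SHM indication. All probabilities and costs below are invented for illustration, and the example ignores the full influence-diagram machinery used in the paper.

```python
# Minimal value-of-information (VoI) sketch for an SHM decision problem.
# All probabilities and costs are invented for illustration only.

p_damage = 0.10          # prior probability that the structure is damaged
c_repair = 1.0           # cost of a (preventive) repair
c_failure = 10.0         # cost incurred if damage is left unattended

# --- Expected cost of the best action WITHOUT monitoring information ---
cost_do_nothing = p_damage * c_failure
cost_repair = c_repair
cost_without_info = min(cost_do_nothing, cost_repair)

# --- Expected cost WITH an (imperfect) SHM system ---
p_alarm_given_damage = 0.90      # detection probability
p_alarm_given_ok = 0.10          # false-alarm probability

p_alarm = (p_alarm_given_damage * p_damage
           + p_alarm_given_ok * (1.0 - p_damage))

def best_expected_cost(p_damage_posterior):
    """Choose the cheaper action given an updated damage probability."""
    return min(p_damage_posterior * c_failure, c_repair)

# Posterior damage probabilities after each possible SHM outcome (Bayes' rule).
p_damage_given_alarm = p_alarm_given_damage * p_damage / p_alarm
p_damage_given_quiet = ((1 - p_alarm_given_damage) * p_damage) / (1.0 - p_alarm)

cost_with_info = (p_alarm * best_expected_cost(p_damage_given_alarm)
                  + (1.0 - p_alarm) * best_expected_cost(p_damage_given_quiet))

voi = cost_without_info - cost_with_info
print(f"expected cost without SHM: {cost_without_info:.3f}")
print(f"expected cost with SHM   : {cost_with_info:.3f}")
print(f"value of information     : {voi:.3f}  (compare against SHM system cost)")
```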

  7. Counting the cost: estimating the economic benefit of pedophile treatment programs.

    Science.gov (United States)

    Shanahan, M; Donato, R

    2001-04-01

    The principal objective of this paper is to identify the economic costs and benefits of pedophile treatment programs incorporating both the tangible and intangible cost of sexual abuse to victims. Cost estimates of cognitive behavioral therapy programs in Australian prisons are compared against the tangible and intangible costs to victims of being sexually abused. Estimates are prepared that take into account a number of problematic issues. These include the range of possible recidivism rates for treatment programs; the uncertainty surrounding the number of child sexual molestation offences committed by recidivists; and the methodological problems associated with estimating the intangible costs of sexual abuse on victims. Despite the variation in parameter estimates that affect the cost-benefit analysis of pedophile treatment programs, it is found that the potential range of economic costs from child sexual abuse is substantial and the economic benefits to be derived from appropriate and effective treatment programs are high. Based on a reasonable set of parameter estimates, in-prison, cognitive therapy treatment programs for pedophiles are likely to be of net benefit to society. Despite this, a critical area of future research must include further methodological developments in estimating the quantitative impact of child sexual abuse in the community.

  8. A Practical Risk Assessment Methodology for Safety-Critical Train Control Systems

    Science.gov (United States)

    2009-07-01

    This project proposes a Practical Risk Assessment Methodology (PRAM) for analyzing railroad accident data and assessing the risk and benefit of safety-critical train control systems. This report documents in simple steps the algorithms and data input...

  9. Benefits of GRI R and D products placed in commercial use through early 1991

    International Nuclear Information System (INIS)

    Dombrowski, L.P.; Pine, G.D.; Rinholm, R.C.

    1992-02-01

    From GRI's inception in 1978 through early 1991, 170 GRI-sponsored research and development (R and D) products have been placed into commercial service. Twenty-four of these products were introduced between April 1990 and March 1991. Benefits have been quantified for 87 of the 170 items, and the calculated ratio of the benefits to gas customers to total GRI costs incurred through the end of 1990 is 4.9 to 1. The calculated internal rate of return to gas customers on their investment in GRI to date is 21.5 percent. When only the costs of completed, terminated, or deferred R and D are included, the benefit-to-cost ratio rises to 7.9 to 1, and the gas customer return on investment rises to 25 percent. The 4.9 to 1 benefit-to-cost ratio is greater than the ratio calculated in May 1990, primarily because of the quantification for the first time of the benefits of two groups of GRI information items: (1) items supporting the use of plastic pipe for gas distribution, and (2) items leading to a better understanding of mid-efficiency gas furnaces and their venting systems

  10. Critical points of DNA quantification by real-time PCR--effects of DNA extraction method and sample matrix on quantification of genetically modified organisms.

    Science.gov (United States)

    Cankar, Katarina; Stebih, Dejan; Dreo, Tanja; Zel, Jana; Gruden, Kristina

    2006-08-14

    Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to
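
    The standard-curve quantification that the reliability discussion above hinges on can be sketched as follows: regress Cq against the logarithm of known copy numbers, derive the amplification efficiency from the slope, and read unknowns off the fitted line. The dilution-series values below are invented for illustration.

```python
import numpy as np

# Hypothetical dilution series of a certified reference material:
# known target copy numbers and the quantification cycles (Cq) they produced.
copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
cq     = np.array([18.2, 21.6, 25.0, 28.4, 31.9])

# Standard curve: Cq = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(np.log10(copies), cq, deg=1)

# PCR efficiency follows from the slope (slope = -3.32 corresponds to E = 100 %).
efficiency = 10 ** (-1.0 / slope) - 1.0

def copies_from_cq(sample_cq):
    """Read an unknown sample off the standard curve."""
    return 10 ** ((sample_cq - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {100 * efficiency:.1f} %")
print(f"sample with Cq 23.3 -> {copies_from_cq(23.3):.0f} copies")
```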

  11. Requirements and benefits of flow forecasting for improving hydropower generation

    NARCIS (Netherlands)

    Dong, Xiaohua; Vrijling, J.K.; Dohmen-Janssen, Catarine M.; Ruigh, E.; Booij, Martijn J.; Stalenberg, B.; Hulscher, Suzanne J.M.H.; van Gelder, P.H.A.J.M.; Verlaan, M.; Zijderveld, A.; Waarts, P.

    2005-01-01

    This paper presents a methodology to identify the required lead time and accuracy of flow forecasting for improving hydropower generation of a reservoir, by simulating the benefits (in terms of electricity generated) obtained from the forecasting with varying lead times and accuracies. The

  12. La quantification en Kabiye: une approche linguistique | Pali ...

    African Journals Online (AJOL)

    ... which is denoted by lexical quantifiers. Quantification with specific reference is provided by different types of linguistic units (nouns, numerals, adjectives, adverbs, ideophones and verbs) in arguments/noun phrases and in the predicative phrase in the sense of Chomsky. Keywords: quantification, class, number, reference, ...

  13. Quantification of glucosylceramide in plasma of Gaucher disease patients

    Directory of Open Access Journals (Sweden)

    Maria Viviane Gomes Muller

    2010-12-01

    Full Text Available Gaucher disease is a sphingolipidosis that leads to an accumulation of glucosylceramide. The objective of this study was to develop a methodology, based on the extraction, purification and quantification of glucosylceramide from blood plasma, for use in clinical research laboratories. Comparison of the glucosylceramide content in plasma from Gaucher disease patients, submitted to enzyme replacement therapy or otherwise, against that from normal individuals was also carried out. The glucosylceramide, separated from other glycosphingolipids by high performance thin layer chromatography (HPTLC), was chemically developed (CuSO4/H3PO4) and the respective band confirmed by immunostaining (human anti-glucosylceramide antibody / peroxidase-conjugated secondary antibody). Chromatogram quantification by densitometry demonstrated that the glucosylceramide content in Gaucher disease patients was seventeen times higher than that in normal individuals, and seven times higher than that in patients on enzyme replacement therapy. The results obtained indicate that the methodology established can be used in complementary diagnosis and for treatment monitoring of Gaucher disease patients.

  14. Analytical methodologies based on LC–MS/MS for monitoring selected emerging compounds in liquid and solid phases of the sewage sludge

    OpenAIRE

    Boix Sales, Clara; Ibáñez Martínez, María; Fabregat-Safont, David; Morales, E.; Pastor, L.; Sancho Llopis, Juan Vicente; Sánchez-Ramírez, J. E.; Hernández Hernández, Félix

    2016-01-01

    In this work, two analytical methodologies based on liquid chromatography coupled to tandem mass spectrometry (LC–MS/MS) were developed for quantification of emerging pollutants identified in sewage sludge after a previous wide-scope screening. The target list included 13 emerging contaminants (EC): thiabendazole, acesulfame, fenofibric acid, valsartan, irbesartan, salicylic acid, diclofenac, carbamazepine, 4-aminoantipyrine (4- AA), 4-acetyl aminoantipyrine (4-AAA), 4-formyl amin...

  16. A proposed quantitative credit-rating methodology for South African provincial departments

    Directory of Open Access Journals (Sweden)

    Erika Fourie

    2016-05-01

    Full Text Available The development of subnational credit-rating methodologies affords benefits for subnationals, the sovereign and its citizens. Trusted credit ratings facilitate access to financial markets and above-average ratings allow for the negotiation of better collateral and guarantee agreements, as well as for funding of, for example, infrastructure projects at superior (lower) interest rates. This paper develops the quantitative section of a credit-rating methodology for South African subnationals. The unique characteristics of South African data, their assembly, and the selection of dependent and independent variables for the linear-regression model chosen, are discussed. The methodology is then applied to the provincial Department of Health using linear regression modelling.

  17. Development of extreme rainfall PRA methodology for sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Nishino, Hiroyuki; Kurisaka, Kenichi; Yamano, Hidemasa

    2016-01-01

    The objective of this study is to develop a probabilistic risk assessment (PRA) methodology for extreme rainfall with focusing on decay heat removal system of a sodium-cooled fast reactor. For the extreme rainfall, annual excess probability depending on the hazard intensity was statistically estimated based on meteorological data. To identify core damage sequence, event trees were developed by assuming scenarios that structures, systems and components (SSCs) important to safety are flooded with rainwater coming into the buildings through gaps in the doors and the SSCs fail when the level of rainwater on the ground or on the roof of the building becomes higher than thresholds of doors on first floor or on the roof during the rainfall. To estimate the failure probability of the SSCs, the level of water rise was estimated by comparing the difference between precipitation and drainage capacity. By combining annual excess probability and the failure probability of SSCs, the event trees led to quantification of core damage frequency, and therefore the PRA methodology for rainfall was developed. (author)
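
    The quantitative step, combining the annual excess (exceedance) probability of rainfall with the conditional failure probability from the event trees, can be sketched as a discrete hazard convolution. The hazard curve, intensity bins and conditional probabilities below are invented for illustration and are not the values from the study.

```python
import numpy as np

# Illustrative rainfall hazard: annual exceedance frequency (per year) that the
# hourly precipitation exceeds a given intensity (mm/h). Values are invented.
intensity_mm_h = np.array([50.0, 100.0, 150.0, 200.0, 250.0])
exceedance_per_yr = np.array([1e-1, 1e-2, 1e-3, 1e-4, 1e-5])

# Frequency of rainfall falling in each intensity bin (difference of exceedances).
bin_freq = exceedance_per_yr[:-1] - exceedance_per_yr[1:]

# Conditional core-damage probability for each bin, as an event tree would give it:
# probability that rainwater exceeds door/roof thresholds and the flooded
# decay-heat-removal SSCs fail. Invented numbers for illustration.
p_core_damage_given_bin = np.array([1e-6, 1e-4, 1e-2, 1e-1])

# Contribution of the open-ended top bin (> 250 mm/h), assumed here to lead to
# core damage with probability 0.5.
cdf = float(np.sum(bin_freq * p_core_damage_given_bin)) + exceedance_per_yr[-1] * 0.5

print(f"rainfall-induced core damage frequency ~ {cdf:.2e} per reactor-year")
```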

  18. The Risks and Benefits of Running Barefoot or in Minimalist Shoes

    OpenAIRE

    Perkins, Kyle P.; Hanney, William J.; Rothschild, Carey E.

    2014-01-01

    Context: The popularity of running barefoot or in minimalist shoes has recently increased because of claims of injury prevention, enhanced running efficiency, and improved performance compared with running in shoes. Potential risks and benefits of running barefoot or in minimalist shoes have yet to be clearly defined. Objective: To determine the methodological quality and level of evidence pertaining to the risks and benefits of running barefoot or in minimalist shoes. Data Sources: In Septem...

  19. Methodology and results of the seismic probabilistic safety assessment of Krsko nuclear power plant

    International Nuclear Information System (INIS)

    Vermaut, M.K.; Monette, P.; Campbell, R.D.

    1995-01-01

    A seismic IPEEE (Individual Plant Examination for External Events) was performed for the Krsko plant. The methodology adopted is the seismic PSA (Probabilistic Safety Assessment). The Krsko NPP is located on a medium to high seismicity site. The PSA study described here includes all the steps in the PSA sequence, i.e. reassessment of the site hazard, calculation of plant structures response including soil-structure interaction, seismic plant walkdowns, probabilistic seismic fragility analysis of plant structures and components, and quantification of seismic core damage frequency (CDF). Also relay chatter analysis and soil stability studies were performed. The seismic PSA described here is limited to the analysis of CDF (level I PSA). The subsequent determination and quantification of plant damage states, containment behaviour and radioactive releases to the outside (level 2 PSA) have been performed for the Krsko NPP but are not further described in this paper. The results of the seismic PSA study indicate that, with some upgrades suggested by the PSA team, the seismic induced CDF is comparable to that of most US and Western Europe NPPs. (author)
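
    A common way to combine a seismic hazard curve with a lognormal fragility into a core damage frequency is a discrete convolution over ground-motion bins. The sketch below illustrates that generic calculation with invented hazard and fragility parameters; it is not the Krsko plant model.

```python
import numpy as np
from math import erf, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Illustrative seismic hazard curve: annual frequency of exceeding a given
# peak ground acceleration (g). Values are invented, not site-specific data.
pga = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8, 1.0])
exceed_freq = np.array([1e-2, 3e-3, 1e-3, 4e-4, 1e-4, 3e-5, 1e-5])

# Illustrative composite plant-level fragility (lognormal): median capacity Am
# and logarithmic standard deviation beta.
Am, beta = 0.6, 0.4

def p_fail(a):
    """Conditional failure (core damage) probability at PGA = a."""
    return norm_cdf(log(a / Am) / beta)

# Discrete convolution: frequency of ground motions in each PGA bin times the
# conditional failure probability evaluated at the bin midpoint.
bin_freq = exceed_freq[:-1] - exceed_freq[1:]
bin_mid = 0.5 * (pga[:-1] + pga[1:])
cdf = float(np.sum(bin_freq * np.array([p_fail(a) for a in bin_mid])))
cdf += exceed_freq[-1] * p_fail(pga[-1])   # crude tail contribution beyond 1.0 g

print(f"seismic core damage frequency ~ {cdf:.2e} per reactor-year")
```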

  20. Quantification analysis of CT for aphasic patients

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, S.; Ooyama, H.; Hojo, K.; Tasaki, H.; Hanazono, T.; Sato, T.; Metoki, H.; Totsuka, M.; Oosumi, N.

    1987-02-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis).

  1. LMFBR safety criteria: cost-benefit considerations under the constraint of an a priori risk criterion

    International Nuclear Information System (INIS)

    Hartung, J.

    1979-01-01

    The role of cost-benefit considerations and a priori risk criteria as determinants of Core Disruptive Accident (CDA)-related safety criteria for large LMFBR's is explored with the aid of quantitative risk and probabilistic analysis methods. A methodology is described which allows a large number of design and siting alternatives to be traded off against each other with the goal of minimizing energy generation costs subject to the constraint of both an a priori risk criterion and a cost-benefit criterion. Application of this methodology to a specific LMFBR design project is described and the results are discussed. 5 refs

  2. Rapid quantification and sex determination of forensic evidence materials.

    Science.gov (United States)

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.

  3. Methodologies for tracking learning paths

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth; Gilje, Øystein; Lindstrand, Fredrik

    2009-01-01

    filmmakers: what furthers their interest and/or hinders it, and what learning patterns emerge. The aim of this article is to present and discuss issues regarding the methodology and methods of the study, such as developing a relationship with interviewees when conducting interviews online (using MSN). We suggest two considerations about using online interviews: how the interviewees value the given subject of conversation and their familiarity with being online. The benefit of getting online communication with the young filmmakers offers ease, because it is both practical and appropriates a meeting

  4. Itô-SDE MCMC method for Bayesian characterization of errors associated with data limitations in stochastic expansion methods for uncertainty quantification

    Science.gov (United States)

    Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.

    2017-11-01

    This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an alternative MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
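
    As a much-simplified illustration of sampling a Bayesian posterior with an SDE-based MCMC method, the sketch below runs an overdamped Langevin diffusion discretized with an explicit Euler-Maruyama step on a one-dimensional Gaussian toy posterior (the paper itself builds its sampler around a different Itô SDE and an implicit Euler discretization). The data and step size are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Bayesian problem: unknown mean theta with a standard-normal prior and a
# few noisy observations. The exact posterior is Gaussian, which lets us check
# the sampler. (Illustrative stand-in for the data-limited setting of the paper.)
data = np.array([0.8, 1.1, 0.6, 1.4])
sigma_obs = 0.5

def neg_log_post_grad(theta):
    """Gradient of the negative log posterior U(theta): prior + likelihood terms."""
    return theta + np.sum(theta - data) / sigma_obs**2

# Overdamped Langevin SDE  d(theta) = -U'(theta) dt + sqrt(2) dW, discretized
# with an explicit Euler-Maruyama step; its invariant law is (approximately)
# the posterior for small dt.
dt, n_steps, burn_in = 1e-3, 100_000, 10_000
theta, samples = 0.0, []
for k in range(n_steps):
    theta += -neg_log_post_grad(theta) * dt + np.sqrt(2.0 * dt) * rng.standard_normal()
    if k >= burn_in:
        samples.append(theta)

samples = np.array(samples)
post_var = 1.0 / (1.0 + len(data) / sigma_obs**2)
post_mean = post_var * np.sum(data) / sigma_obs**2
print(f"MCMC mean/std : {samples.mean():.3f} / {samples.std():.3f}")
print(f"exact mean/std: {post_mean:.3f} / {np.sqrt(post_var):.3f}")
```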

  5. Quantification of Human and Animal Viruses to Differentiate the Origin of the Fecal Contamination Present in Environmental Samples

    Directory of Open Access Journals (Sweden)

    Sílvia Bofill-Mas

    2013-01-01

    Full Text Available Many different viruses are excreted by humans and animals and are frequently detected in fecal contaminated waters causing public health concerns. Classical bacterial indicators such as E. coli and enterococci could fail to predict the risk for waterborne pathogens such as viruses. Moreover, the presence and levels of bacterial indicators do not always correlate with the presence and concentration of viruses, especially when these indicators are present in low concentrations. Our research group has proposed new viral indicators and methodologies for determining the presence of fecal pollution in environmental samples as well as for tracing the origin of this fecal contamination (microbial source tracking). In this paper, we examine to what extent these indicators have been applied by the scientific community. Recently, quantitative assays for quantification of poultry and ovine viruses have also been described. Overall, quantification by qPCR of human adenoviruses and human polyomavirus JC, porcine adenoviruses, bovine polyomaviruses, chicken/turkey parvoviruses, and ovine polyomaviruses is suggested as a toolbox for the identification of human, porcine, bovine, poultry, and ovine fecal pollution in environmental samples.

  6. Exploring the Benefits of Respite Services to Family Caregivers: Methodological Issues and Current Findings

    Science.gov (United States)

    Zarit, Steven H.; Liu, Yin; Bangerter, Lauren R.; Rovine, Michael J.

    2017-01-01

    Objectives There is growing emphasis on empirical validation of the efficacy of community-based services for older people and their families, but research on services such as respite care faces methodological challenges that have limited the growth of outcome studies. We identify problems associated with the usual research approaches for studying respite care, with the goal of stimulating use of novel and more appropriate research designs that can lead to improved studies of community-based services. Method Using the concept of research validity, we evaluate the methodological approaches in the current literature on respite services, including adult day services, in-home respite and overnight respite. Results Although randomized control trials (RCTs) are possible in community settings, validity is compromised by practical limitations of randomization and other problems. Quasi-experimental and interrupted time series designs offer comparable validity to RCTs and can be implemented effectively in community settings. Conclusion An emphasis on RCTs by funders and researchers is not supported by scientific evidence. Alternative designs can lead to development of a valid body of research on community services such as respite. PMID:26729467

  7. The relationship between external beam radiotherapy dose and chronic urinary dysfunction - A methodological critique

    International Nuclear Information System (INIS)

    Rosewall, Tara; Catton, Charles; Currie, Geoffrey; Bayley, Andrew; Chung, Peter; Wheat, Janelle; Milosevic, Michael

    2010-01-01

    Purpose: To perform a methodological critique of the literature evaluating the relationship between external beam radiotherapy dose/volume parameters and chronic urinary dysfunction to determine why consistent associations between dose and dysfunction have not been found. Methods and materials: The radiotherapy literature was reviewed using various electronic medical search engines with appropriate keywords and MeSH headings. Inclusion criteria comprised English-language articles, published between 1999 and June 2009, incorporating megavoltage external beam photons in standard-sized daily fractions. A methodological critique was then performed, evaluating the factors affecting the quantification of radiotherapy dose and chronic urinary dysfunction. Results: Nine of 22 eligible studies successfully identified a clinically and statistically significant relationship between dose and dysfunction. Accurate estimations of external beam radiotherapy dose were compromised by the frequent use of dosimetric variables which are poor surrogates for the dose received by the lower urinary tract tissue and do not incorporate the effect of daily variations in isocentre and bladder position. The precise categorization of chronic urinary dysfunction was obscured by reliance on subjective and aggregated toxicity metrics which vary over time. Conclusions: A high-level evidence-base for the relationship between external beam radiotherapy dose and chronic urinary dysfunction does not currently exist. The quantification of the actual external beam dose delivered to the functionally important tissues using dose accumulation strategies and the use of objective measures of individual manifestations of urinary dysfunction will assist in the identification of robust relationships between dose and urinary dysfunction for application in widespread clinical practice.

  8. Benefits of public roadside safety rest areas in Texas : technical report.

    Science.gov (United States)

    2011-05-01

    The objective of this investigation was to develop a benefit-cost analysis methodology for safety rest areas in : Texas and to demonstrate its application in select corridors throughout the state. In addition, this project : considered novel safety r...

  9. Cost-benefit analysis for combined heat and power plant

    International Nuclear Information System (INIS)

    Sazdovski, Ace; Fushtikj, Vangel

    2004-01-01

    The paper presents a methodology and practical application of cost-benefit analysis for a combined heat and power plant (cogeneration facility). The methodology includes up-to-date, real data for a cogeneration plant in accordance with the trends in development of CHP technology. As a case study, a CHP plant that could be built in the Republic of Macedonia is analyzed. The main economic parameters for project evaluation, such as NPV and IRR, are calculated for a number of possible scenarios. The analysis presents the economic outputs that could be used to decide whether the CHP project should be accepted for investment. (Author)
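
    NPV and IRR, the evaluation parameters named above, can be computed from a projected cash-flow series as in the sketch below. The cash flows and discount rate are invented for illustration and do not describe the analyzed plant.

```python
# Minimal NPV / IRR sketch for a cogeneration (CHP) investment appraisal.
# Cash flows are invented for illustration only.

cash_flows = [-12.0] + [1.8] * 15     # million EUR: investment, then 15 years of net benefit

def npv(rate, flows):
    """Net present value of a cash-flow series discounted at `rate`."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=-0.9, hi=1.0, tol=1e-6):
    """Internal rate of return found by bisection on the NPV sign change."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, flows) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"NPV at 8 % discount rate: {npv(0.08, cash_flows):6.2f} M EUR")
print(f"IRR                     : {100 * irr(cash_flows):6.2f} %")
```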

  10. POSSIBILITY OF IMPROVING EXISTING STANDARDS AND METHODOLOGIES FOR AUDITING INFORMATION SYSTEMS TO PROVIDE E-GOVERNMENT SERVICES

    Directory of Open Access Journals (Sweden)

    Евгений Геннадьевич Панкратов

    2014-03-01

    Full Text Available This article analyzes the existing methods of e-government systems audit, their shortcomings are examined.  The approaches to improve existing techniques and adapt them to the specific characteristics of e-government systems are suggested. The paper describes the methodology, providing possibilities of integrated assessment of information systems. This methodology uses systems maturity models and can be used in the construction of e-government rankings, as well as in the audit of their implementation process. Maturity models are based on COBIT, COSO methodologies and models of e-government, developed by the relevant committee of the UN. The methodology was tested during the audit of information systems involved in the payment of temporary disability benefits. The audit was carried out during analysis of the outcome of the pilot project for the abolition of the principle of crediting payments for disability benefits.DOI: http://dx.doi.org/10.12731/2218-7405-2014-2-5

  11. Cost/Benefit Prioritization for Advanced Safeguards Research and Development

    International Nuclear Information System (INIS)

    DeMuth, S.F.; Adeli, R.; Thomas, K.E.

    2008-01-01

    A system-level study utilizing the commercially available Extend™ software has been initiated to perform cost/benefit analyses for advanced safeguards research and development. The methodology is focused on estimating the standard error in the inventory difference (SEID) for reprocessing and fuel fabrication facilities, for various proposed advanced safeguards measurement technologies. The inventory duration, and consequently the number of inventories per year, is dictated by the need to detect a significant quantity of special nuclear material (SNM). Detection is limited by the cumulative measurement uncertainty for the entire system. The cost of inventories is then compared with the cost of advanced instrumentation and/or process design changes. Current progress includes development of the methodology; future efforts will be focused on ascertaining estimated costs and performance. Case studies will be provided as examples of the methodology. (author)
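
    A bare-bones version of the SEID calculation that drives this trade-off is shown below: each material-balance term contributes its measurement variance to the inventory difference, and the resulting detection limit is compared with a significant quantity. The amounts, uncertainties and the 3.3-sigma detection multiple are illustrative assumptions, and correlations between measurements are ignored.

```python
import numpy as np

# Simplified SEID (standard error in the inventory difference) sketch for one
# material balance period. Terms and uncertainties are invented for illustration.
terms_kg_pu = {          # (amount measured, relative standard uncertainty)
    "beginning inventory": (250.0, 0.005),
    "receipts":            (400.0, 0.004),
    "removals":            (395.0, 0.004),
    "ending inventory":    (253.0, 0.005),
}

# Each term contributes (amount * relative uncertainty)^2 to the ID variance.
seid = np.sqrt(sum((amount * rel) ** 2 for amount, rel in terms_kg_pu.values()))

significant_quantity = 8.0          # kg Pu
detection_limit = 3.3 * seid        # an assumed detection multiple of SEID

print(f"SEID            : {seid:.2f} kg Pu")
print(f"detection limit : {detection_limit:.2f} kg Pu "
      f"({'meets' if detection_limit < significant_quantity else 'exceeds'} 1 SQ goal)")
# If the goal is exceeded, either shorten the balance period (more inventories,
# higher operating cost) or improve measurement uncertainties (R&D cost) --
# comparing those two costs is the prioritization described above.
```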

  12. A High-Performance Embedded Hybrid Methodology for Uncertainty Quantification With Applications

    Energy Technology Data Exchange (ETDEWEB)

    Iaccarino, Gianluca

    2014-04-01

    Multiphysics processes modeled by a system of unsteady differential equations are naturally suited for partitioned (modular) solution strategies. We consider such a model where probabilistic uncertainties are present in each module of the system and represented as a set of random input parameters. A straightforward approach to quantifying uncertainties in the predicted solution would be to sample all the input parameters into a single set and treat the full system as a black box. Although this method is easily parallelizable and requires minimal modifications to the deterministic solver, it is blind to the modular structure of the underlying multiphysical model. On the other hand, spectral representations such as polynomial chaos expansions (PCE) can provide richer structural information regarding the dynamics of these uncertainties as they propagate from the inputs to the predicted output, but can be prohibitively expensive to implement in the high-dimensional global space of uncertain parameters. Therefore, we investigated hybrid methodologies wherein each module has the flexibility of using sampling or PCE based methods for capturing local uncertainties while maintaining accuracy in the global uncertainty analysis. For the latter case, we use a conditional PCE model which mitigates the curse of dimensionality associated with intrusive Galerkin or semi-intrusive pseudospectral methods. After formalizing the theoretical framework, we demonstrate our proposed method using a numerical viscous flow simulation and benchmark the performance against a solely Monte-Carlo method and a solely spectral method.
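
    To make the contrast between sampling and spectral (PCE) representations concrete, the sketch below builds a one-dimensional, non-intrusive PCE of a toy model with a single standard-normal input, computing the coefficients by Gauss-Hermite quadrature and checking the mean and variance against plain Monte Carlo. It is a generic illustration, not the hybrid conditional-PCE method of the report.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Toy model with one uncertain input xi ~ N(0, 1); an illustrative stand-in for
# one "module" of a multiphysics system.
def model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi**2

order = 6                              # PCE truncation order
nodes, weights = He.hermegauss(30)     # quadrature with weight exp(-x^2/2)

# Non-intrusive (pseudospectral) projection: c_n = E[model(xi) He_n(xi)] / n!
coeffs = []
for n in range(order + 1):
    basis_n = He.hermeval(nodes, [0] * n + [1])          # probabilists' Hermite He_n
    integral = np.sum(weights * model(nodes) * basis_n) / np.sqrt(2 * np.pi)
    coeffs.append(integral / math.factorial(n))
coeffs = np.array(coeffs)

# Mean and variance follow directly from the spectral coefficients.
pce_mean = coeffs[0]
pce_var = sum(math.factorial(n) * coeffs[n] ** 2 for n in range(1, order + 1))

# Cross-check against plain Monte Carlo sampling of the same model.
rng = np.random.default_rng(1)
mc = model(rng.standard_normal(200_000))
print(f"PCE  mean/var: {pce_mean:.4f} / {pce_var:.4f}")
print(f"MC   mean/var: {mc.mean():.4f} / {mc.var():.4f}")
```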

  13. Evaluation methodology for flood damage reduction by preliminary water release from hydroelectric dams

    Science.gov (United States)

    Ando, T.; Kawasaki, A.; Koike, T.

    2017-12-01

    IPCC AR5 (2014) reported that rainfall in the middle latitudes of the Northern Hemisphere has been increasing since 1901, and it is claimed that a warmer climate will increase the risk of floods. In contrast, world water demand is forecast to exceed sustainable supply by 40 percent by 2030. In order to avoid this expected water shortage, securing new water resources has become a major challenge. However, flood risk prevention and the securing of water resources are contradictory. To address this problem, existing hydroelectric dams can be used not only as energy resources but also for flood control. In Japan, however, hydroelectric dams bear no responsibility for flood control, and the benefits that could be gained by using them for this purpose, namely through preliminary water release, have not been examined. Therefore, our paper proposes a methodology for assessing those benefits. This methodology has three stages, as shown in Fig. 1. First, the RRI model is used to model flood events, taking account of the probability of rainfall. Second, flood damage is calculated from the assets in inundation areas multiplied by the inundation depths generated by the RRI model. Third, the losses stemming from preliminary water release are calculated and added to the flood damage to obtain the overall losses. The benefits can then be evaluated by changing the volume of preliminary release. As a result, shown in Fig. 2, the use of hydroelectric dams to control flooding creates benefits of 20 billion yen for the assumed maximum rainfall in the Oi River, Shizuoka Prefecture, Japan, given a three-day-ahead rainfall prediction. As the third priority of the Sendai Framework for Disaster Risk Reduction 2015-2030, 'investing in disaster risk reduction for resilience - public and private investment in disaster risk prevention and reduction through structural and non-structural measures' was adopted. The accuracy of rainfall prediction is the key factor in maximizing the benefits.
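
    The benefit evaluation reduces to an expected-value comparison: damage avoided by preliminary release versus the losses the release itself causes. The sketch below illustrates that bookkeeping with invented event probabilities, damages and costs; it does not reproduce the Oi River figures quoted above.

```python
# Illustrative benefit calculation for preliminary water release from a
# hydroelectric dam. All probabilities, damages and costs are invented.

rainfall_events = [        # (annual probability, flood damage without release,
                           #  flood damage with preliminary release), in billion yen
    (1 / 10,   5.0,  3.0),
    (1 / 50,  40.0, 25.0),
    (1 / 200, 90.0, 55.0),
]

# Expected cost of the release itself: lost generation plus the chance that the
# forecast was wrong and the reservoir cannot be refilled (water-supply loss).
p_forecast_triggers_release = 0.05          # per year
lost_generation = 0.6                       # billion yen per release
p_refill_failure = 0.2
refill_loss = 1.5                           # billion yen

avoided_damage = sum(p * (d_no - d_yes) for p, d_no, d_yes in rainfall_events)
release_cost = p_forecast_triggers_release * (
    lost_generation + p_refill_failure * refill_loss)

print(f"expected avoided flood damage : {avoided_damage:.3f} B yen / yr")
print(f"expected cost of releases     : {release_cost:.3f} B yen / yr")
print(f"net expected annual benefit   : {avoided_damage - release_cost:.3f} B yen / yr")
```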

  14. Wind power planning: assessing long-term costs and benefits

    International Nuclear Information System (INIS)

    Kennedy, Scott

    2005-01-01

    In the following paper, a new and straightforward technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic load duration curves to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. The model is applied to potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO 2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on CO 2 charges, and capital costs for wind turbines and IGCC plant is also discussed. The methodology is intended for use by energy planners in assessing the social benefit of future investments in wind power

  15. Uncertainty quantification in Eulerian-Lagrangian models for particle-laden flows

    Science.gov (United States)

    Fountoulakis, Vasileios; Jacobs, Gustaaf; Udaykumar, Hs

    2017-11-01

    A common approach to ameliorate the computational burden in simulations of particle-laden flows is to use a point-particle based Eulerian-Lagrangian model, which traces individual particles in their Lagrangian frame and models particles as mathematical points. The particle motion is determined by Stokes drag law, which is empirically corrected for Reynolds number, Mach number and other parameters. The empirical corrections are subject to uncertainty. Treating them as random variables renders the coupled system of PDEs and ODEs stochastic. An approach to quantify the propagation of this parametric uncertainty to the particle solution variables is proposed. The approach is based on averaging of the governing equations and allows for estimation of the first moments of the quantities of interest. We demonstrate the feasibility of our proposed methodology of uncertainty quantification of particle-laden flows on one-dimensional linear and nonlinear Eulerian-Lagrangian systems. This research is supported by AFOSR under Grant FA9550-16-1-0008.
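
    A minimal version of the sampling side of such an analysis is shown below: the empirical drag-correction factor is treated as a random variable, sampled, and propagated through a single point-particle velocity ODE to estimate first moments. The drag-law parameters and the assumed lognormal spread are illustrative, and the sketch uses brute-force Monte Carlo rather than the averaged-equation approach proposed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)

# One particle decelerating in quiescent gas under Stokes drag with an uncertain
# empirical correction factor f:  dv/dt = -f * (v - u_gas) / tau_p.
# The lognormal spread of f is an assumed toy uncertainty, not a validated model.
tau_p = 1e-3           # particle response time [s]
u_gas = 0.0            # carrier gas velocity [m/s]
v0 = 10.0              # initial particle velocity [m/s]
dt, t_end = 1e-5, 2e-3

def particle_velocity_history(f):
    """Explicit Euler integration of the point-particle drag law."""
    n = int(t_end / dt)
    v = np.empty(n + 1)
    v[0] = v0
    for k in range(n):
        v[k + 1] = v[k] - dt * f * (v[k] - u_gas) / tau_p
    return v

# Sample the uncertain drag correction and propagate it through the ODE.
samples = rng.lognormal(mean=0.0, sigma=0.1, size=500)
histories = np.array([particle_velocity_history(f) for f in samples])

t = np.linspace(0.0, t_end, histories.shape[1])
mean_v = histories.mean(axis=0)
std_v = histories.std(axis=0)
i = len(t) // 2
print(f"velocity at t = {t[i]*1e3:.1f} ms: {mean_v[i]:.2f} +/- {std_v[i]:.2f} m/s")
```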

  16. Considerations of health benefit-cost analysis for activities involving ionizing radiation exposure and alternatives. Technical report

    International Nuclear Information System (INIS)

    1977-01-01

    The report deals with development of methodology for the health benefit-cost analysis of activities that result in radiation exposure to humans. It attempts to frame the problems and to communicate the necessary elements of the complex technical process required for this method of analysis. The main thrust of the report is to develop a methodology for analyzing the benefits and costs of these activities. Application of the methodology is demonstrated for nuclear power production and medical uses of radiation, but no definitive analysis is attempted. The report concludes that benefit-cost analysis can be effectively applied to these applications and that it provides a basis for more informed governmental decision-making and for public participation in evaluating the issues of radiation exposure. It notes, however, that for cases where national policy is involved, decisions must inevitably be made on the basis of value judgements to which such analyses can make only limited contributions. An important conclusion is that a significant reduction in radiation exposure to the population is apparently achievable by development of methods for eliminating unproductive medical X-ray exposure

  17. Quantification of prominent volatile compounds responsible for muskmelon and watermelon aroma by purge and trap extraction followed by gas chromatography-mass spectrometry determination.

    Science.gov (United States)

    Fredes, Alejandro; Sales, Carlos; Barreda, Mercedes; Valcárcel, Mercedes; Roselló, Salvador; Beltrán, Joaquim

    2016-01-01

    A dynamic headspace purge-and-trap (DHS-P&T) methodology for the determination and quantification of 61 volatile compounds responsible for muskmelon and watermelon aroma has been developed and validated. The methodology is based on the application of purge-and-trap extraction followed by gas chromatography coupled to (ion trap) mass spectrometry detection. For this purpose two different P&T sorbent cartridges have been evaluated. The influence of different extraction factors (sample weight, extraction time, and purge flow) on extraction efficiency has been studied and optimised using response surface methodology. Precision, expressed as repeatability, has been evaluated by analysing six replicates of real samples, showing relative standard deviations between 3% and 27%. Linearity has been studied in the range of 10-6130 ng/mL depending on the compound response, showing coefficients of correlation between 0.995 and 0.999. Detection limits ranged between 0.1 and 274 ng/g. The methodology developed is well suited for analysis of large numbers of muskmelon and watermelon samples in plant breeding programs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Base line definitions and methodological lessons from Zimbabwe

    International Nuclear Information System (INIS)

    Maya, R.S.

    1995-01-01

    The UNEP Greenhouse Gas Abatement Costing Studies carried out under the management of the UNEP Collaborating Centre On Energy and Environment at Risoe National Laboratories in Denmark has placed effort in generating methodological approaches to assessing the cost of abatement activities to reduce CO 2 emissions. These efforts have produced perhaps the most comprehensive set of methodological approaches to defining and assessing the cost of greenhouse gas abatement. Perhaps the most importance aspect of the UNEP study which involved teams of researchers from ten countries is the mix of countries in which the studies were conducted and hence the representation of views and concepts from researchers in these countries particularly those from developing countries namely, Zimbabwe, India, Venezuela, Brazil, Thailand and Senegal. Methodological lessons from Zimbabwe, therefore, would have benefited from the interactions with methodological experiences form the other participating countries. Methodological lessons from the Zimbabwean study can be placed in two categories. One relates to the modelling of tools to analyze economic trends and the various factors studied in order to determine the unit cost of CO 2 abatement. The other is the definition of factors influencing the levels of emissions reducible and those realised under specific economic trends. (au)

  19. Quantification of atherosclerotic plaque activity and vascular inflammation using [18-F] fluorodeoxyglucose positron emission tomography/computed tomography (FDG-PET/CT).

    Science.gov (United States)

    Mehta, Nehal N; Torigian, Drew A; Gelfand, Joel M; Saboury, Babak; Alavi, Abass

    2012-05-02

    Conventional non-invasive imaging modalities of atherosclerosis such as coronary artery calcium (CAC) and carotid intimal medial thickness (C-IMT) provide information about the burden of disease. However, despite multiple validation studies of CAC and C-IMT, these modalities do not accurately assess plaque characteristics, and the composition and inflammatory state of the plaque determine its stability and, therefore, the risk of clinical events. [(18)F]-2-fluoro-2-deoxy-D-glucose (FDG) imaging using positron-emission tomography (PET)/computed tomography (CT) has been extensively studied in oncologic metabolism. Studies using animal models and immunohistochemistry in humans show that FDG-PET/CT is exquisitely sensitive for detecting macrophage activity, an important source of cellular inflammation in vessel walls. More recently, we and others have shown that FDG-PET/CT enables highly precise, novel measurements of the inflammatory activity of atherosclerotic plaques in large and medium-sized arteries. FDG-PET/CT studies have many advantages over other imaging modalities: 1) high contrast resolution; 2) quantification of plaque volume and metabolic activity allowing for multi-modal atherosclerotic plaque quantification; 3) dynamic, real-time, in vivo imaging; 4) minimal operator dependence. Finally, vascular inflammation detected by FDG-PET/CT has been shown to predict cardiovascular (CV) events independent of traditional risk factors and is also highly associated with overall burden of atherosclerosis. Plaque activity by FDG-PET/CT is modulated by known beneficial CV interventions such as short term (12 week) statin therapy as well as longer term therapeutic lifestyle changes (16 months). The current methodology for quantification of FDG uptake in atherosclerotic plaque involves measurement of the standardized uptake value (SUV) of an artery of interest and of the venous blood pool in order to calculate a target to background ratio (TBR), which is
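
    The TBR calculation described in the last sentence is arithmetically simple: average the venous blood-pool SUV to form a background and divide the arterial SUV measurements by it. The sketch below uses invented SUV values and a mean-of-max convention as one plausible choice; it is not a prescription of the authors' exact protocol.

```python
import numpy as np

# Illustrative target-to-background ratio (TBR) calculation for FDG-PET/CT
# quantification of arterial wall uptake. SUV values are invented.
arterial_wall_suv_max = np.array([2.1, 2.4, 1.9, 2.6, 2.2])   # per axial slice
venous_blood_pool_suv = np.array([1.1, 1.2, 1.0, 1.15, 1.1])  # blood-pool ROIs

background = venous_blood_pool_suv.mean()

# Slice-wise TBR, then averaged along the artery of interest.
tbr_per_slice = arterial_wall_suv_max / background
print(f"background SUV : {background:.2f}")
print(f"mean TBR       : {tbr_per_slice.mean():.2f}")
```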

  20. High level waste repository site suitability criteria. Environmental impact statement methodology

    International Nuclear Information System (INIS)

    1977-06-01

    The approach (methodology) which has been developed for the preparation of the environmental impact statement (EIS) is described. A suggested outline is presented for the High Level Waste Repository Site Suitability Criteria EIS together with a detailed description of the approach to be used in preparing the EIS. In addition, a methodology is presented by which the necessary cost/benefit/risk comparisons of alternative sets of site suitability criteria can be made. The TERA environmental research data bank, a computerized data bank which contained information on current and historical licensing activities for power plants, was modified to include information on generic or programmatic EIS related issues. The content of the modified data bank was utilized to develop the EIS outline presented in this report. The report recommends that a modified matrix evaluation approach be used to make the cost/benefit/risk comparisons. The suggested matrix is designed to facilitate between-criteria comparative analyses of economic, environmental, sociological and radiological risk factors. The quantitative compositing of dollar costs and benefits, environmental and sociological impacts, and radiological risks is to be performed using a semi-analytical, semi-visual procedure based on the concept of "decision surfaces."

  1. Millennial Expectations and Constructivist Methodologies: Their Corresponding Characteristics and Alignment

    Science.gov (United States)

    Carter, Timothy L.

    2008-01-01

    In recent years, much emphasis has been placed on constructivist methodologies and their potential benefit for learners of various ages (Brandt & Perkins, 2000; Brooks, 1990). Although certain aspects of the constructivist paradigm have replaced several aspects of the behaviorist paradigm for a large contingency of stakeholders (particularly,…

  2. Common methodological flaws in economic evaluations.

    Science.gov (United States)

    Drummond, Michael; Sculpher, Mark

    2005-07-01

    Economic evaluations are increasingly being used by those bodies such as government agencies and managed care groups that make decisions about the reimbursement of health technologies. However, several reviews of economic evaluations point to numerous deficiencies in the methodology of studies or the failure to follow published methodological guidelines. This article, written for healthcare decision-makers and other users of economic evaluations, outlines the common methodological flaws in studies, focussing on those issues that are likely to be most important when deciding on the reimbursement, or guidance for use, of health technologies. The main flaws discussed are: (i) omission of important costs or benefits; (ii) inappropriate selection of alternatives for comparison; (iii) problems in making indirect comparisons; (iv) inadequate representation of the effectiveness data; (v) inappropriate extrapolation beyond the period observed in clinical studies; (vi) excessive use of assumptions rather than data; (vii) inadequate characterization of uncertainty; (viii) problems in aggregation of results; (ix) reporting of average cost-effectiveness ratios; (x) lack of consideration of generalizability issues; and (xi) selective reporting of findings. In each case examples are given from the literature and guidance is offered on how to detect flaws in economic evaluations.

  3. Sampling-based nuclear data uncertainty quantification for continuous energy Monte-Carlo codes

    International Nuclear Information System (INIS)

    Zhu, T.

    2015-01-01

    Research on the uncertainty of nuclear data is motivated by practical necessity. Nuclear data uncertainties can propagate through nuclear system simulations into operation and safety related parameters. The tolerance for uncertainties in nuclear reactor design and operation can affect the economic efficiency of nuclear power, and essentially its sustainability. The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the choice of routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. For example, only recently in 2011, the capability of calculating continuous-energy κ_eff sensitivity to nuclear data was demonstrated in certain M-C codes by using the method of iterated fission probability. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance with presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool is based on this sampling approach and implemented to work with MCNPX’s ACE format of nuclear data, which also gives NUSS compatibility with MCNP and SERPENT M-C codes. In contrast, multigroup uncertainties are used for the sampling of ACE-formatted pointwise-energy nuclear data in a groupwise manner due to the more limited quantity and quality of nuclear data uncertainties. Conveniently, the usage of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same
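
    The sampling approach described here perturbs nuclear data according to presumed (multigroup) probability distributions, reruns the transport calculation for each sample, and attributes the spread of the outputs to nuclear data uncertainty. A minimal sketch of that outer loop is given below; the transport solver is replaced by a hypothetical placeholder (run_keff_calculation), and the group structure and standard deviations are assumptions for illustration, not NUSS internals.

        # Minimal sketch of sampling-based nuclear data uncertainty propagation,
        # with the transport code replaced by a hypothetical black-box function.
        import numpy as np

        rng = np.random.default_rng(seed=0)

        def run_keff_calculation(xs_factors):
            # Placeholder for the transport calculation (e.g., a run with perturbed cross sections).
            return 1.0 + 0.05 * xs_factors.mean()

        n_samples = 100
        n_groups = 44                        # assumed multigroup structure
        rel_std = 0.02 * np.ones(n_groups)   # assumed relative standard deviation per group

        keff = np.empty(n_samples)
        for i in range(n_samples):
            factors = rng.normal(loc=1.0, scale=rel_std)  # perturbation factor for each group
            keff[i] = run_keff_calculation(factors)

        print(f"mean k-eff = {keff.mean():.5f}, std = {keff.std(ddof=1):.5f}")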

  4. Sampling-based nuclear data uncertainty quantification for continuous energy Monte-Carlo codes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, T.

    2015-07-01

    Research on the uncertainty of nuclear data is motivated by practical necessity. Nuclear data uncertainties can propagate through nuclear system simulations into operation and safety related parameters. The tolerance for uncertainties in nuclear reactor design and operation can affect the economic efficiency of nuclear power, and essentially its sustainability. The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the choice of routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. For example, only recently in 2011, the capability of calculating continuous-energy κ_eff sensitivity to nuclear data was demonstrated in certain M-C codes by using the method of iterated fission probability. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance with presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool is based on this sampling approach and implemented to work with MCNPX’s ACE format of nuclear data, which also gives NUSS compatibility with MCNP and SERPENT M-C codes. In contrast, multigroup uncertainties are used for the sampling of ACE-formatted pointwise-energy nuclear data in a groupwise manner due to the more limited quantity and quality of nuclear data uncertainties. Conveniently, the usage of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same

  5. Quantification of the degradation of steels exposed to liquid lead-bismuth eutectic

    International Nuclear Information System (INIS)

    Schroer, C.; Voss, Z.; Novotny, J.; Konys, J.

    2006-05-01

    Metallographic and gravimetric methods of measuring the degradation of steels are introduced and compared, with emphasis on the quantification of oxidation in molten lead-bismuth eutectic (LBE). In future applications of LBE or other molten lead alloys, additions of oxygen should prevent the dissolution of steel constituents in the liquid heavy metal. Therefore, the amount of steel constituents transferred between the steel (including the oxide scale formed on the surface) and the LBE also has to be assessed, in order to evaluate the efficiency of oxygen additions with respect to preventing dissolution of the steel. For testing the methods of quantification, specimens of martensitic steel T91 were exposed for 1500 h to stagnant, oxygen-saturated LBE at 550 °C, whereby, applying both metallographic and gravimetric measurements, the recession of the cross-section of sound material deviated by ± 3 μm for a mean value of 11 μm. Although the transfer of steel constituents between the solid phases and the LBE is negligible under the considered exposure conditions, the investigation shows that a gravimetric analysis is most promising for quantifying such a mass transfer. For laboratory experiments on the behaviour of steels in oxygen-containing LBE, it is suggested to make provisions for both metallographic and gravimetric measurements, since both types of methods have specific benefits in the characterisation of the oxidation process. (Orig.)

  6. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for Analysis of External Events in the PSA of the Jose Cabrera NPP: In the Fire Analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, has been used. Generic cases were modelled and general conclusions obtained, applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The Flood Analysis methodology is based on the construction of event trees to represent flood propagation dependent on the condition of the communication paths between areas, and trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine the temporal evolution of the flood level in each area, the CAINZO-EA code has been developed, adapted to specific plant characteristics. In both the Fire and Flood Analyses a quantification methodology has been adopted, which consists of analysing the damage caused at each stage of growth or propagation and identifying, in the Internal Events models, the gates, basic events or headers to which certain failure (probability 1) due to damage is assigned. (Author)

  7. Integrating socio-economic dimensions in the ICRP cost-benefit model (a theoretical approach)

    International Nuclear Information System (INIS)

    Lochard, Jacques.

    1981-09-01

    This report aims at analysing, from a methodological point of view, the main problems associated with the integration of socio-economic dimensions into the cost-benefit model recommended by the ICRP in its Publication 26. After recalling the basic principles of cost-benefit analysis, the elements to be retained in the objective function characterizing the analysis and the question of social benefit definitions are discussed. The theory of social surplus is presented, with an illustration taken from the radiological protection field. [fr]

  8. Methodology for assessing the impacts of distributed generation interconnection

    Directory of Open Access Journals (Sweden)

    Luis E. Luna

    2011-06-01

    This paper proposes a methodology for identifying and assessing the impact of distributed generation interconnection on distribution systems using Monte Carlo techniques. The methodology consists of two analysis schemes: a technical analysis, which evaluates the reliability conditions of the distribution system, and an economic analysis, which evaluates the financial impacts on the electric utility and its customers according to the system reliability level. The proposed methodology was applied to an IEEE test distribution system, considering different operation schemes for the distributed generation interconnection. The application of each of these schemes provided significant reliability improvements and important economic benefits for the electric utility. However, such schemes resulted in negative profitability levels for certain customers; therefore, regulatory measures and bilateral contracts were proposed to provide a solution for this kind of problem.

  9. Application of Fuzzy Comprehensive Evaluation Method in Trust Quantification

    Directory of Open Access Journals (Sweden)

    Shunan Ma

    2011-10-01

    Trust can play an important role for the sharing of resources and information in open network environments. Trust quantification is thus an important issue in dynamic trust management. By considering the fuzziness and uncertainty of trust, in this paper, we propose a fuzzy comprehensive evaluation method to quantify trust along with a trust quantification algorithm. Simulation results show that the trust quantification algorithm that we propose can effectively quantify trust and the quantified value of an entity's trust is consistent with the behavior of the entity.

  10. Methodology or method? A critical review of qualitative case study reports

    Directory of Open Access Journals (Sweden)

    Nerida Hyett

    2014-05-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.

  11. Methodology or method? A critical review of qualitative case study reports

    Science.gov (United States)

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2014-01-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners. PMID:24809980

  12. Rapid and simple colorimetric method for the quantification of AI-2 produced from Salmonella Typhimurium.

    Science.gov (United States)

    Wattanavanitchakorn, Siriluck; Prakitchaiwattana, Cheunjit; Thamyongkit, Patchanita

    2014-04-01

    The aim of this study was to evaluate the feasibility of Fe(III) ion reduction for the simple and rapid quantification of autoinducer-2 (AI-2) produced from bacteria, using Salmonella Typhimurium as a model. Since the molecular structure of AI-2 is somewhat similar to that of ascorbic acid, it was expected that AI-2 would also act as a reducing agent and reduce Fe(III) ions in the presence of 1,10-phenanthroline to form the colored [(o-phen)3 Fe(II)]SO4 ferroin complex that could be quantified colorimetrically. In support of this, colony rinses and cell-free supernatants from cultures of all tested AI-2 producing strains, but not the AI-2 negative Sinorhizobium meliloti, formed a colored complex with a λmax of 510 nm. The OD510 values of these culture supernatants or colony rinses were in broad agreement with the % activity observed in the same samples using the standard Vibrio harveyi bioluminescence assay for AI-2 detection, and with previously reported results. This methodology could potentially be developed as an alternative method for the simple and rapid quantification of AI-2 levels produced in bacterial cultures. Copyright © 2014 Elsevier B.V. All rights reserved.
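
    Colorimetric quantification of this kind typically relies on a linear calibration between absorbance (here OD510) and analyte level. A minimal sketch of fitting and inverting such a calibration follows; the standard levels and readings are invented for illustration and are not data from the record.

        # Minimal sketch of a linear OD510 calibration curve; all numbers are illustrative.
        import numpy as np

        standards = np.array([0.0, 0.25, 0.5, 1.0, 2.0])       # assumed standard levels (arbitrary units)
        od510 = np.array([0.02, 0.11, 0.21, 0.40, 0.79])        # assumed absorbance readings

        slope, intercept = np.polyfit(standards, od510, 1)

        def od_to_level(od_value):
            """Invert the calibration to estimate the analyte level from an OD510 reading."""
            return (od_value - intercept) / slope

        print(round(od_to_level(0.30), 2))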

  13. Cost-Benefit Analysis of Rail-Noise Mitigation Programmes at European Level: Methodological Innovations from EURANO to STAIRRS

    OpenAIRE

    Aude Lenders; Nancy Da Silva; Walter Hecq; Baumgartner Thomas

    2001-01-01

    The STAIRRS project (2000-2002) is a follow-up of EURANO [1] and a Swiss study [2], in which the authors evaluated the efficiency of noise reduction measures in two European freight corridors. STAIRRS includes a cost-benefit analysis based on about 10,000 km of track modelled in seven countries. The benefits are defined in terms of the dB(A) experienced by those living in the rail corridors modelled. They are to be weighted by the number of persons benefiting each year from a noise reduction ...

  14. Cost analysis and ecological benefits of environmental recovery methodologies in bauxite mining

    OpenAIRE

    Guimarães, João Carlos Costa; Barros, Dalmo Arantes de; Pereira, José Aldo Alves; Silva, Rossi Allan; Oliveira, Antonio Donizette de; Borges, Luís Antônio Coimbra

    2013-01-01

    This work analyzed and compared three methods of environmental recovery in bauxite mining commonly used on the Poços de Caldas Plateau, MG, by means of recovery costs and ecological benefits. Earnings and cost data for environmental recovery activities were obtained for areas that belonged to the Companhia Geral de Minas – CGM, on properties sited in the city of Poços de Caldas, MG. The costs of these activities were used to compare the recovery methods by updating them monetarily to...

  15. Cost-Benefit Analysis of a Biomass Power Plant in Morocco and a Photovoltaic Installation in Algeria

    International Nuclear Information System (INIS)

    Galan, A.; Gonzalez Leal, J.; Varela, M.

    1999-01-01

    This report presents an overview of cost-benefit analysis general methodology, describing its principles and basic characteristics. This methodology was applied to two case studies analyzed in the project INTERSUDMED, one biomass power plant fed by energy crops in El Hajeb (Morocco) and the other a photovoltaic installation in Djanet (Algeria). Both cases have been selected among the ones analyzed in the INTERSUDMED Project because of their interesting social implications and possible alternatives, which make them most suitable for cost-benefit analysis application. Finally, this report addresses the conclusions of both studies and summarizes the most relevant obtained results. (Author) 13 refs

  16. Cost-Benefit Analysis of a Biomass Power Plant in Morocco and a Photovoltaic Installation in Algeria

    Energy Technology Data Exchange (ETDEWEB)

    Galan, A.; Gonzalez Leal, J.; Varela, M.

    1999-07-01

    This report presents an overview of cost-benefit analysis general methodology, describing its principles and basic characteristics. This methodology was applied to two case studies analyzed in the project INTERSUDMED, one biomass power plant fed by energy crops in El Hajeb (Morocco) and the other a photovoltaic installation in Djanet (Algeria). Both cases have been selected among the ones analyzed in the INTERSUDMED Project because of their interesting social implications and possible alternatives, which make them most suitable for cost-benefit analysis application. Finally, this report addresses the conclusions of both studies and summarizes the most relevant obtained results. (Author) 13 refs.

  17. Iron overload in the liver diagnostic and quantification

    Energy Technology Data Exchange (ETDEWEB)

    Alustiza, Jose M. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)]. E-mail: jmalustiza@osatek.es; Castiella, Agustin [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Juan, Maria D. de [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Emparanza, Jose I. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Artetxe, Jose [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Uranga, Maite [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)

    2007-03-15

    Hereditary Hemochromatosis is the most frequent form of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases with negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not yet been reached regarding the technique or the possibility of reproducing the same calculation method on different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification.

  18. Iron overload in the liver diagnostic and quantification

    International Nuclear Information System (INIS)

    Alustiza, Jose M.; Castiella, Agustin; Juan, Maria D. de; Emparanza, Jose I.; Artetxe, Jose; Uranga, Maite

    2007-01-01

    Hereditary Hemochromatosis is the most frequent form of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases with negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not yet been reached regarding the technique or the possibility of reproducing the same calculation method on different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification

  19. Front-face fluorescence spectroscopy combined with second-order multivariate algorithms for the quantification of polyphenols in red wine samples.

    Science.gov (United States)

    Cabrera-Bañegil, Manuel; Hurtado-Sánchez, María Del Carmen; Galeano-Díaz, Teresa; Durán-Merás, Isabel

    2017-04-01

    The potential of front-face fluorescence spectroscopy combined with second-order chemometric methods was investigated for the quantification of the main polyphenols present in wine samples. Parallel factor analysis (PARAFAC) and unfolded partial least squares coupled to residual bilinearization (U-PLS/RBL) were assessed for the quantification of catechin, epicatechin, quercetin, resveratrol, caffeic acid, gallic acid, p-coumaric acid, and vanillic acid in red wines. Excitation-emission matrices of different red wine samples, without pretreatment, were obtained in front-face mode, recording emission between 290 and 450 nm and exciting between 240 and 290 nm for the analysis of epicatechin, catechin, caffeic acid, gallic acid, and vanillic acid, and with excitation and emission between 300-360 and 330-400 nm, respectively, for the analysis of resveratrol. The U-PLS/RBL algorithm provided the best results, and this methodology was validated against an optimized liquid chromatographic procedure coupled to diode array and fluorimetric detectors, obtaining a very good correlation for vanillic acid, caffeic acid, epicatechin and resveratrol. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    Energy Technology Data Exchange (ETDEWEB)

    Laborda, Francisco, E-mail: flaborda@unizar.es; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-21

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  1. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    International Nuclear Information System (INIS)

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-01

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  2. Quantification of menadione from plasma and urine by a novel cysteamine-derivatization based UPLC-MS/MS method.

    Science.gov (United States)

    Yuan, Teng-Fei; Wang, Shao-Ting; Li, Yan

    2017-09-15

    Menadione, as a crucial component of the vitamin K family, possesses significant nutritional and clinical value. However, favourable quantification strategies for it have been lacking to date. For improvement, a novel cysteamine-derivatization-based UPLC-MS/MS method is presented in this work. The derivatization reaction proved to be non-toxic, easy to handle and highly efficient, enabling MS detection of menadione in positive mode. Benefitting from the excellent sensitivity of the derivatization product as well as the introduction of the stable isotope dilution technique, quantification could be achieved in the range of 0.05-50.0 ng/mL for plasma and urine matrixes with satisfactory accuracy and precision. After analysis of samples from healthy volunteers after oral administration of menadione sodium bisulfite tablets, urinary free menadione was quantified for the very first time. We believe the progress in this work could greatly promote the exploration of the metabolic mechanism of vitamin K in vivo. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  4. Quantification of isocyanates and amines in polyurethane foams and coated products by liquid chromatography–tandem mass spectrometry

    Science.gov (United States)

    Mutsuga, Motoh; Yamaguchi, Miku; Kawamura, Yoko

    2014-01-01

    An analytical method for the identification and quantification of 10 different isocyanates and 11 different amines in polyurethane (PUR) foam and PUR-coated products was developed and optimized. Isocyanates were extracted and derivatized with di-n-butylamine, while amines were extracted with methanol. Quantification was subsequently performed by liquid chromatography–tandem mass spectrometry. Using this methodology, residual levels of isocyanates and amines in commercial PUR products were quantified. Although the recoveries of certain isocyanates and amines were low, the main compounds used as monomers in the production of PUR products, and their decomposition species, were clearly identified at quantifiable levels. 2,4- and 2,6-toluenediisocyanate were detected in most PUR foam samples and a pastry bag in the range of 0.02–0.92 mg/kg, with their decomposition compounds, 2,4- and 2,6-toluenediamine, detected in all PUR foam samples in the range of 9.5–59 mg/kg. PUR-coated gloves are manufactured using 4,4′-methylenebisphenyl diisocyanate as the main raw material, and a large amount of this compound, in addition to 4,4′-methylenedianiline and dicyclohexylmethane-4,4′-diamine were found in these samples. PMID:24804074

  5. Development of sustainable performance indicators to assess the benefits of real-time monitoring in mechanised underground mining

    OpenAIRE

    Govindan, Rajesh; Cao, Wenzhuo; Korre, Anna; Durucan, Sevket; Graham, Peter; Simon, Clara; Barlow, Glenn; Pemberton, Ross

    2018-01-01

    This paper presents the development and quantification of a catalogue of Sustainable Performance Indicators (SPIs) for the assessment of the benefits real-time mining can offer in small and complex mechanised underground mining operations. The SPIs investigated in detail include: ‒ grade accuracy and error of the resource model, ‒ high/low grade ore classification accuracy and error, ‒ additional high grade ore identified per unit volume, ‒ profit expected per unit volume, ‒ or...

  6. Assessing the carbon benefit of saltmarsh restoration

    Science.gov (United States)

    Taylor, Benjamin; Paterson, David; Hanley, Nicholas

    2016-04-01

    The quantification of carbon sequestration rates in coastal ecosystems is required to better realise their potential role in climate change mitigation. Through accurate valuation this service can be fully appreciated and perhaps help facilitate efforts to restore vulnerable ecosystems such as saltmarshes. Vegetated coastal ecosystems are suggested to account for approximately 50% of oceanic sedimentary carbon despite their 2% areal extent. Saltmarshes, conservatively estimated to store 430 ± 30 Tg C in surface sediment deposits, have experienced extensive decline in the recent past, through processes such as land use change and coastal squeeze. Saltmarsh habitats offer a range of services that benefit society and the natural world, making their conservation meaningful and beneficial. The associated costs of restoration projects could, in part, be subsidised through payment for ecosystem services, specifically blue carbon. Additional storage is generated through the (re)vegetation of mudflat areas leading to an altered ecosystem state and function, providing similar benefits to natural saltmarsh areas. The Eden Estuary, Fife, Scotland has been a site of saltmarsh restoration since 2000, providing a temporal and spatial scale to evaluate these additional benefits. The study is being conducted to quantify the carbon benefit of restoration efforts and provide an insight into the evolution of this benefit through sites of different ages. Seasonal sediment deposition and settlement rates are measured across the estuary in mudflat, young planted saltmarsh, old planted saltmarsh and extant high marsh areas. Carbon values are derived from loss-on-ignition organic content values. Samples are taken across a tidal cycle on a seasonal basis, providing data on tidal influence, vegetation condition effects and climatic factors on sedimentation and carbon sequestration rates. These data will inform on the annual characteristics of sedimentary processes in the estuary and be

  7. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  8. Absolute quantification of superoxide dismutase in cytosol and mitochondria of mice hepatic cells exposed to mercury by a novel metallomic approach

    Energy Technology Data Exchange (ETDEWEB)

    García-Sevillano, M.A.; García-Barrera, T. [Department of Chemistry and Materials Science, Faculty of Experimental Sciences, University of Huelva, Campus de El Carmen, Huelva 21007 (Spain); Research Center on Health and Environment (CYSMA), University of Huelva (Spain); International Campus of Excellence on Agrofood (ceiA3), University of Huelva (Spain); Navarro, F. [International Campus of Excellence on Agrofood (ceiA3), University of Huelva (Spain); Department of Environmental Biology and Public Health, Cell Biology, Faculty of Experimental Sciences, University of Huelva, Campus El Carmen, Huelva 21007 (Spain); Gómez-Ariza, J.L., E-mail: ariza@uhu.es [Department of Chemistry and Materials Science, Faculty of Experimental Sciences, University of Huelva, Campus de El Carmen, Huelva 21007 (Spain); Research Center on Health and Environment (CYSMA), University of Huelva (Spain); International Campus of Excellence on Agrofood (ceiA3), University of Huelva (Spain)

    2014-09-09

    Highlights: • Identification and quantification of Cu,Zn-superoxide dismutase in mice hepatic cells. • IDA-ICP-MS is applied to obtain a high degree of accuracy, precision and sensitivity. • This methodology reduces the time of analysis and avoids clean-up procedures. • The application of this method to Hg-exposed mice reveals perturbations in Cu,Zn-SOD. - Abstract: In recent years, the development of new methods for the accurate and precise analysis of individual metalloproteins has become increasingly important, since numerous metalloproteins are excellent biomarkers of oxidative stress and diseases. Accordingly, methods based on the use of post-column isotope dilution analysis (IDA) or enriched protein standards are required to obtain a sufficient degree of accuracy, precision and low limits of detection. This paper reports the identification and absolute quantification of Cu,Zn-superoxide dismutase (Cu,Zn-SOD) in cytosol and mitochondria from mice hepatic cells using an innovative column-switching analytical approach. The method consisted of orthogonal chromatographic systems coupled to inductively coupled plasma mass spectrometry equipped with an octopole reaction system (ICP-ORS-MS) and UV detectors: size exclusion fractionation (SEC) of the cytosolic and mitochondrial extracts followed by online anion exchange chromatographic (AEC) separation of Cu/Zn containing species. After purification, Cu,Zn-SOD was identified after tryptic digestion by molecular mass spectrometry (MS). The MS/MS spectrum of a doubly charged peptide was used to obtain the sequence of the protein using the MASCOT search engine. This optimized methodology reduces the time of analysis and avoids the use of sample preconcentration and clean-up procedures, such as cut-off centrifugal filters, solid phase extraction (SPE), precipitation procedures, off-line fraction isolation, etc. In this sense, the method is robust, reliable and fast, with a typical chromatographic run time of less than 20 min

  9. Methodological challenges for the evaluation of clinical effectiveness in the context of accelerated regulatory approval: an overview.

    Science.gov (United States)

    Woolacott, Nerys; Corbett, Mark; Jones-Diette, Julie; Hodgson, Robert

    2017-10-01

    Regulatory authorities are approving innovative therapies with limited evidence. Although this level of data is sufficient for the regulator to establish an acceptable risk-benefit balance, it is problematic for downstream health technology assessment, where assessment of cost-effectiveness requires reliable estimates of effectiveness relative to existing clinical practice. Some key issues associated with a limited evidence base include the use of data from nonrandomized studies, from small single-arm trials, or from single-center trials, and the use of surrogate end points. We examined these methodological challenges through a pragmatic review of the available literature. Methods to adjust nonrandomized studies for confounding are imperfect. The relative treatment effect generated from single-arm trials is uncertain and may be optimistic. Single-center trial results may not be generalizable. Surrogate end points, on average, overestimate treatment effects. Current methods for analyzing such data are limited, and effectiveness claims based on these suboptimal forms of evidence are likely to be subject to significant uncertainty. Assessments of cost-effectiveness, based on the modeling of such data, are likely to be subject to considerable uncertainty. This uncertainty must not be underestimated by decision makers: methods for its quantification are required and schemes to protect payers from the cost of uncertainty should be implemented. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  10. Integrated Cost-Benefit Assessment of Customer-Driven Distributed Generation

    Directory of Open Access Journals (Sweden)

    Čedomir Zeljković

    2014-06-01

    Distributed generation (DG) has the potential to bring respectable benefits to electricity customers, distribution utilities and the community in general. Among the customer benefits, the most important are electricity bill reduction, reliability improvement, use of recovered heat, and qualification for financial incentives. In this paper, an integrated cost-benefit methodology for the assessment of customer-driven DG is presented. Target customers are the industrial and commercial end-users that are critically dependent on electricity supply due to high consumption, high peak power demand or high electricity supply reliability requirements. Stochastic inputs are represented by appropriate probability models, and Monte Carlo simulation is then employed for each investment alternative. The obtained probability distributions for the prospective profit are used to assess the risk, compare the alternatives and make decisions.
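
    The record's recipe is to model stochastic inputs with probability distributions, run Monte Carlo simulation per investment alternative, and judge risk from the resulting profit distribution. A minimal sketch follows; the cash-flow model, distributions and figures are placeholder assumptions, not values from the paper.

        # Minimal Monte Carlo sketch of a distributed-generation investment appraisal.
        # All distributions and the simple annual cash-flow model are assumptions.
        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_profit(n_trials=10_000, capex=500_000.0, years=10):
            energy_price = rng.normal(80.0, 10.0, size=(n_trials, years))        # €/MWh, assumed
            variable_cost = rng.normal(65.0, 5.0, size=(n_trials, years))        # €/MWh, assumed
            annual_output = rng.normal(5_000.0, 400.0, size=(n_trials, years))   # MWh/yr, assumed
            outage_cost_avoided = rng.normal(20_000.0, 5_000.0, size=(n_trials, years))  # €/yr, assumed
            annual_cash = (energy_price - variable_cost) * annual_output + outage_cost_avoided
            return annual_cash.sum(axis=1) - capex

        profit = simulate_profit()
        print(f"expected profit: {profit.mean():,.0f}")
        print(f"5th percentile (downside risk): {np.percentile(profit, 5):,.0f}")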

  11. Comparison of Suitability of the Most Common Ancient DNA Quantification Methods.

    Science.gov (United States)

    Brzobohatá, Kristýna; Drozdová, Eva; Smutný, Jiří; Zeman, Tomáš; Beňuš, Radoslav

    2017-04-01

    Ancient DNA (aDNA) extracted from historical bones is damaged and fragmented into short segments, present in low quantity, and usually copurified with microbial DNA. A wide range of DNA quantification methods are available. The aim of this study was to compare the five most common DNA quantification methods for aDNA. Quantification methods were tested on DNA extracted from skeletal material originating from an early medieval burial site. The tested methods included ultraviolet (UV) absorbance, real-time quantitative polymerase chain reaction (qPCR) based on SYBR® Green detection, real-time qPCR based on a forensic kit, quantification via fluorescent dyes bonded to DNA, and fragmentary analysis. Differences between groups were tested using a paired t-test. Methods that measure total DNA present in the sample (NanoDrop™ UV spectrophotometer and Qubit® fluorometer) showed the highest concentrations. Methods based on real-time qPCR underestimated the quantity of aDNA. The most accurate method of aDNA quantification was fragmentary analysis, which also allows DNA quantification of the desired length and is not affected by PCR inhibitors. Methods based on the quantification of the total amount of DNA in samples are unsuitable for ancient samples as they overestimate the amount of DNA presumably due to the presence of microbial DNA. Real-time qPCR methods give undervalued results due to DNA damage and the presence of PCR inhibitors. DNA quantification methods based on fragment analysis show not only the quantity of DNA but also fragment length.
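
    The record states that differences between quantification methods were tested with a paired t-test. A minimal sketch of such a comparison with SciPy follows; the concentration values are invented for illustration and are not data from the study.

        # Minimal sketch of a paired t-test comparing two DNA quantification methods
        # applied to the same extracts; the values are illustrative only.
        from scipy import stats

        method_a = [12.1, 8.4, 15.0, 9.7, 11.2]   # e.g., fluorometric readings (ng/uL), assumed
        method_b = [10.8, 7.9, 13.5, 9.1, 10.4]   # e.g., qPCR-based estimates (ng/uL), assumed

        t_stat, p_value = stats.ttest_rel(method_a, method_b)
        print(f"t = {t_stat:.2f}, p = {p_value:.3f}")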

  12. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Directory of Open Access Journals (Sweden)

    Žel Jana

    2006-08-01

    Background: Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results: Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was

  13. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Science.gov (United States)

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background: Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results: Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary
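
    Standard-curve quantification and the PCR-efficiency check mentioned here follow the usual relationships: Cq is linear in log10(copy number), and amplification efficiency is derived from the slope as E = 10^(-1/slope) - 1. A minimal sketch follows; the standard dilution values are illustrative, not data from the study.

        # Minimal sketch of qPCR standard-curve quantification and efficiency estimation.
        # Standard copy numbers and Cq values are illustrative only.
        import numpy as np

        copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
        cq = np.array([15.1, 18.5, 21.9, 25.3, 28.7])

        slope, intercept = np.polyfit(np.log10(copies), cq, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0          # ~1.0 corresponds to 100% efficiency

        def quantify(sample_cq):
            """Estimate the copy number of an unknown sample from its Cq via the standard curve."""
            return 10 ** ((sample_cq - intercept) / slope)

        print(f"efficiency = {efficiency:.2%}, estimated copies = {quantify(23.0):.0f}")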

  14. An Analysis of Information Asset Valuation (IAV) Quantification Methodology for Application with Cyber Information Mission Impact Assessment (CIMIA)

    National Research Council Canada - National Science Library

    Hellesen, Denzil L

    2008-01-01

    .... The IAV methodology proposes that accurate valuation for an Information Asset (InfoA) is the convergence of information tangible, intangible, and flow attributes to form a functional entity that enhances mission capability...

  15. Strategic alternatives ranking methodology: Multiple RCRA incinerator evaluation test case

    International Nuclear Information System (INIS)

    Baker, G.; Thomson, R.D.; Reece, J.; Springer, L.; Main, D.

    1988-01-01

    This paper presents an important process approach to permit quantification and ranking of multiple alternatives being considered in remedial actions or hazardous waste strategies. This process is a methodology for evaluating programmatic options in support of site selection or environmental analyses. Political or other less tangible motivations for alternatives may be quantified by means of establishing the range of significant variables, weighting their importance, and establishing specific criteria for scoring individual alternatives. An application of the process to a recent AFLC program permitted ranking incineration alternatives from a list of over 130 options. The process forced participation by the organizations to be affected, allowed a consensus of opinion to be achieved, allowed complete flexibility to evaluate factor sensitivity, and resulted in strong, quantifiable support for any subsequent site-selection action NEPA documents
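
    The ranking scheme described here amounts to a weighted scoring matrix: define the significant criteria, weight their importance, score each alternative against each criterion, and rank by the weighted sum. A minimal sketch follows; the criteria, weights and scores are hypothetical, not taken from the AFLC application.

        # Minimal weighted-scoring sketch for ranking alternatives; weights and scores are hypothetical.
        criteria_weights = {"cost": 0.4, "environmental_impact": 0.3, "schedule_risk": 0.3}

        alternatives = {
            "incinerator_A": {"cost": 7, "environmental_impact": 6, "schedule_risk": 8},
            "incinerator_B": {"cost": 5, "environmental_impact": 9, "schedule_risk": 6},
            "incinerator_C": {"cost": 8, "environmental_impact": 5, "schedule_risk": 7},
        }

        def weighted_score(scores):
            return sum(criteria_weights[c] * s for c, s in scores.items())

        ranking = sorted(alternatives.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
        for name, scores in ranking:
            print(f"{name}: {weighted_score(scores):.2f}")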

  16. Health effects assessment of chemical exposures: ARIES methodology

    Energy Technology Data Exchange (ETDEWEB)

    Sierra, L; Montero, M.; Rabago, I.; Vidania, R.

    1995-07-01

    In this work, we present an update of ARIES*, a system designed to facilitate the assessment of human health effects produced by accidental releases of toxic chemicals. The first version of ARIES was developed in relation to Directive 82/501/EEC on major accidents in the chemical industry. The first aim was therefore to support the assessment of effects for the chemicals included in this directive, considering acute exposures at high concentrations. In this report, we present the current methodology for considering other types of exposure, such as environmental and occupational exposures. As in other versions, the methodology comprises two approaches: quantitative and qualitative assessment. The quantitative assessment incorporates the mathematical algorithms needed to evaluate the effects produced by the most important routes of exposure (inhalation, ingestion, eye contact and skin absorption) in the short, medium and long term. Models that provide an accurate quantification of doses, effects and so on have been included, as well as simpler approaches for when the available information is not sufficient. The qualitative assessment, designed to complement or replace the quantitative one, is incorporated into a software system developed in Clipper. It retrieves and displays relevant toxicological information on about 100 chemicals. This information comes from the ECDIN (Environmental Chemicals Data and Information Network) database through a collaboration with the JRC-ISPRA working group. (Author) 24 refs.

  17. Health effects assessment of chemical exposures: ARIES methodology

    International Nuclear Information System (INIS)

    Sierra, L; Montero, M.; Rabago, I.; Vidania, R.

    1995-01-01

    In this work, we present an update of ARIES*, a system designed to facilitate the assessment of human health effects produced by accidental releases of toxic chemicals. The first version of ARIES was developed in relation to Directive 82/501/EEC on major accidents in the chemical industry. The first aim was therefore to support the assessment of effects for the chemicals included in this directive, considering acute exposures at high concentrations. In this report, we present the current methodology for considering other types of exposure, such as environmental and occupational exposures. As in other versions, the methodology comprises two approaches: quantitative and qualitative assessment. The quantitative assessment incorporates the mathematical algorithms needed to evaluate the effects produced by the most important routes of exposure (inhalation, ingestion, eye contact and skin absorption) in the short, medium and long term. Models that provide an accurate quantification of doses, effects and so on have been included, as well as simpler approaches for when the available information is not sufficient. The qualitative assessment, designed to complement or replace the quantitative one, is incorporated into a software system developed in Clipper. It retrieves and displays relevant toxicological information on about 100 chemicals. This information comes from the ECDIN (Environmental Chemicals Data and Information Network) database through a collaboration with the JRC-ISPRA working group. (Author) 24 refs

  18. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as computational simulation codes to which they are applied.
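
    The key idea in this record is that propagating uncertainty information at the lowest level of the simulation (an embedded ensemble) can improve memory access and expose fine-grained parallelism compared with running one sample at a time. The minimal sketch below only contrasts a sample-by-sample loop with a vectorized embedded-ensemble evaluation of a toy model; it is an illustration of the general idea, not the stochastic Galerkin implementation of the report.

        # Minimal sketch contrasting sample-by-sample propagation with an embedded-ensemble
        # (vectorized) propagation; the "simulation" is a toy placeholder.
        import numpy as np

        rng = np.random.default_rng(2)
        samples = rng.normal(1.0, 0.1, size=1024)   # samples of an uncertain input parameter

        def model_scalar(x):
            return np.sin(x) + x ** 2                # toy response, evaluated one sample at a time

        def model_ensemble(x):
            return np.sin(x) + x ** 2                # same kernel applied to the whole ensemble at once

        outputs_loop = np.array([model_scalar(x) for x in samples])   # conventional outer loop
        outputs_vec = model_ensemble(samples)                          # embedded ensemble propagation

        assert np.allclose(outputs_loop, outputs_vec)
        print(f"mean = {outputs_vec.mean():.4f}, std = {outputs_vec.std(ddof=1):.4f}")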

  19. The ecosystem services valuation tool and its future developments

    OpenAIRE

    Liekens, Inge; Broekx, Steven; Smeets, Nele; Staes, Jan; Biest, Van der, Katrien; Schaafsma, Marije; Nocker, De, Leo; Meire, Patrick; Cerulus, Tanya

    2014-01-01

    Abstract: Although methodologies for classification, quantification, and valuation of ecosystem services are improving drastically, applications of the ecosystem services concept in day-to-day decision-making processes remain limited, especially at the planning level. Nevertheless, spatial planning decisions would benefit from systematic considerations of their effects on ecosystem services. Assessing the impacts of policy on a wide range of ecosystem services contributes to more cost-effecti...

  20. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.

  1. A risk-informed approach of quantification of epistemic uncertainty for the long-term radioactive waste disposal. Improving reliability of expert judgements with an advanced elicitation procedure

    International Nuclear Information System (INIS)

    Sugiyama, Daisuke; Chida, Taiji; Fujita, Tomonari; Tsukamoto, Masaki

    2011-01-01

    A methodology for the quantification of epistemic uncertainty by expert judgement, based on the risk-informed approach, is developed to assess inevitable uncertainty in the long-term safety assessment of radioactive waste disposal. The proposed method employs logic-tree techniques, by which options of models and/or scenarios are identified, and Evidential Support Logic (ESL), by which the possibility of each option is quantified. In this report, the effect of a feedback process of discussion between experts and input of state-of-the-art knowledge in the proposed method is discussed, in order to estimate the alteration of the distribution of expert judgements, which is one of the factors causing uncertainty. In a preliminary experiment quantifying the uncertainty of degradation of the engineered barrier materials in a tentative sub-surface disposal using the proposed methodology, experts themselves modified questions appropriately to facilitate sound judgements and to correlate them clearly with scientific evidence. The result suggests that the method effectively improves the confidence of expert judgement. Also, the degree of consensus of expert judgement was somewhat improved in some cases, since scientific knowledge and information from expert judgements in other fields became common understanding. It is suggested that the proposed method could facilitate consensus on uncertainty between interested parties. (author)

  2. The Quantification Process for the PRiME-U34i

    International Nuclear Information System (INIS)

    Hwang, Mee-Jeong; Han, Sang-Hoon; Yang, Joon-Eon

    2006-01-01

    In this paper, we introduce the quantification process for PRiME-U34i, which is the merged model of ETs (Event Trees) and FTs (Fault Trees) for the level 1 internal PSA of UCN 3 and 4. PRiME-U34i has one top event. Therefore, the quantification process is simplified compared to the previous one. In the past, we used a text file called a user file to control the quantification process. However, this user file is so complicated that it is difficult for a non-expert to understand it. Moreover, in the past PSA, the ETs and FTs were separate, but in PRiME-U34i they are merged together, so the quantification process is different. This paper is composed of five sections. In section 2, we introduce the construction of the one-top model. Section 3 shows the quantification process used in PRiME-U34i. Section 4 describes the post-processing. The last section presents the conclusions
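
    Quantification of a single-top-event PSA model of this kind is typically performed over minimal cut sets, most simply with the rare-event approximation (summing, over cut sets, the product of the basic-event probabilities). A minimal sketch of that calculation follows; the basic events, probabilities and cut sets are hypothetical and are not taken from the UCN 3 and 4 model.

        # Minimal sketch of minimal-cut-set quantification with the rare-event approximation.
        # Basic events, probabilities, and cut sets are hypothetical.
        from math import prod

        basic_event_prob = {"PUMP_A_FAILS": 1e-3, "PUMP_B_FAILS": 1e-3,
                            "DG_FAILS_TO_START": 5e-2, "OPERATOR_ERROR": 1e-2}

        minimal_cut_sets = [
            {"PUMP_A_FAILS", "PUMP_B_FAILS"},
            {"DG_FAILS_TO_START", "OPERATOR_ERROR"},
        ]

        top_event_prob = sum(prod(basic_event_prob[e] for e in cs) for cs in minimal_cut_sets)
        print(f"top event probability (rare-event approximation): {top_event_prob:.2e}")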

  3. Assessing Personality and Mood With Adjective Check List Methodology: A Review

    Science.gov (United States)

    Craig, Robert J.

    2005-01-01

    This article addresses the benefits and problems in using adjective check list methodology to assess personality. Recent developments in this assessment method are reviewed, emphasizing seminal adjective-based personality tests (Gough's Adjective Check List), mood tests (Lubin's Depressive Adjective Test, Multiple Affect Adjective Check List),…

  4. The applicability of real-time PCR in the diagnostic of cutaneous leishmaniasis and parasite quantification for clinical management: Current status and perspectives.

    Science.gov (United States)

    Moreira, Otacilio C; Yadon, Zaida E; Cupolillo, Elisa

    2017-09-29

    Cutaneous leishmaniasis (CL) is spread worldwide and is the most common manifestation of leishmaniasis. Diagnosis is performed by combining clinical and epidemiological features, and through the detection of Leishmania parasites (or DNA) in tissue specimens or through parasite isolation in culture medium. Diagnosis of CL is challenging, reflecting the pleomorphic clinical manifestations of this disease. Skin lesions vary in severity, clinical appearance, and duration, and in some cases, they can be indistinguishable from lesions related to other diseases. Over the past few decades, PCR-based methods, including real-time PCR assays, have been developed for Leishmania detection, quantification and species identification, improving the molecular diagnosis of CL. This review provides an overview of many real-time PCR methods reported for the diagnostic evaluation of CL and some recommendations for the application of these methods for quantification purposes in clinical management and epidemiological studies. Furthermore, the use of real-time PCR for Leishmania species identification is also presented. The advantages of real-time PCR protocols are numerous, including increased sensitivity and specificity and simpler standardization of diagnostic procedures. However, despite the numerous assays described, there is still no consensus regarding the methods employed. Furthermore, the analytical and clinical validation of CL molecular diagnosis has not followed international guidelines so far. A consensus methodology comprising a DNA extraction protocol with an exogenous quality control and an internal reference to normalize parasite load is still needed. In addition, the analytical and clinical performance of any consensus methodology must be accurately assessed. This review shows that a standardization initiative is essential to guide researchers and clinical laboratories towards the achievement of a robust and reproducible methodology, which will permit further evaluation

  5. Artifacts Quantification of Metal Implants in MRI

    Science.gov (United States)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculating the image gradient. The artifact extent is then quantified as an image-area percentage using an automated cross-entropy thresholding method. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman’s rho = 0.62 and 0.802 for the titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
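
    A minimal sketch of the kind of pipeline the abstract describes, assuming a 2D magnitude image as a NumPy array and using scikit-image's minimum cross-entropy (Li) threshold as the automated cross-entropy step; the function and variable names are illustrative, not taken from the study.

```python
# Minimal sketch (not the authors' code): quantify susceptibility-artifact
# extent as the percentage of image area whose gradient magnitude exceeds
# an automated minimum cross-entropy (Li) threshold.
import numpy as np
from skimage import filters

def artifact_area_percentage(image: np.ndarray) -> float:
    """Return artifact extent as a percentage of the image area."""
    # Capture abrupt signal alterations with the gradient magnitude.
    gy, gx = np.gradient(image.astype(float))
    grad_mag = np.hypot(gx, gy)
    # Automated cross-entropy threshold separates artifact edges
    # from background variation.
    thresh = filters.threshold_li(grad_mag)
    artifact_mask = grad_mag > thresh
    return 100.0 * artifact_mask.mean()
```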

  6. Methodology for quantification of radionuclides used in therapy by bioanalysis 'in vitro'

    International Nuclear Information System (INIS)

    Juliao, Ligia M.Q.C.; Sousa, Wanderson O.; Mesquita, Sueli A.; Santos, Maristela S.; Oliveira, S.M. Velasques de

    2008-01-01

    In Brazil, the radionuclides used for therapy, either routinely or experimentally, are 131I, 153Sm, 90Y and 177Lu. The quantification of the radiopharmaceutical activity excreted by the patient through the bioassay method can be an important tool for individualized dosimetry, aiming at the planning of subsequent therapies. The Bioanalysis In Vitro Laboratory (LBIOVT) of the Service of Individual Monitoring (SEMIN) of the Institute for Radiation Protection and Dosimetry (IRD/CNEN-RJ), Brazil, has equipment and procedures for gamma and beta spectrometry. These detection systems are calibrated in energy and efficiency using standard reference sources provided by the National Laboratory of Metrology of Ionizing Radiation (LMNRI/IRD/CNEN-RJ). The LBIOVT quality system follows the guidelines of the ISO-ABNT-17025 standard and, annually, the laboratory participates in national (PNI) and international (PROCORAD) intercomparison programs. With respect to the excreta samples from patients, these are collected immediately after administration of the radiopharmaceutical. During the first 24 hours they are collected with the patient hospitalized and, depending upon the physical half-life of the radionuclide, they can also be collected at the patient's home. Both in hospital and at home, the excreta are handled, stored and transported in accordance with standards for clinical research, radiation protection and transport of radioactive and biological materials. The specific activity of the radionuclide is referenced to the date and time of collection, allowing further evaluation of the individual biological half-life. Care with the registration of excreted volumes, as well as possible loss of excreta during collection, may affect the interpretation of the measurements, since the results are provided as specific activity (Bq/L). Regarding the bioassay laboratory, these results are reliable when the laboratory is certified and participates in intercomparison programs of measures and methods. The laboratory

  7. Lesion detection and quantification performance of the Tachyon-I time-of-flight PET scanner: phantom and human studies

    Science.gov (United States)

    Zhang, Xuezhu; Peng, Qiyu; Zhou, Jian; Huber, Jennifer S.; Moses, William W.; Qi, Jinyi

    2018-03-01

    The first generation Tachyon PET (Tachyon-I) is a demonstration single-ring PET scanner that reaches a coincidence timing resolution of 314 ps using LSO scintillator crystals coupled to conventional photomultiplier tubes. The objective of this study was to quantify the improvement in both lesion detection and quantification performance resulting from the improved time-of-flight (TOF) capability of the Tachyon-I scanner. We developed a quantitative TOF image reconstruction method for the Tachyon-I and evaluated its TOF gain for lesion detection and quantification. Scans of either a standard NEMA torso phantom or healthy volunteers were used as the normal background data. Separately scanned point source and sphere data were superimposed onto the phantom or human data after accounting for the object attenuation. We used the bootstrap method to generate multiple independent noisy datasets with and without a lesion present. The signal-to-noise ratio (SNR) of a channelized hotelling observer (CHO) was calculated for each lesion size and location combination to evaluate the lesion detection performance. The bias versus standard deviation trade-off of each lesion uptake was also calculated to evaluate the quantification performance. The resulting CHO-SNR measurements showed improved performance in lesion detection with better timing resolution. The detection performance was also dependent on the lesion size and location, in addition to the background object size and shape. The results of bias versus noise trade-off showed that the noise (standard deviation) reduction ratio was about 1.1–1.3 over the TOF 500 ps and 1.5–1.9 over the non-TOF modes, similar to the SNR gains for lesion detection. In conclusion, this Tachyon-I PET study demonstrated the benefit of improved time-of-flight capability on lesion detection and ROI quantification for both phantom and human subjects.

  8. Accounting for between-study variation in incremental net benefit in value of information methodology.

    Science.gov (United States)

    Willan, Andrew R; Eckermann, Simon

    2012-10-01

    Previous applications of value of information methods for determining optimal sample size in randomized clinical trials have assumed no between-study variation in mean incremental net benefit. By adopting a hierarchical model, we provide a solution for determining optimal sample size with this assumption relaxed. The solution is illustrated with two examples from the literature. Expected net gain increases with increasing between-study variation, reflecting the increased uncertainty in incremental net benefit and the reduced extent to which data are borrowed from previous evidence. Hence, a new trial can become optimal in situations where current evidence would be considered sufficient under the assumption of no between-study variation. However, despite the expected net gain increasing, the optimal sample size in the illustrated examples is relatively insensitive to the amount of between-study variation. Further percentage losses in expected net gain were small even when choosing sample sizes that reflected widely different between-study variation. Copyright © 2011 John Wiley & Sons, Ltd.

  9. Meeting the challenges in the development of risk-benefit assessment of foods

    DEFF Research Database (Denmark)

    Nauta, Maarten; Andersen, Rikke; Pilegaard, Kirsten

    2018-01-01

    Background: Risk-benefit assessment (RBA) of foods aims to assess the combined negative and positive health effects associated with food intake. RBAs integrate chemical and microbiological risk assessment with risk and benefit assessment in nutrition. Scope and Approach: Based on the past experiences ... challenges are identified and discussed. They include the variety of different definitions and terminologies used in the underlying research disciplines, the differences between the “bottom-up” and the “top-down” approaches and the need for clear risk-benefit questions. The frequent lack of data ... interdisciplinary consensus, reconsideration of methodological approaches and health metrics based on a categorisation of risk-benefit questions, and the performance of case studies to experience the feasibility of the proposed approaches.

  10. Quantification of trace-level DNA by real-time whole genome amplification.

    Science.gov (United States)

    Kang, Min-Jung; Yu, Hannah; Kim, Sook-Kyung; Park, Sang-Ryoul; Yang, Inchul

    2011-01-01

    Quantification of trace amounts of DNA is a challenge in analytical applications where the concentration of a target DNA is very low or only limited amounts of samples are available for analysis. PCR-based methods including real-time PCR are highly sensitive and widely used for quantification of low-level DNA samples. However, ordinary PCR methods require at least one copy of a specific gene sequence for amplification and may not work for a sub-genomic amount of DNA. We suggest a real-time whole genome amplification method adopting the degenerate oligonucleotide primed PCR (DOP-PCR) for quantification of sub-genomic amounts of DNA. This approach enabled quantification of sub-picogram amounts of DNA independently of their sequences. When the method was applied to human placental DNA whose amount was accurately determined by inductively coupled plasma-optical emission spectroscopy (ICP-OES), an accurate and stable quantification capability for DNA samples ranging from 80 fg to 8 ng was obtained. In blind tests of laboratory-prepared DNA samples, measurement accuracies of 7.4%, -2.1%, and -13.9% with analytical precisions around 15% were achieved for 400-pg, 4-pg, and 400-fg DNA samples, respectively. A similar quantification capability was also observed for other DNA species from calf, E. coli, and lambda phage. Therefore, when provided with an appropriate standard DNA, the suggested real-time DOP-PCR method can be used as a universal method for quantification of trace amounts of DNA.
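
    To make the quantification step concrete, the sketch below shows how a standard-curve read-out of this kind is typically computed from real-time quantification cycles (Cq); the Cq values and masses are invented placeholders, not data from the study.

```python
# Illustrative sketch only: quantify an unknown DNA sample from real-time
# (DOP-)PCR quantification cycles using a log-linear standard curve.
# The standard Cq values below are hypothetical, not data from the study.
import numpy as np

standard_mass_g = np.array([8e-9, 8e-10, 8e-11, 8e-12, 8e-13, 8e-14])  # 8 ng .. 80 fg
standard_cq = np.array([14.1, 17.5, 20.9, 24.3, 27.8, 31.2])           # hypothetical Cq values

# Cq is approximately linear in log10(input mass): Cq = slope*log10(m) + intercept
slope, intercept = np.polyfit(np.log10(standard_mass_g), standard_cq, 1)

def mass_from_cq(cq: float) -> float:
    """Interpolate input DNA mass (g) from a measured quantification cycle."""
    return 10 ** ((cq - intercept) / slope)

print(f"Estimated mass: {mass_from_cq(23.0):.2e} g")
```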

  11. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  12. Methodological quality of systematic reviews on influenza vaccination.

    Science.gov (United States)

    Remschmidt, Cornelius; Wichmann, Ole; Harder, Thomas

    2014-03-26

    There is a growing body of evidence on the risks and benefits of influenza vaccination in various target groups. Systematic reviews are of particular importance for policy decisions. However, their methodological quality can vary considerably. To investigate the methodological quality of systematic reviews on influenza vaccination (efficacy, effectiveness, safety) and to identify influencing factors. A systematic literature search on systematic reviews on influenza vaccination was performed, using MEDLINE, EMBASE and three additional databases (1990-2013). Review characteristics were extracted and the methodological quality of the reviews was evaluated using the assessment of multiple systematic reviews (AMSTAR) tool. U-test, Kruskal-Wallis test, chi-square test, and multivariable linear regression analysis were used to assess the influence of review characteristics on the AMSTAR score. Forty-six systematic reviews fulfilled the inclusion criteria. Average methodological quality was high (median AMSTAR score: 8), but variability was large (AMSTAR range: 0-11). Quality did not differ significantly according to vaccination target group. Cochrane reviews had higher methodological quality than non-Cochrane reviews (p=0.001). Detailed analysis showed that this was due to better study selection and data extraction, inclusion of unpublished studies, and better reporting of study characteristics (all p<0.05). In the adjusted analysis, no other factor, including industry sponsorship or journal impact factor, had an influence on the AMSTAR score. Systematic reviews on influenza vaccination showed large differences regarding their methodological quality. Reviews conducted by the Cochrane collaboration were of higher quality than others. When using systematic reviews to guide the development of vaccination recommendations, the methodological quality of a review in addition to its content should be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. The Methodological Imperatives of Feminist Ethnography

    Directory of Open Access Journals (Sweden)

    Richelle D. Schrock

    2013-12-01

    Full Text Available Feminist ethnography does not have a single, coherent definition and is caught between struggles over the definition and goals of feminism and the multiple practices known collectively as ethnography. Towards the end of the 1980s, debates emerged that problematized feminist ethnography as a productive methodology and these debates still haunt feminist ethnographers today. In this article, I provide a concise historiography of feminist ethnography that summarizes both its promises and its vulnerabilities. I address the three major challenges I argue feminist ethnographers currently face, which include responding productively to feminist critiques of representing "others," accounting for feminisms' commitment to social change while grappling with poststructuralist critiques of knowledge production, and confronting the historical and ongoing lack of recognition for significant contributions by feminist ethnographers. Despite these challenges, I argue that feminist ethnography is a productive methodology and I conclude by delineating its methodological imperatives. These imperatives include producing knowledge about women's lives in specific cultural contexts, recognizing the potential detriments and benefits of representation, exploring women's experiences of oppression along with the agency they exercise in their own lives, and feeling an ethical responsibility towards the communities in which the researchers work. I argue that this set of imperatives enables feminist ethnographers to successfully navigate the challenges they face.

  14. Determination of benefit of early identification of severe forms of ...

    African Journals Online (AJOL)

    Background/Aims: A pilot study to determine benefits of early identification of severe forms of malaria in peripheral centres was carried out in 3 rural communities of South Eastern Nigeria. Methodology: The study area is located in the rain forest belt of South Eastern Nigeria with high temperature and humidity. It is a typical ...

  15. Cutset Quantification Error Evaluation for Shin-Kori 1 and 2 PSA model

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2009-01-01

    Probabilistic safety assessments (PSA) for nuclear power plants (NPPs) are based on the minimal cut set (MCS) quantification method. In PSAs, the risk and importance measures are computed from a cutset equation mainly by using approximations. The conservatism of these approximations is also a source of quantification uncertainty. In this paper, exact MCS quantification methods based on the 'sum of disjoint products (SDP)' logic and the inclusion-exclusion formula are applied, and the conservatism of the MCS quantification results for the Shin-Kori 1 and 2 PSA is evaluated
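
    As a hedged illustration of why approximate and exact MCS quantification can differ, the sketch below compares the common rare-event approximation with the exact inclusion-exclusion result for a toy set of minimal cut sets over independent basic events; the events, probabilities and cut sets are invented, not taken from the Shin-Kori model.

```python
# Toy comparison (independent basic events, hypothetical data): rare-event
# approximation (sum of cutset probabilities, conservative) versus the exact
# top-event probability from the inclusion-exclusion formula.
from itertools import combinations
from functools import reduce

basic_events = {"A": 1e-2, "B": 2e-2, "C": 5e-3}   # hypothetical failure probabilities
cutsets = [{"A", "B"}, {"B", "C"}, {"A", "C"}]      # hypothetical minimal cut sets

def cutset_prob(cs):
    # Probability that every basic event in the cut set occurs (independence).
    return reduce(lambda p, e: p * basic_events[e], cs, 1.0)

# Rare-event approximation: simple sum of cutset probabilities.
rare_event = sum(cutset_prob(cs) for cs in cutsets)

# Exact result: inclusion-exclusion over all combinations of cut sets.
exact = 0.0
for k in range(1, len(cutsets) + 1):
    for combo in combinations(cutsets, k):
        union = set().union(*combo)
        exact += (-1) ** (k + 1) * cutset_prob(union)

print(f"rare-event approx = {rare_event:.3e}, exact = {exact:.3e}")
```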

  16. The business case: The missing link between information technology benefits and organisational strategies

    Directory of Open Access Journals (Sweden)

    Carl Marnewick

    2014-07-01

    Full Text Available Purpose: Business cases are an integral part of information technology (IT) projects, providing the linkage between the organisational strategies and the promised benefits. Most major project management standards and methodologies make reference to the business case and its intended usage. Problem investigated: The success of IT projects is measured based on the benefits they deliver; anecdotal evidence states that IT projects fail at an alarming rate. The benefits are promised in the business case and should be delivered. This study focuses on whether there is a gap between theory and practice with regard to the way that organisations use the business case to approve, manage and track the promised benefits throughout an IT project. Methodology: This article reports on exploratory research that was initiated to establish the current practice of business case application. Four research questions were developed based on an extensive literature review to support or debunk the anecdotal evidence. Semi-structured interviews were used to gather evidence from organisations based on these research questions. Findings: The results suggest that organisations make use of business cases for various reasons and mostly in line with theory. There are, however, aspects that need to be addressed, such as the linkage between the business case and the harvesting of promised benefits. Value of research: This article confirms the theoretical aspects of the business case but highlights some deviations in practice. Organisations need to be more vigilant in the management of the business case to ensure the tracking and realisation of promised benefits.

  17. The Oil Security Metrics Model: A Tool for Evaluating the Prospective Oil Security Benefits of DOE's Energy Efficiency and Renewable Energy R&D Programs

    Energy Technology Data Exchange (ETDEWEB)

    Greene, David L [ORNL; Leiby, Paul Newsome [ORNL

    2006-05-01

    Energy technology R&D is a cornerstone of U.S. energy policy. Understanding the potential for energy technology R&D to solve the nation's energy problems is critical to formulating a successful R&D program. In light of this, the U.S. Congress requested the National Research Council (NRC) to undertake both retrospective and prospective assessments of the Department of Energy's (DOE's) Energy Efficiency and Fossil Energy Research programs (NRC, 2001; NRC, 2005). ("The Congress continued to express its interest in R&D benefits assessment by providing funds for the NRC to build on the retrospective methodology to develop a methodology for assessing prospective benefits." NRC, 2005, p. ES-2) In 2004, the NRC Committee on Prospective Benefits of DOE's Energy Efficiency and Fossil Energy R&D Programs published a report recommending a new framework and principles for prospective benefits assessment. The Committee explicitly deferred the issue of estimating security benefits to future work. Recognizing the need for a rigorous framework for assessing the energy security benefits of its R&D programs, the DOE's Office of Energy Efficiency and Renewable Energy (EERE) developed a framework and approach for defining energy security metrics for R&D programs to use in gauging the energy security benefits of their programs (Lee, 2005). This report describes methods for estimating the prospective oil security benefits of EERE's R&D programs that are consistent with the methodologies of the NRC (2005) Committee and that build on Lee's (2005) framework. Its objective is to define and implement a method that makes use of the NRC's typology of prospective benefits and methodological framework, satisfies the NRC's criteria for prospective benefits evaluation, and permits measurement of that portion of the prospective energy security benefits of EERE's R&D portfolio related to oil. While the Oil Security Metrics (OSM) methodology described

  18. CIAU methodology and BEPU applications

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2009-01-01

    Best-estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and the deficiencies of those codes. Uncertainties may have different origins, ranging from the approximation of the models, to the approximation of the numerical solution, and to the lack of precision of the values adopted for boundary and initial conditions. The amount of uncertainty that affects a calculation may strongly depend upon the codes and the modeling techniques (i.e. the code's users). A consistent and robust uncertainty methodology must be developed taking into consideration all the above aspects. The CIAU (Code with the capability of Internal Assessment of Uncertainty) and the UMAE (Uncertainty Methodology based on Accuracy Evaluation) methods have been developed by the University of Pisa (UNIPI) in the framework of long-lasting research activities started in the 1980s and involving several researchers. CIAU is extensively discussed in the available technical literature, Refs. [1, 2, 3, 4, 5, 6, 7], and tens of additional relevant papers, which provide comprehensive details about the method, can be found in the bibliography lists of the above references. Therefore, the present paper supplies only 'spot-information' about CIAU and focuses mostly on applications to some cases of industrial interest. In particular the application of CIAU to the OECD BEMUSE (Best Estimate Methods Uncertainty and Sensitivity Evaluation, [8, 9]) project is discussed and a critical comparison with other uncertainty methods (in relation to items like: sources of uncertainties, selection of the input parameters and quantification of

  19. An overall methodology for reliability prediction of mechatronic systems design with industrial application

    International Nuclear Information System (INIS)

    Habchi, Georges; Barthod, Christine

    2016-01-01

    We propose in this paper an overall ten-step methodology dedicated to the analysis and quantification of reliability during the design phase of a mechatronic system, considered as a complex system. The ten steps of the methodology are detailed according to the downward side of the V-development cycle usually used for the design of complex systems. Two complementary phases of analysis cover the ten steps: qualitative analysis and quantitative analysis. The qualitative phase analyzes the functional and dysfunctional behavior of the system and then determines its different failure modes and degradation states, based on external and internal functional analysis, organic and physical implementation, and dependencies between components, with consideration of customer specifications and the mission profile. The quantitative phase is used to calculate the reliability of the system and its components, based on the qualitative behavior patterns, and considering data gathering and processing and reliability targets. A systemic approach is used to calculate the reliability of the system taking into account: the different technologies of a mechatronic system (mechanics, electronics, electrical, etc.), dependencies and interactions between components, and external influencing factors. To validate the methodology, the ten steps are applied to an industrial system, the smart actuator of Pack'Aero Company. - Highlights: • A ten-step methodology for reliability prediction of mechatronic systems design. • Qualitative and quantitative analysis for reliability evaluation using PN and RBD. • A dependency matrix proposal, based on the collateral and functional interactions. • Models consider mission profile, deterioration, interactions and influencing factors. • Application and validation of the methodology on the “Smart Actuator” of PACK’AERO.
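
    Since the quantitative phase relies in part on reliability block diagrams (RBD), a minimal illustration of a series/parallel RBD evaluation under the usual independence assumption is sketched below; the block structure and component reliabilities are hypothetical, not values from the Pack'Aero actuator study.

```python
# Hypothetical reliability block diagram (RBD) evaluation assuming
# independent components; numbers are illustrative only.
def series(*reliabilities):
    """Reliability of components in series: all must work."""
    out = 1.0
    for r in reliabilities:
        out *= r
    return out

def parallel(*reliabilities):
    """Reliability of redundant components in parallel: at least one works."""
    fail = 1.0
    for r in reliabilities:
        fail *= (1.0 - r)
    return 1.0 - fail

# Example chain: sensor and controller in series, redundant drives in parallel.
r_system = series(0.995, 0.990, parallel(0.97, 0.97))
print(f"System reliability: {r_system:.4f}")
```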

  20. Measuring Identification and Quantification Errors in Spectral CT Material Decomposition

    Directory of Open Access Journals (Sweden)

    Aamir Younis Raja

    2018-03-01

    Full Text Available Material decomposition methods are used to identify and quantify multiple tissue components in spectral CT, but there is no published method to quantify the misidentification of materials. This paper describes a new method for assessing misidentification and mis-quantification in spectral CT. We scanned a phantom containing gadolinium (1, 2, 4, 8 mg/mL), hydroxyapatite (54.3, 211.7, 808.5 mg/mL), water and vegetable oil using a MARS spectral scanner equipped with a poly-energetic X-ray source operated at 118 kVp and a CdTe Medipix3RX camera. Two imaging protocols were used, with and without a 0.375 mm external brass filter. A proprietary material decomposition method identified voxels as gadolinium, hydroxyapatite, lipid or water. Sensitivity and specificity information was used to evaluate material misidentification. Biological samples were also scanned. There were marked differences in identification and quantification between the two protocols, even though spectral and linear correlation of gadolinium and hydroxyapatite in the reconstructed images was high and no qualitative segmentation differences in the material decomposed images were observed. At 8 mg/mL, gadolinium was correctly identified for both protocols, but its concentration was underestimated by over half for the unfiltered protocol. At 1 mg/mL, gadolinium was misidentified in 38% of voxels for the filtered protocol and 58% of voxels for the unfiltered protocol. Hydroxyapatite was correctly identified at the two higher concentrations for both protocols, but mis-quantified for the unfiltered protocol. Gadolinium concentration as measured in the biological specimen showed a two-fold difference between protocols. In future, this methodology could be used to compare and optimize scanning protocols, image reconstruction methods, and methods for material differentiation in spectral CT.
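
    A minimal sketch of the sensitivity/specificity bookkeeping such an assessment needs, assuming ground-truth and decomposed material labels are available per voxel; the label names and example arrays are invented for illustration, not taken from the MARS pipeline.

```python
# Per-material sensitivity and specificity from ground-truth versus
# decomposed voxel labels (toy data, hypothetical label names).
import numpy as np

def sensitivity_specificity(true_labels, decomposed_labels, material):
    true_labels = np.asarray(true_labels)
    decomposed_labels = np.asarray(decomposed_labels)
    is_true = true_labels == material
    is_called = decomposed_labels == material
    tp = np.sum(is_true & is_called)    # correctly identified voxels
    fn = np.sum(is_true & ~is_called)   # misidentified as another material
    tn = np.sum(~is_true & ~is_called)
    fp = np.sum(~is_true & is_called)
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Example with toy labels for a gadolinium region of interest.
truth = ["Gd", "Gd", "Gd", "HA", "water", "lipid"]
called = ["Gd", "HA", "Gd", "HA", "water", "water"]
print(sensitivity_specificity(truth, called, "Gd"))
```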

  1. The relationship between return on investment and quality of study methodology in workplace health promotion programs.

    Science.gov (United States)

    Baxter, Siyan; Sanderson, Kristy; Venn, Alison J; Blizzard, C Leigh; Palmer, Andrew J

    2014-01-01

    To determine the relationship between return on investment (ROI) and quality of study methodology in workplace health promotion programs. Data were obtained through a systematic literature search of the National Health Service Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE), Health Technology Assessment Database (HTA), Cost Effectiveness Analysis (CEA) Registry, EconLit, PubMed, Embase, Wiley, and Scopus. Included were articles written in English or German reporting cost(s) and benefit(s) of single or multicomponent health promotion programs in working adults. Return-to-work and workplace injury prevention studies were excluded. Methodological quality was graded using the British Medical Journal Economic Evaluation Working Party checklist. Economic outcomes were presented as ROI. ROI was calculated as ROI = (benefits - costs of program)/costs of program. Results were weighted by study size and combined using meta-analysis techniques. Sensitivity analysis was performed using two additional methodological quality checklists. The influences of quality score and important study characteristics on ROI were explored. Fifty-one studies (61 intervention arms) published between 1984 and 2012 included 261,901 participants and 122,242 controls from nine industry types across 12 countries. Methodological quality scores were highly correlated between checklists (r = .84-.93). Methodological quality improved over time. Overall weighted ROI [mean ± standard deviation (confidence interval)] was 1.38 ± 1.97 (1.38-1.39), which indicated a 138% return on investment. When accounting for methodological quality, an inverse relationship to ROI was found. High-quality studies (n = 18) had a smaller mean ROI, 0.26 ± 1.74 (.23-.30), compared to moderate (n = 16) 0.90 ± 1.25 (.90-.91) and low-quality (n = 27) 2.32 ± 2.14 (2.30-2.33) studies. Randomized control trials (RCTs) (n = 12) exhibited negative ROI, -0.22 ± 2.41 (-.27 to -.16). Financial returns become
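
    As a worked illustration of the ROI definition quoted above, ROI = (benefits - costs of program)/costs of program, together with a study-size-weighted pooling step, the sketch below uses invented costs, benefits and participant counts rather than values from the review.

```python
# Worked example of the ROI definition and a study-size-weighted mean.
# All monetary values and participant counts are made-up placeholders.
import numpy as np

def roi(benefits: float, program_costs: float) -> float:
    """ROI = (benefits - costs of program) / costs of program."""
    return (benefits - program_costs) / program_costs

# Hypothetical per-study benefits, costs and participant counts.
study_roi = np.array([roi(260_000, 100_000),   # ROI = 1.60
                      roi(90_000, 120_000),    # ROI = -0.25 (net loss)
                      roi(400_000, 150_000)])  # ROI ~ 1.67
study_n = np.array([1200, 450, 3000])

weighted_mean_roi = np.average(study_roi, weights=study_n)
print(f"Per-study ROI: {np.round(study_roi, 2)}, weighted mean: {weighted_mean_roi:.2f}")
```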

  2. A methodology to assess the economic impact of power storage technologies.

    Science.gov (United States)

    El-Ghandour, Laila; Johnson, Timothy C

    2017-08-13

    We present a methodology for assessing the economic impact of power storage technologies. The methodology is founded on classical approaches to the optimal stopping of stochastic processes but involves an innovation that circumvents the need to identify, ex ante, the form of a driving process and works directly on observed data, avoiding model risks. Power storage is regarded as a complement to the intermittent output of renewable energy generators and is therefore important in contributing to the reduction of carbon-intensive power generation. Our aim is to present a methodology suitable for use by policy makers that is simple to maintain, adaptable to different technologies and easy to interpret. The methodology has benefits over current techniques and is able to value, by identifying a viable optimal operational strategy, a conceived storage facility based on compressed air technology operating in the UK. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  3. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T

    1999-01-01

    Mast cells participate both in the acute allergic reaction and in chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human synovium using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of, in this case, human synovium. In 10 patients suffering from osteoarthritis a median of 3.6 mast cells/mm2 synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm2 (median). The mast cells constituted 0.8% of all the cell profiles

  4. New LightCycler PCR for Rapid and Sensitive Quantification of Parvovirus B19 DNA Guides Therapeutic Decision-Making in Relapsing Infections

    Science.gov (United States)

    Harder, Timm C.; Hufnagel, Markus; Zahn, Katrin; Beutel, Karin; Schmitt, Heinz-Josef; Ullmann, Uwe; Rautenberg, Peter

    2001-01-01

    Detection of parvovirus B19 DNA offers diagnostic advantages over serology, particularly in persistent infections of immunocompromised patients. A rapid, novel method of B19 DNA detection and quantification is introduced. This method, a quantitative PCR assay, is based on real-time glass capillary thermocycling (LightCycler [LC]) and fluorescence resonance energy transfer (FRET). The PCR assay allowed quantification over a dynamic range of over 7 logs and could quantify as little as 250 B19 genome equivalents (geq) per ml as calculated for plasmid DNA (i.e., theoretically ≥5 geq per assay). Interrater agreement analysis demonstrated equivalence of LC-FRET PCR and conventional nested PCR in the diagnosis of an active B19 infection (kappa coefficient = 0.83). The benefit of the new method was demonstrated in an immunocompromised child with a relapsing infection, who required an attenuation of the immunosuppressive therapy in addition to repeated doses of immunoglobulin to eliminate the virus. PMID:11724854

  5. Strategy for the maximization of clinically relevant information from hepatitis C virus, RT-PCR quantification.

    LENUS (Irish Health Repository)

    Levis, J

    2012-02-03

    BACKGROUND: The increasing clinical application of viral load assays for monitoring viral infections has been an incentive for the development of standardized tests for the hepatitis C virus. OBJECTIVE: To develop a simple model for the prediction of baseline viral load in individuals infected with the hepatitis C virus. METHODOLOGY: Viral load quantification of each patient's first sample was assessed by RT-PCR-ELISA using the Roche MONITOR assay in triplicate. Genotype of the infecting virus was identified by reverse line probe hybridization, using amplicons resulting from the qualitative HCV Roche AMPLICOR assay. RESULTS: Retrospective evaluation of first quantitative values suggested that 82.4% (n=168/204) of individuals had a viral load between 4.3 and 6.7 log(10) viral copies per ml. A few patients (3.4%; n=7/204) had a serum viremia less than the lower limit of the linear range of the RT-PCR assay. Subsequent prospective evaluation of the hepatitis C viral load of all new patients, using a model based on the dynamic range of viral load in the retrospective group, correctly predicted the dynamic range in 75.9% (n=33/54). CONCLUSION: The dynamic range of hepatitis C viremia extends beyond the linear range of the Roche MONITOR assay. Accurate determination of serum viremia is substantially improved by dilution of specimens prior to quantification.

  6. A methodology to analyze the safety of a coastal nuclear power plant against typhoon external flooding risks

    International Nuclear Information System (INIS)

    Chen Tian; He Mi; Chen Guofei; Joly, Antoine; Pan Rong; Ji Ping

    2015-01-01

    For the protection of coastal Nuclear Power Plants (NPPs) against the external flooding hazard, the risks caused by natural events have to be taken into account. In this article, a methodology is proposed to analyze the risk posed by the typical natural event in China, the typhoon. It includes the simulation of the storm surge and the strong waves generated by its passage through Chinese coastal zones, and the quantification of the resulting overtopping flow rate. The simulation is carried out by coupling two modules of the hydraulic modeling system TELEMAC-MASCARET from EDF: TELEMAC2D (shallow water module) and TOMAWAC (spectral wave module). As an open-source modeling system, this methodology could still be enriched with other phenomena in the near future to improve its performance in the safety analysis of coastal NPPs in China. (author)

  7. A methodology for spacecraft technology insertion analysis balancing benefit, cost, and risk

    Science.gov (United States)

    Bearden, David Allen

    Emerging technologies are changing the way space missions are developed and implemented. Technology development programs are proceeding with the goal of enhancing spacecraft performance and reducing mass and cost. However, it is often the case that technology insertion assessment activities, in the interest of maximizing performance and/or mass reduction, do not consider synergistic system-level effects. Furthermore, even though technical risks are often identified as a large cost and schedule driver, many design processes ignore the effects of cost and schedule uncertainty. This research is based on the hypothesis that technology selection is a problem of balancing interrelated (and potentially competing) objectives. Current spacecraft technology selection approaches are summarized, and a Methodology for Evaluating and Ranking Insertion of Technology (MERIT) that expands on these practices to attack otherwise unsolved problems is demonstrated. MERIT combines the modern techniques of technology maturity measures, parametric models, genetic algorithms, and risk assessment (cost and schedule) in a unique manner to resolve very difficult issues including: user-generated uncertainty, relationships between cost/schedule and complexity, and technology "portfolio" management. While the methodology is sufficiently generic that it may in theory be applied to a number of technology insertion problems, this research focuses on application to the specific case of small (<500 kg) satellite design. Small satellite missions are of particular interest because they are often developed under rigid programmatic (cost and schedule) constraints and are motivated to introduce advanced technologies into the design. MERIT is demonstrated for programs procured under varying conditions and constraints such as stringent performance goals, not-to-exceed costs, or hard schedule requirements. MERIT's contributions to the engineering community are its: unique coupling of the aspects of performance

  8. Perspectives on benefit-risk decision-making in vaccinology: Conference report.

    Science.gov (United States)

    Greenberg, M; Simondon, F; Saadatian-Elahi, M

    2016-01-01

    Benefit/risk (B/R) assessment methods are increasingly being used by regulators and companies as an important decision-making tool, and their outputs as the basis of communication. B/R appraisal of vaccines, as compared with drugs, is different due to their attributes and their use. For example, vaccines are typically given to healthy people, and, for some vaccines, benefits exist both at the population and the individual level. For vaccines in particular, factors such as the benefit afforded through herd effects, which varies as a function of vaccine coverage and consequently impacts the B/R ratio, should also be taken into consideration and parameterized in B/R assessment models. Currently, there is no single agreed methodology for vaccine B/R assessment that can fully capture all these aspects. The conference "Perspectives on Benefit-Risk Decision-making in Vaccinology," held in Annecy (France), addressed these issues and provided recommendations on how to advance the science and practice of B/R assessment of vaccines and vaccination programs.

  9. PCR amplification of repetitive sequences as a possible approach in relative species quantification

    DEFF Research Database (Denmark)

    Ballin, Nicolai Zederkopff; Vogensen, Finn Kvist; Karlsson, Anders H

    2012-01-01

    Both relative and absolute quantifications are possible in species quantification when single copy genomic DNA is used. However, amplification of single copy genomic DNA does not allow a limit of detection as low as that obtained from amplification of repetitive sequences. Amplification of repetitive sequences is therefore frequently used in absolute quantification, but problems occur in relative quantification as the number of repetitive sequences is unknown. A promising approach was developed where data from amplification of repetitive sequences were used in relative quantification of species ... to relatively quantify the amount of chicken DNA in a binary mixture of chicken DNA and pig DNA. However, the designed PCR primers lack the specificity required for regulatory species control.

  10. Methodology for ranking restoration options

    International Nuclear Information System (INIS)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128) co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to the optimisation of the protection of the populations being exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included and weighting factors for the different attributes have been determined by the use of scaling constants. (au)
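
    Since the selection step combines cost-benefit and multi-attribute utility analysis, the sketch below illustrates a generic weighted-utility ranking of restoration options; the options, attribute scores and scaling constants (weights) are purely illustrative and are not taken from the RESTRAT studies.

```python
# Generic multi-attribute utility ranking of restoration options.
# Attribute scores are normalised to 0-1 (higher is better); all values are hypothetical.
options = {
    "topsoil removal": {"dose_reduction": 0.9, "cost": 0.2, "social": 0.5},
    "deep ploughing":  {"dose_reduction": 0.6, "cost": 0.7, "social": 0.6},
    "no action":       {"dose_reduction": 0.0, "cost": 1.0, "social": 0.3},
}
# Scaling constants express the relative importance of each attribute.
weights = {"dose_reduction": 0.5, "cost": 0.3, "social": 0.2}

def utility(scores):
    return sum(weights[attr] * scores[attr] for attr in weights)

ranking = sorted(options, key=lambda name: utility(options[name]), reverse=True)
for name in ranking:
    print(f"{name}: utility = {utility(options[name]):.2f}")
```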

  11. NF ISO 14064-1 Greenhouse gases. Part 1: specifications and guidance at the organization level for quantification and reporting of greenhouse gas emissions and removals

    International Nuclear Information System (INIS)

    2005-01-01

    This document describes a methodology for the quantification and monitoring of greenhouse gases, as well as for drafting the inventory report, at the organization level. It thus suggests a method for organizations' greenhouse gas inventory declarations and provides support for the monitoring and management of their emissions. It covers terms and definitions, principles, greenhouse gas inventory design, development and components, greenhouse gas inventory quality management, greenhouse gas reporting, and the organization's role in verification activities. (A.L.B.)

  12. Needs-based sewerage prioritization: alternative to conventional cost-benefit analysis.

    Science.gov (United States)

    Rashid, Md M; Hayes, Donald F

    2011-10-01

    This paper presents an empirical approach to select and prioritize sewerage projects within set budgetary limitations. The methodology includes a model which quantifies the benefits of a sewerage project as an index or dimensionless number. The index considers the need and urgency of sewerage and other project goals. Benefit is defined as the difference in anticipated impact between the current condition (without the project) and the expected condition with the project. Anticipated benefits primarily include reduction in environmental pollution, reduction of human diseases and morbidity, and other tangible and intangible improvements. This approach is a powerful decision tool for sewerage prioritization and an effective alternative to conventional cost-benefit analysis. Unlike conventional analysis, this approach makes no attempt to convert project benefits and other impacts into a monetary measure. This work recognizes that the decision to provide sewerage based solely on net benefits is not practical. Instead, benefit-cost ratios (B/C) are calculated utilizing a cost-effectiveness approach. Using these ratios, 16 unserviced areas of Ensenada, Mexico are ranked. The prioritization rankings produced by this method must be further scrutinized and carefully reviewed for logic, accuracy of input data, and practicality of implementation. A similar framework may also be useful for prioritizing other public works projects. Copyright © 2011 Elsevier Ltd. All rights reserved.
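
    A hedged sketch of the ranking idea described above: the benefit index is taken as the difference between the anticipated impact score without and with the project, and areas are ranked by that index divided by project cost (a cost-effectiveness style B/C ratio). The area names, impact scores and costs are invented for illustration.

```python
# Hypothetical prioritization of unserviced areas by benefit index per unit cost.
areas = [
    # (name, impact score without project, impact score with project, cost in M$)
    ("Area 1", 78, 22, 3.1),
    ("Area 2", 64, 30, 1.8),
    ("Area 3", 55, 35, 2.4),
]

def benefit_index(without_score, with_score):
    # Dimensionless improvement: anticipated impact reduction due to the project.
    return without_score - with_score

ranked = sorted(areas, key=lambda a: benefit_index(a[1], a[2]) / a[3], reverse=True)
for name, without, with_, cost in ranked:
    print(f"{name}: B/C = {benefit_index(without, with_) / cost:.1f} per M$")
```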

  13. A Comparison of Various Software Development Methodologies: Feasibility and Methods of Integration

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2016-12-01

    Full Text Available System development methodologies that have been used in academic and commercial environments during the last two decades have advantages and disadvantages. Researchers have tried to identify the objectives, scope, etc. of the methodologies by following different approaches. Each approach has its limitations, specific interests, coverage, etc. In this paper, we performed a comparative study of those methodologies which are popular and commonly used in the banking and commercial environment. In our study we tried to determine the objectives, scope, tools and other features of the methodologies. We also tried to determine how and to what extent the methodologies incorporate facilities such as project management, cost-benefit analysis, documentation, etc. One of the most important aspects of our study was how to integrate the methodologies and develop a global methodology which covers the complete span of the software development life cycle. A prototype system which integrates the selected methodologies has been developed. The developed system helps analysts and designers choose suitable tools or obtain guidelines on what to do in a particular situation. The prototype system was tested during the development of software for an ATM ("Auto Teller Machine") by selecting and applying the SASD methodology during software development. This resulted in the development of a high-quality and well documented software system.

  14. WE-H-207A-06: Hypoxia Quantification in Static PET Images: The Signal in the Noise

    International Nuclear Information System (INIS)

    Keller, H; Yeung, I; Milosevic, M; Jaffray, D; Kueng, R; Shek, T; Driscoll, B

    2016-01-01

    Purpose: Quantification of hypoxia from PET images is of considerable clinical interest. In the absence of dynamic PET imaging, the hypoxic fraction (HF) of a tumor has to be estimated from voxel values of activity concentration of a radioactive hypoxia tracer. This work is part of an effort to standardize quantification of the tumor hypoxic fraction from PET images. Methods: A simple hypoxia imaging model in the tumor was developed. The distribution of the tracer activity was described as the sum of two different probability distributions, one for the normoxic (and necrotic) voxels, the other for the hypoxic voxels. The widths of the distributions arise due to variability of the transport, tumor tissue inhomogeneity, tracer binding kinetics, and PET image noise. Quantification of HF was performed for various levels of variability using two different methodologies: a) classification thresholds between normoxic and hypoxic voxels based on a non-hypoxic surrogate (muscle), and b) estimation of the (posterior) probability distributions based on maximum-likelihood optimization, which does not require a surrogate. Data from the hypoxia imaging model and from 27 cervical cancer patients enrolled in a FAZA PET study were analyzed. Results: In the model, where the true value of HF is known, thresholds usually underestimate the value for large variability. For the patients, a significant uncertainty of the HF values (an average intra-patient range of 17%) was caused by spatial non-uniformity of image noise, which is a hallmark of all PET images. Maximum likelihood estimation (MLE) is able to directly optimize for the weights of both distributions, but may suffer from poor optimization convergence. For some patients, MLE-based HF values showed significant differences to threshold-based HF values. Conclusion: HF values depend critically on the magnitude of the different sources of tracer uptake variability. A measure of confidence should also be reported.

  15. WE-H-207A-06: Hypoxia Quantification in Static PET Images: The Signal in the Noise

    Energy Technology Data Exchange (ETDEWEB)

    Keller, H; Yeung, I; Milosevic, M; Jaffray, D [University of Toronto, Toronto (Canada); Princess Margaret Cancer Centre, Toronto (Canada); Kueng, R [Princess Margaret Cancer Centre, Toronto (Canada); Inselspital Bern, Bern, Switzerland. (Switzerland); Shek, T; Driscoll, B [Princess Margaret Cancer Centre, Toronto (Canada)

    2016-06-15

    Purpose: Quantification of hypoxia from PET images is of considerable clinical interest. In the absence of dynamic PET imaging, the hypoxic fraction (HF) of a tumor has to be estimated from voxel values of activity concentration of a radioactive hypoxia tracer. This work is part of an effort to standardize quantification of the tumor hypoxic fraction from PET images. Methods: A simple hypoxia imaging model in the tumor was developed. The distribution of the tracer activity was described as the sum of two different probability distributions, one for the normoxic (and necrotic) voxels, the other for the hypoxic voxels. The widths of the distributions arise due to variability of the transport, tumor tissue inhomogeneity, tracer binding kinetics, and PET image noise. Quantification of HF was performed for various levels of variability using two different methodologies: a) classification thresholds between normoxic and hypoxic voxels based on a non-hypoxic surrogate (muscle), and b) estimation of the (posterior) probability distributions based on maximum-likelihood optimization, which does not require a surrogate. Data from the hypoxia imaging model and from 27 cervical cancer patients enrolled in a FAZA PET study were analyzed. Results: In the model, where the true value of HF is known, thresholds usually underestimate the value for large variability. For the patients, a significant uncertainty of the HF values (an average intra-patient range of 17%) was caused by spatial non-uniformity of image noise, which is a hallmark of all PET images. Maximum likelihood estimation (MLE) is able to directly optimize for the weights of both distributions, but may suffer from poor optimization convergence. For some patients, MLE-based HF values showed significant differences to threshold-based HF values. Conclusion: HF values depend critically on the magnitude of the different sources of tracer uptake variability. A measure of confidence should also be reported.
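
    To make methodology (b) concrete, the sketch below estimates a hypoxic fraction as the weight of the high-uptake component of a two-component Gaussian mixture fitted by maximum likelihood (EM, via scikit-learn), and compares it with a fixed surrogate-style threshold; the simulated voxel values, component parameters and the 1.4 cut-off are all assumptions for illustration, not values from the study.

```python
# Two-component mixture estimate of the hypoxic fraction (HF) versus a
# simple threshold, on simulated voxel uptake values (all parameters hypothetical).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated tumour voxels: 70% normoxic, 30% hypoxic with higher tracer uptake.
normoxic = rng.normal(1.0, 0.15, 700)
hypoxic = rng.normal(1.6, 0.25, 300)
voxels = np.concatenate([normoxic, hypoxic]).reshape(-1, 1)

# Maximum-likelihood fit of a two-component Gaussian mixture (EM algorithm).
gmm = GaussianMixture(n_components=2, random_state=0).fit(voxels)
hypoxic_component = int(np.argmax(gmm.means_.ravel()))
hf_mle = gmm.weights_[hypoxic_component]

# Threshold-based alternative (surrogate-derived cut-off, assumed here to be 1.4).
hf_threshold = np.mean(voxels > 1.4)
print(f"HF (MLE): {hf_mle:.2f}, HF (threshold): {hf_threshold:.2f}")
```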

  16. Quantification of cellular uptake of DNA nanostructures by qPCR

    DEFF Research Database (Denmark)

    Okholm, Anders Hauge; Nielsen, Jesper Sejrup; Vinther, Mathias

    2014-01-01

    interactions and structural and functional features of the DNA delivery device must be thoroughly investigated. Here, we present a rapid and robust method for the precise quantification of the component materials of DNA origami structures capable of entering cells in vitro. The quantification is performed...

  17. Identification and quantification of genipin and geniposide from Genipa americana L. by HPLC-DAD using a fused-core column

    Directory of Open Access Journals (Sweden)

    Grazielle NÁTHIA-NEVES

    2018-01-01

    Full Text Available In this work, a fast, simple and selective method was developed for the quantification of genipin and geniposide from unripe fruits of genipap, which are known as natural colorants (blue and yellow, respectively). The separation of the compounds was performed on a fused-core C18 column using as mobile phase water (A) and acetonitrile (B), both acidified with 0.1% formic acid, with the following gradient: 0 min, 99% A; 9 min, 75% A; 10 min, 99% A; and 13 min, 99% A. The temperature and flow rate that allowed the best chromatographic performance were 35 °C and 1.5 mL/min, respectively, resulting in a total run time of 13 min, including column clean-up and re-equilibration. This short analysis time represents an advantage compared to the methods reported in the literature, where the running times are 2-5 times greater. The detection wavelength was set at 240 nm. The method validation was performed based on specificity, linearity, detection and quantification limits, precision and accuracy, according to the ICH methodology. Finally, the developed method was suitable for monitoring the content of these compounds in vegetable samples.

  18. Metabolite profiling and quantification of phytochemicals in potato extracts using ultra-high-performance liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Chong, Esther Swee Lan; McGhie, Tony K; Heyes, Julian A; Stowell, Kathryn M

    2013-12-01

    Potatoes contain a diverse range of phytochemicals which have been suggested to have health benefits. Metabolite profiling and quantification were conducted on plant extracts made from a white potato cultivar and 'Urenika', a purple potato cultivar traditionally consumed by New Zealand Maori. There is limited published information regarding the metabolite profile of Solanum tuberosum cultivar 'Urenika'. Using ultra-high-performance liquid chromatography-mass spectrometry (UHPLC-MS), a total of 31 compounds were identified and quantified in the potato extracts. The majority of the compounds were identified for the first time in 'Urenika'. These compounds include several types of anthocyanins, hydroxycinnamic acid (HCA) derivatives, and hydroxycinnamic amides (HCAA). Six classes of compounds, namely organic acids, amino acids, HCA, HCAA, flavonols and glycoalkaloids, were present in both extracts, but quantities varied between the two extracts. The unknown plant metabolites in both potato extracts were assigned molecular formulae and identified with high confidence. Quantification of the metabolites was achieved using a number of appropriate standards. High-resolution mass spectrometry data critical for accurate identification of unknown phytochemicals were achieved and could be added to potato or plant metabolomic databases. © 2013 Society of Chemical Industry.

  19. Development and application of the methodology to establish life extension and modernization plan of aged hydropower plants

    International Nuclear Information System (INIS)

    Kim, Jong Sung; Kwon, Hyuck Cheon; Song, Byung Hun; Kwon, Chang Seop

    2009-01-01

    This paper describes how to establish an integrated plan for LE (Life Extension) and MD (MoDernization) of aged hydropower plants. The methodology was developed through a review of overseas and domestic LE/MD histories, investigation of previous overseas methodologies, and consideration of domestic practices. The methodology includes reviews of various factors such as condition, operation and maintenance history, up-to-date technology, and economic benefit. In order to establish the life extension/modernization plan, the methodology is applied to domestic aged hydropower plants. Finally, priority rankings and draft practice plans for LE/MD are derived.

  20. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

    The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks, to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics and probability theory, psychology and elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which have been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain, primarily to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are and always will be a forecast for the future.

  1. ICRP-26; cost-benefit analysis and nuclear energy in developing countries

    International Nuclear Information System (INIS)

    Gupta, V.K.

    1978-01-01

    The cost of an operation and the benefits accruing to society are the basic parameters involved in cost-benefit analysis using optimisation methodology. The relative importance of the costs imposed on human health by radiation exposure and of other economic and social factors must be considered. The formula for obtaining, in monetary terms, the parameter corresponding to the detriment represented by collective dose (Rs/man-rem or $/man-rem) is explained. The collective doses in the public domain and for occupational workers are mentioned. Estimated monetary values assigned to the detriment in different countries are discussed. In the absence of accurately known parameters, in particular the economic parameter which is always subject to change, the cost-benefit and optimisation exercises would give variable results. (B.G.W.)
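
    By way of illustration only, the sketch below shows a generic ICRP-style cost-benefit comparison in which averted collective dose is valued at an assumed monetary rate per man-rem; the alpha value, protection options and costs are invented and are not taken from the record.

        # Hedged sketch of a generic cost-benefit comparison:
        # net benefit of a protection option = monetary value of averted collective dose
        #                                      - cost of implementing the option.
        # All numbers below are illustrative assumptions, not values from the record.
        ALPHA = 10_000.0  # assumed monetary value of detriment, $ per man-rem averted

        options = {
            "extra shielding":   {"cost": 150_000.0, "dose_averted_man_rem": 20.0},
            "remote handling":   {"cost":  60_000.0, "dose_averted_man_rem": 12.0},
            "procedural change": {"cost":   5_000.0, "dose_averted_man_rem":  1.5},
        }

        for name, o in options.items():
            net = ALPHA * o["dose_averted_man_rem"] - o["cost"]
            print(f"{name:18s} net benefit = ${net:>10,.0f}")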

  2. Effect of flow forecasting quality on benefits of reservoir operation - a case study for the Geheyan reservoir (China)

    NARCIS (Netherlands)

    Dong, Xiaohua; Dohmen-Janssen, Catarine M.; Booij, Martijn J.; Hulscher, Suzanne J.M.H.

    2006-01-01

    This paper presents a methodology to determine the effect of flow forecasting quality on the benefits of reservoir operation. The benefits are calculated in terms of the electricity generated, and the quality of the flow forecasting is defined in terms of lead time and accuracy of the forecasts. In

  3. Service-Learning in Supply Chain Management: Benefits, Challenges and Best Practices

    Science.gov (United States)

    Schoenherr, Tobias

    2015-01-01

    Service-learning (SL) is a pedagogical approach in which students are assigned a course-related project in a not-for-profit organization, and are tasked to apply course content to execute the project. While the benefits are multifarious, only recently have supply chain management (SCM) courses adapted this innovative teaching methodology. The…

  4. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification...... PCR strategy is a highly specific and sensitive way to monitor recombinant DNA in environments like the efflux of a biotechnological plant....

  5. A highly sensitive method for quantification of iohexol

    DEFF Research Database (Denmark)

    Schulz, A.; Boeringer, F.; Swifka, J.

    2014-01-01

    -chromatography-electrospray-mass spectrometry (LC-ESI-MS) approach using the multiple reaction monitoring mode for iohexol quantification. In order to test whether a significantly decreased amount of iohexol is sufficient for reliable quantification, a LC-ESI-MS approach was assessed. We analyzed the kinetics of iohexol in rats after application...... of different amounts of iohexol (15 mg to 150 µg per rat). Blood sampling was conducted at four time points, at 15, 30, 60, and 90 min, after iohexol injection. The analyte (iohexol) and the internal standard (iothalamic acid) were separated from serum proteins using a centrifugal filtration device...... with a cut-off of 3 kDa. The chromatographic separation was achieved on an analytical Zorbax SB C18 column. The detection and quantification were performed on a high capacity trap mass spectrometer using positive ion ESI in the multiple reaction monitoring (MRM) mode. Furthermore, using real-time polymerase...

  6. Guide for prioritizing power plant productivity improvement projects: handbook of availability improvement methodology

    International Nuclear Information System (INIS)

    1981-01-01

    As part of its program to help improve electrical power plant productivity, the Department of Energy (DOE) has developed a methodology for evaluating productivity improvement projects. This handbook presents a simplified version of this methodology, called the Availability Improvement Methodology (AIM), which provides a systematic approach for prioritizing plant improvement projects. Also included in this handbook is a description of the data-taking requirements necessary to support the AIM methodology, benefit/cost analysis, and root cause analysis for tracing persistent power plant problems. In applying the AIM methodology, utility engineers should be mindful that replacement power costs are frequently greater for forced outages than for planned outages. Equivalent availability includes both. A cost-effective ranking of alternative plant improvement projects must distinguish between those projects which will reduce forced outages and those which might reduce planned outages. As is the case with any analytical procedure, engineering judgement must be exercised with respect to the results of purely mathematical calculations.
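
    As a hedged illustration of the kind of benefit/cost ranking the handbook describes, the sketch below values forced-outage reductions at a higher assumed replacement-power cost than planned-outage reductions; all projects and figures are invented.

        # Hedged sketch of a benefit/cost ranking in the spirit of the AIM approach:
        # forced-outage reductions are valued at a higher replacement-power cost than
        # planned-outage reductions. All figures are illustrative assumptions.
        FORCED_COST_PER_MWH = 60.0    # assumed replacement power cost during forced outages
        PLANNED_COST_PER_MWH = 35.0   # assumed replacement power cost during planned outages

        projects = [
            {"name": "feedwater pump upgrade", "cost": 2.0e6,
             "forced_mwh_recovered": 40_000.0, "planned_mwh_recovered": 5_000.0},
            {"name": "condenser retube",       "cost": 5.5e6,
             "forced_mwh_recovered": 60_000.0, "planned_mwh_recovered": 30_000.0},
        ]

        for p in projects:
            benefit = (p["forced_mwh_recovered"] * FORCED_COST_PER_MWH
                       + p["planned_mwh_recovered"] * PLANNED_COST_PER_MWH)
            p["bc_ratio"] = benefit / p["cost"]

        for p in sorted(projects, key=lambda x: x["bc_ratio"], reverse=True):
            print(f'{p["name"]:24s} benefit/cost = {p["bc_ratio"]:.2f}')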

  7. Proposal of methodology for calculating the degree of impact caused by perturbations recorded in a power transmission system; Proposicao de metodologia para calcular o grau de impacto causado pelas perturbacoes registradas em um sistema eletrico de transmissao

    Energy Technology Data Exchange (ETDEWEB)

    Vianna, E.A.L. [Centrais Eletricas do Norte (ELETRONORTE), Porto Velho, RO (Brazil)], E-mail: elainelimavianna@yahoo.com.br; Lambert-Torres, G.; Silva, L.E.B. da [Universidade Federal de Itajuba (UNIFEI), MG (Brazil)], Emails: germanoltorres@gmail.com, leborges@unifei.edu.br; Rissino, S.; Silva, M.F. da [Universidade Federal de Rondonia (UFRO), Porto Velho, RO (Brazil)], Emails: srissino@gmail.com, felipe@unir.br

    2009-07-01

    Disturbances recorded in an electric power system compromise the quality and continuity of the energy supply and are measured by means of performance indicators. This article defines the attributes that contribute to increasing the severity of disturbances recorded in an electric power transmission system and proposes a methodology for calculating the degree of impact caused by each of them. The proposed methodology allows the impact caused by a disturbance to be quantified and compared with that of other disturbances, within one system or across distinct systems.

  8. A market-based approach to share water and benefits in transboundary river basins

    Science.gov (United States)

    Arjoon, Diane; Tilmant, Amaury; Herrmann, Markus

    2016-04-01

    The equitable sharing of benefits in transboundary river basins is necessary to reach a consensus on basin-wide development and management activities. Benefit sharing arrangements must be collaboratively developed to be perceived as efficient, as well as equitable, in order to be considered acceptable to all riparian countries. The current literature falls short of providing practical, institutional arrangements that ensure maximum economic welfare as well as collaboratively developed methods for encouraging the equitable sharing of benefits. In this study we define an institutional arrangement that distributes welfare in a river basin by maximizing the economic benefits of water use and then sharing these benefits in an equitable manner using a method developed through stakeholder involvement. In this methodology (i) a hydro-economic model is used to efficiently allocate scarce water resources to water users in a transboundary basin, (ii) water users are obliged to pay for water, and (iii) the total of these water charges are equitably redistributed as monetary compensation to users. The amount of monetary compensation, for each water user, is determined through the application of a sharing method developed by stakeholder input, based on a stakeholder vision of fairness, using an axiomatic approach. The whole system is overseen by a river basin authority. The methodology is applied to the Eastern Nile River basin as a case study. The technique ensures economic efficiency and may lead to more equitable solutions in the sharing of benefits in transboundary river basins because the definition of the sharing rule is not in question, as would be the case if existing methods, such as game theory, were applied, with their inherent definitions of fairness.
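
    The sketch below illustrates only the final redistribution step described above, in which pooled water charges are paid back as compensation according to agreed sharing weights; the users, charges and weights are invented and the efficient allocation step is not modelled.

        # Hedged sketch of the redistribution step described in the abstract:
        # water charges collected from users are pooled and paid back as compensation
        # according to agreed sharing weights. Users, charges and weights are invented.
        charges = {"upstream irrigation": 4.0e6, "hydropower": 9.0e6, "downstream irrigation": 2.0e6}
        sharing_weights = {"upstream irrigation": 0.30, "hydropower": 0.25, "downstream irrigation": 0.45}

        pool = sum(charges.values())
        for user, w in sharing_weights.items():
            compensation = w * pool
            net_position = compensation - charges[user]
            print(f"{user:22s} pays {charges[user]:>12,.0f}, receives {compensation:>12,.0f}, net {net_position:>+12,.0f}")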

  9. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D- and DL-penicillamine were demonstrated by terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low frequency resonant modes, density functional theory (DFT) was adopted for theoretical calculations. It was found that the collective THz frequency motions were determined by the intramolecular and intermolecular hydrogen bond interactions. Moreover, the quantification of penicillamine enantiomer mixtures was demonstrated by a THz spectrum-fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in the pharmaceutical industry. (authors)
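
    As an illustration of spectrum-fitting quantification in general (not the authors' procedure), the sketch below fits a synthetic mixture spectrum as a linear combination of two pure-component spectra by least squares; all spectra are toy Gaussian bands.

        # Hedged sketch of spectrum-fitting quantification: a mixture spectrum is fit
        # as a linear combination of pure L- and D-penicillamine reference spectra.
        # The spectra here are synthetic Gaussian bands, purely for illustration.
        import numpy as np

        freq = np.linspace(0.5, 3.0, 300)                      # THz axis
        band = lambda c, w: np.exp(-((freq - c) / w) ** 2)     # toy absorption band

        spec_L = band(1.2, 0.08) + 0.6 * band(2.1, 0.10)       # "pure L" reference
        spec_D = band(1.5, 0.08) + 0.8 * band(2.4, 0.10)       # "pure D" reference

        true_fractions = np.array([0.7, 0.3])
        mixture = true_fractions[0] * spec_L + true_fractions[1] * spec_D
        mixture += np.random.default_rng(0).normal(0, 0.005, freq.size)  # measurement noise

        A = np.column_stack([spec_L, spec_D])
        coeffs, *_ = np.linalg.lstsq(A, mixture, rcond=None)
        estimated = coeffs / coeffs.sum()
        print("estimated L/D fractions:", np.round(estimated, 3))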

  10. THE CHALLENGE OF KEEPING-UP: CURRENT METHODOLOGIES IN ANALYZING THE STUDENTS RECRUITING AREA BY UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    MĂLĂESCU SIMONA

    2013-11-01

    Full Text Available The challenge of keeping up: current methodologies in analyzing the student recruiting area of universities. Despite all the progress made in the field and in some methodologically useful collateral areas (e.g. the use of GIS), for some countries emerging from the communist space, upgrading to the latest advances in the modelling and forecasting of student recruitment by universities remains a difficult challenge. The analysis and modelling of the geographical area from which a particular university recruits its students is a niche even in the foreign literature, not necessarily consciously neglected but approached only tangentially, most likely because the benefits of focusing on this aspect, and of directing university marketing efforts more efficiently, are overlooked. Through a meta-analysis of the existing literature, this paper seeks out the disparate developments that have led to, or will allow, improved modelling of the spatial recruitment areas of universities, together with the challenges and limitations involved in applying these methodological advances to universities in the ex-communist area. Beyond the theoretical benefit, from a practical perspective the meta-analysis aims at synthesizing elements of good practice that can be applied to the local university system.

  11. Comparative quantification of alcohol exposure as risk factor for global burden of disease.

    Science.gov (United States)

    Rehm, Jürgen; Klotsche, Jens; Patra, Jayadeep

    2007-01-01

    Alcohol has been identified as one of the most important risk factors in the burden experienced as a result of disease. The objective of the present contribution is to establish a framework to comparatively quantify alcohol exposure as it is relevant for burden of disease. Different key indicators are combined to derive this quantification. First, adult per capita consumption, composed of recorded and unrecorded consumption, yields the best overall estimate of alcohol exposure for a country or region. Second, survey information is used to allocate the per capita consumption into sex and age groups. Third, an index for detrimental patterns of drinking is used to determine the additional impact on injury and cardiovascular burden. The methodology is applied to estimate global alcohol exposure for the year 2002. Finally, assumptions and potential problems of the approach are discussed. Copyright (c) 2007 John Wiley & Sons, Ltd.
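
    The allocation step described above can be illustrated as follows; the per capita consumption, survey means and population shares are invented, and the scaling simply preserves the overall per capita figure.

        # Hedged sketch of the allocation step described in the abstract: total adult
        # per capita consumption (recorded + unrecorded) is distributed across sex/age
        # groups in proportion to survey-reported drinking. All numbers are invented.
        recorded, unrecorded = 8.0, 2.0          # litres pure alcohol / adult / year (assumed)
        total_pc = recorded + unrecorded

        survey_mean = {"men 15-29": 14.0, "men 30+": 10.0, "women 15-29": 5.0, "women 30+": 3.0}
        pop_share   = {"men 15-29": 0.17, "men 30+": 0.33, "women 15-29": 0.17, "women 30+": 0.33}

        norm = sum(pop_share[g] * survey_mean[g] for g in survey_mean)
        group_pc = {g: total_pc * survey_mean[g] / norm for g in survey_mean}

        for g, v in group_pc.items():
            print(f"{g:12s} {v:5.1f} litres/year")
        # population-weighted mean recovers the overall per capita figure:
        print("check:", round(sum(pop_share[g] * group_pc[g] for g in group_pc), 2))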

  12. Conceptual and methodological challenges to integrating SEA and cumulative effects assessment

    International Nuclear Information System (INIS)

    Gunn, Jill; Noble, Bram F.

    2011-01-01

    The constraints to assessing and managing cumulative environmental effects in the context of project-based environmental assessment are well documented, and the potential benefits of a more strategic approach to cumulative effects assessment (CEA) are well argued; however, such benefits have yet to be clearly demonstrated in practice. While it is widely assumed that cumulative effects are best addressed in a strategic context, there has been little investigation as to whether CEA and strategic environmental assessment (SEA) are a 'good fit' - conceptually or methodologically. This paper identifies a number of conceptual and methodological challenges to the integration of CEA and SEA. Based on results of interviews with international experts and practitioners, this paper demonstrates that: definitions and conceptualizations of CEA are typically weak in practice; approaches to effects aggregation vary widely; a systems perspective is lacking in both SEA and CEA; the multifarious nature of SEA complicates CEA; tiering arrangements between SEA and project-based assessment are limited or non-existent; and the relationship of SEA to regional planning remains unclear.

  13. A Generic Framework for the Evaluation of the Benefits Expected from the Smart Grid

    Directory of Open Access Journals (Sweden)

    Panayotis G. Cottis

    2013-02-01

    Full Text Available The Smart Grid has the potential to bring significant value to the various stakeholders of the electricity market. A methodology for the evaluation of smart grid benefits is required to facilitate decision making by quantifying the benefits expected from a smart grid project. The present paper proposes a generic framework to assess these expected benefits, taking into account the regulatory, business and technical challenges and focusing particularly on Distribution System Operators (DSOs) and end users. An indicative case study is presented where the proposed cost-benefit approach assesses the expected value of the Smart Grid to DSOs and determines whether and under what conditions such an investment should be initiated.

  14. Development of a simplified methodology for the isotopic determination of fuel spent in Light Water Reactors; Desarrollo de una metodologia simplificada para la determinacion isotopica del combustible gastado en reactores de agua ligera

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez N, H.; Francois L, J.L. [FI-UNAM, 04510 Mexico D.F. (Mexico)]. e-mail: hermilo@lairn.fi-b.unam.mx

    2005-07-01

    The present work presents a simplified methodology for quantifying the isotopic content of spent fuel from light water reactors; it is applied specifically to the Laguna Verde nuclear power plant using an 18-month equilibrium cycle. The methodology is divided into two parts. The first consists of the development of a simplified cell model for the isotopic quantification of the irradiated fuel. With this model, a burnup of 48,000 MWd/tU in the reactor core is simulated, based on a 10x10 fuel assembly and using a two-dimensional simulator for a light water reactor fuel cell (CPM-3). The second part of the methodology is based on an isotopic decay model, implemented as an algorithm in C++ (decay), to evaluate the amount of each radionuclide, by decay, from the time the fuel is discharged until the time at which reprocessing is performed. Finally, the method used to quantify the kilograms of uranium and plutonium obtained from a normalized quantity (1000 kg) of fuel irradiated in a reactor is presented. These results will later allow analyses of the final disposition of the irradiated fuel. (Author)
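
    As an illustration of the decay step only (the original 'decay' code is in C++ and treats full decay chains), the Python sketch below applies single-nuclide exponential decay from discharge to reprocessing; the half-lives are published values, but the inventories and cooling time are assumed.

        # Hedged sketch of the decay step: N(t) = N0 * exp(-ln(2) * t / T_half) applied
        # to each nuclide from discharge until reprocessing. Initial inventories and
        # cooling time are illustrative; the original 'decay' code treats full chains.
        import math

        COOLING_YEARS = 5.0
        inventory_kg = {"Pu-241": 1.2, "Cs-137": 1.0, "Sr-90": 0.8}     # assumed at discharge
        half_life_yr = {"Pu-241": 14.3, "Cs-137": 30.1, "Sr-90": 28.8}

        for nuclide, n0 in inventory_kg.items():
            n_t = n0 * math.exp(-math.log(2) * COOLING_YEARS / half_life_yr[nuclide])
            print(f"{nuclide}: {n0:.2f} kg -> {n_t:.2f} kg after {COOLING_YEARS:g} y cooling")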

  15. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.

  16. Impact of improved attenuation correction featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR.

    Science.gov (United States)

    Oehmigen, Mark; Lindemann, Maike E; Gratz, Marcel; Kirchner, Julian; Ruhlmann, Verena; Umutlu, Lale; Blumhagen, Jan Ole; Fenchel, Matthias; Quick, Harald H

    2018-04-01

    Recent studies have shown an excellent correlation between PET/MR and PET/CT hybrid imaging in detecting lesions. However, a systematic underestimation of PET quantification in PET/MR has been observed. This is attributable to two methodological challenges of MR-based attenuation correction (AC): (1) lack of bone information, and (2) truncation of the MR-based AC maps (μmaps) along the patient arms. The aim of this study was to evaluate the impact of improved AC featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR. The MR-based Dixon method provides four-compartment μmaps (background air, lungs, fat, soft tissue) which served as a reference for PET/MR AC in this study. A model-based bone atlas provided bone tissue as a fifth compartment, while the HUGE method provided truncation correction. The study population comprised 51 patients with oncological diseases, all of whom underwent a whole-body PET/MR examination. Each whole-body PET dataset was reconstructed four times using standard four-compartment μmaps, five-compartment μmaps, four-compartment μmaps + HUGE, and five-compartment μmaps + HUGE. The SUV max for each lesion was measured to assess the impact of each μmap on PET quantification. All four μmaps in each patient provided robust results for reconstruction of the AC PET data. Overall, SUV max was quantified in 99 tumours and lesions. Compared to the reference four-compartment μmap, the mean SUV max of all 99 lesions increased by 1.4 ± 2.5% when bone was added, by 2.1 ± 3.5% when HUGE was added, and by 4.4 ± 5.7% when bone + HUGE was added. Larger quantification bias of up to 35% was found for single lesions when bone and truncation correction were added to the μmaps, depending on their individual location in the body. The novel AC method, featuring a bone model and truncation correction, improved PET quantification in whole-body PET/MR imaging. Short reconstruction times, straightforward

  17. Impact of improved attenuation correction featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR

    Energy Technology Data Exchange (ETDEWEB)

    Oehmigen, Mark; Lindemann, Maike E. [University Hospital Essen, High Field and Hybrid MR Imaging, Essen (Germany); Gratz, Marcel; Quick, Harald H. [University Hospital Essen, High Field and Hybrid MR Imaging, Essen (Germany); University Duisburg-Essen, Erwin L. Hahn Institute for MR Imaging, Essen (Germany); Kirchner, Julian [University Dusseldorf, Department of Diagnostic and Interventional Radiology, Medical Faculty, Dusseldorf (Germany); Ruhlmann, Verena [University Hospital Essen, Department of Nuclear Medicine, Essen (Germany); Umutlu, Lale [University Hospital Essen, Department of Diagnostic and Interventional Radiology and Neuroradiology, Essen (Germany); Blumhagen, Jan Ole; Fenchel, Matthias [Siemens Healthcare GmbH, Erlangen (Germany)

    2018-04-15

    Recent studies have shown an excellent correlation between PET/MR and PET/CT hybrid imaging in detecting lesions. However, a systematic underestimation of PET quantification in PET/MR has been observed. This is attributable to two methodological challenges of MR-based attenuation correction (AC): (1) lack of bone information, and (2) truncation of the MR-based AC maps (μmaps) along the patient arms. The aim of this study was to evaluate the impact of improved AC featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR. The MR-based Dixon method provides four-compartment μmaps (background air, lungs, fat, soft tissue) which served as a reference for PET/MR AC in this study. A model-based bone atlas provided bone tissue as a fifth compartment, while the HUGE method provided truncation correction. The study population comprised 51 patients with oncological diseases, all of whom underwent a whole-body PET/MR examination. Each whole-body PET dataset was reconstructed four times using standard four-compartment μmaps, five-compartment μmaps, four-compartment μmaps + HUGE, and five-compartment μmaps + HUGE. The SUV{sub max} for each lesion was measured to assess the impact of each μmap on PET quantification. All four μmaps in each patient provided robust results for reconstruction of the AC PET data. Overall, SUV{sub max} was quantified in 99 tumours and lesions. Compared to the reference four-compartment μmap, the mean SUV{sub max} of all 99 lesions increased by 1.4 ± 2.5% when bone was added, by 2.1 ± 3.5% when HUGE was added, and by 4.4 ± 5.7% when bone + HUGE was added. Larger quantification bias of up to 35% was found for single lesions when bone and truncation correction were added to the μmaps, depending on their individual location in the body. The novel AC method, featuring a bone model and truncation correction, improved PET quantification in whole-body PET/MR imaging. Short reconstruction times, straightforward

  18. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1986-01-01

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  19. The Use of Participatory Action Research within Education--Benefits to Stakeholders

    Science.gov (United States)

    Jacobs, Steven

    2016-01-01

    This paper offers a brief history and the characteristics of the research methodology known as Participatory Action Research (PAR). This paper also states how PAR can be utilized within an educational environment and describes the benefits to all stakeholders such as teachers and students when they are involved in a research project using PAR as…

  20. Promise assessment: A corollary to risk assessment for characterizing benefits

    International Nuclear Information System (INIS)

    Sholtis, J.A. Jr.

    1993-01-01

    Decisions involving the use of hazardous technologies are often made based on risk-benefit considerations. This is the case for U.S. space mission use of nuclear power and propulsion systems, where launch decisions are made within the Office of the President. A great deal of time and effort is spent characterizing the risk for each nuclear-powered space mission. However, this is not so for benefits, even though they are no less important. To correct this situation, a new technical term--promise--is defined, and a new methodology--promise assessment--is proposed. This paper introduces and advances this concept and addresses its future application, showing how promise assessment, as a tool, can be developed sufficiently and applied to methodically identify and characterize benefits. Further, it can introduce a degree of balance when judgments concerning the use of hazardous technologies are involved.

  1. Ecological impact study methodology for hydrotechnical projects

    International Nuclear Information System (INIS)

    Manoliu, Mihai; Toculescu, Razvan

    1993-01-01

    Besides the expected benefits, hydrotechnical projects may entail unfavorable effects on the hydrological regime, environment, health and living conditions of the population. Rational water resource management should take into consideration both the favorable and unfavorable effects. This implies the assessment of socio-economic and environmental impacts of the changes of the hydrological regime. The paper proposes a methodology for carrying out impact studies of hydrotechnical projects. The results of the work are presented graphically on the basis of composite programming. A summary of mathematical methods involved in impact study design is also presented. (authors)

  2. Development of a Methodology for VHTR Accident Consequence Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joeun; Kim, Jintae; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-05-15

    The substitution of the VHTR for the burning of fossil fuels conserves these hydrocarbon resources for other uses and eliminates greenhouse gas emissions. For these reasons, a plan to construct a VHTR for hydrogen production is in progress in Korea. In this study, the consequence analysis for off-site releases of radioactive materials during severe accidents has been performed using level 3 PRA technology. The off-site consequence analysis for a VHTR has been performed using the MACCS code. Since passive systems such as the RCCS (Reactor Cavity Cooling System) are provided, the frequency of occurrence of accidents has been evaluated to be very low. For further study, assessment of the characteristics of the VHTR safety systems and precise quantification of its accident scenarios are expected to enable a more certain consequence analysis. The methodology shown in this study might contribute to enhancing the safety of the VHTR design, as the results show a far lower effect on the environment than for LWRs.

  3. A framework for quantifying net benefits of alternative prognostic models.

    Science.gov (United States)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd.
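
    As a heavily simplified illustration of the general idea (not the authors' estimator), the sketch below treats patients whose predicted risk exceeds a guideline threshold, credits each treated future case with an assumed number of disease-free life years, and compares two models; harms and costs of treating non-cases are ignored, and all numbers are invented.

        # Hedged sketch: treat when a model's predicted risk exceeds a threshold, credit
        # each correctly treated case with assumed life years gained, and compare models
        # by life years per person. Synthetic data; not the authors' methodology.
        import random

        random.seed(1)
        THRESHOLD = 0.20              # treatment threshold on predicted risk
        LY_GAINED_IF_TREATED = 0.5    # assumed life years gained per treated future case

        cohort = []
        for _ in range(10_000):
            true_risk = random.betavariate(2, 10)
            event = random.random() < true_risk
            risk_basic = min(1.0, max(0.0, true_risk + random.gauss(0, 0.10)))  # age/sex model
            risk_full = min(1.0, max(0.0, true_risk + random.gauss(0, 0.04)))   # 5-factor model
            cohort.append((event, risk_basic, risk_full))

        def life_years_per_person(which):
            treated_cases = sum(1 for event, rb, rf in cohort
                                if event and (rb if which == "basic" else rf) >= THRESHOLD)
            return treated_cases * LY_GAINED_IF_TREATED / len(cohort)

        print("basic model :", round(life_years_per_person("basic"), 4), "life years per person")
        print("full model  :", round(life_years_per_person("full"), 4), "life years per person")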

  4. A framework for quantifying net benefits of alternative prognostic models‡

    Science.gov (United States)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21905066

  5. Quantification of fossil fuel CO2 at the building/street level for large US cities

    Science.gov (United States)

    Gurney, K. R.; Razlivanov, I. N.; Song, Y.

    2012-12-01

    Quantification of fossil fuel CO2 emissions from the bottom-up perspective is a critical element in emerging plans for a global, integrated carbon monitoring system (CMS). A space/time explicit emissions data product can act as both a verification and planning system. It can verify atmospheric CO2 measurements (in situ and remote) and offer detailed mitigation information to management authorities in order to optimize the mix of mitigation efforts. Here, we present the Hestia Project, an effort aimed at building a high resolution (e.g. building and road link-specific, hourly) fossil fuel CO2 emissions data product for the urban domain as a pilot effort to a CMS. A complete data product has been built for the city of Indianapolis and preliminary quantification has been completed for Los Angeles and Phoenix (see figure). The effort in Indianapolis is now part of a larger effort aimed at a convergent top-down/bottom-up assessment of greenhouse gas emissions, called INFLUX. Our urban-level quantification relies on a mixture of data and modeling structures. We start with the sector-specific Vulcan Project estimate at the mix of geocoded and county-wide levels. The Hestia aim is to distribute the Vulcan result in space and time. Two components take the majority of effort: buildings and onroad emissions. In collaboration with our INFLUX colleagues, we are transporting these high resolution emissions through an atmospheric transport model for a forward comparison of the Hestia data product with atmospheric measurements, collected on aircraft and cell towers. In preparation for a formal urban-scale inversion, these forward comparisons offer insights into both improving our emissions data product and measurement strategies. A key benefit of the approach taken in this study is the tracking and archiving of fuel and process-level detail (e.g. combustion process, other pollutants), allowing for a more thorough understanding and analysis of energy throughputs in the urban

  6. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    Science.gov (United States)

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, the exact quantification of a NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are available currently and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow exact quantification as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study such as medical disorders resulting from nutritional programming disturbances.
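
    The record describes a qPCR protocol for NGS library quantification; as a generic illustration only, the sketch below performs a standard-curve calculation, fitting Cq against log10 of standard concentrations and reading off an unknown library, with all values invented.

        # Hedged sketch of generic qPCR standard-curve quantification (not the specific
        # protocol in the record): fit Cq against log10(standard concentration), then
        # read an unknown library off the fitted line. Values are illustrative.
        import numpy as np

        std_conc_pM = np.array([20.0, 2.0, 0.2, 0.02])        # serial dilution standards
        cq_std = np.array([12.1, 15.5, 18.9, 22.3])           # measured Cq (assumed)

        slope, intercept = np.polyfit(np.log10(std_conc_pM), cq_std, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0                # amplification efficiency

        cq_unknown = 16.8                                       # assumed Cq of the library
        conc_unknown = 10 ** ((cq_unknown - intercept) / slope)
        print(f"slope={slope:.2f}, efficiency={efficiency:.1%}, library ≈ {conc_unknown:.2f} pM")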

  7. Quantification and Management of Manifest Occlusal Caries Lesions in Adults: A Methodological and a Clinical Study

    DEFF Research Database (Denmark)

    Bakhshandeh, Azam

    2010-01-01

    teeth with primary occlusal lesions. Randomization was performed in cases of more than one lesion in the same patient, so that the final material consisted of 60 resin sealed and 12 restored lesions. After 2-3 years, there was a drop-out of 15%; 2 patients did not show up for the control and 9...... extension of the lesions from baseline and the last control radiograph, there was scored caries progression beneath 5 (10%) of 49 sealants, caries regression beneath 1 (2%) sealant and unchanged lesion depth beneath 43 (88%) sealants and all restorations (p = 0.64). The methodological study included 110...

  8. Two-Phase Microfluidic Systems for High Throughput Quantification of Agglutination Assays

    KAUST Repository

    Castro, David

    2018-01-01

    assay, with a minimum detection limit of 50 ng/mL using optical image analysis. We compare optical image analysis and light scattering as quantification methods, and demonstrate the first light scattering quantification of agglutination assays in a two

  9. Overview of Automotive Core Tools: Applications and Benefits

    Science.gov (United States)

    Doshi, Jigar A.; Desai, Darshak

    2017-08-01

    Continuous improvement of product and process quality is always a challenging and creative task in today's era of globalization. Various quality tools are available and used for this purpose. Some of them are successful and a few of them are not. Considering the complexity of the continuous quality improvement (CQI) process, various new techniques are being introduced by industry, as well as proposed by researchers and academia. Lean Manufacturing, Six Sigma and Lean Six Sigma are some of these techniques. In recent years, new tools known as Automotive Core Tools (ACT) have been adopted by industry, especially the automotive sector. The intention of this paper is to review the applications and benefits, along with existing research, on Automotive Core Tools with special emphasis on continuous quality improvement. The methodology uses an extensive review of the literature through reputed publications: journals, conference proceedings, research theses, etc. This paper provides an overview of ACT, its enablers and exertions, how it evolved into sophisticated methodologies, and the benefits realised in organisations. It should be of value to practitioners of Automotive Core Tools and to academics who are interested in how CQI can be achieved using ACT. It needs to be stressed here that this paper is not intended to scorn Automotive Core Tools; rather, its purpose is limited to providing a balance to the prevailing positive views toward ACT.

  10. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

    As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation to existing computationally-driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses, such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MATLAB® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  11. Usefulness of alternative integrative assessment methodologies in public decision making

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, L. E.; Litchfield, J. W.; Currie, J. W.; McDonald, C. L.; Adams, R. C.

    1978-07-01

    Many diverse social, economic, and environmental effects are associated with each of the available energy development alternatives. The assessment of the costs, risks, and benefits of these energy development options is an important function of the U.S. Department of Energy. This task is more difficult when no single alternative is better than the others in all respects. This paper compares benefit-cost and multi-attribute utility analysis as decision aids for these more difficult and more common assessment cases. PNL has developed expertise in making these assessments through its involvement since the Calvert Cliffs decision in both the preparation of Environmental Impact Statements and the development of methods to make these statements more thorough and responsive to the spirit of the National Environmental Policy Act (NEPA). Since 1973 PNL has had continuing efforts to quantify, value, and compare all of the major factors which influence the overall impacts of energy development options. An important part of this work has been the measurement and incorporation of the relative values which community groups place on these conflicting factors. Such difficult assessment problems could be approached in many ways, including the use of benefit-cost or multi-attribute utility analysis. This paper addresses the following questions: (1) Should an integrative assessment methodology be used for the overall assessment of these costs, risks, and benefits? (2) If an integrative assessment methodology is to be used, what alternative methods are available and what should be the basis for selecting a method? (3) Is it possible to use one of the available alternatives for one portion of the assessment and another for another portion of the assessment? The answers to these questions presented in this report are applicable to most public decision problems.
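
    As an illustration of one of the two decision aids discussed above, the sketch below computes a linear-additive multi-attribute utility for two hypothetical options; the attributes, scores and stakeholder weights are invented.

        # Hedged sketch of a linear-additive multi-attribute utility comparison between
        # two hypothetical options. Attributes, scores (0 = worst, 1 = best) and
        # stakeholder weights are all invented for illustration.
        weights = {"cost": 0.35, "air quality": 0.25, "water use": 0.15, "jobs": 0.25}

        options = {
            "coal plant": {"cost": 0.8, "air quality": 0.2, "water use": 0.4, "jobs": 0.7},
            "wind farm":  {"cost": 0.5, "air quality": 0.9, "water use": 0.9, "jobs": 0.5},
        }

        for name, scores in options.items():
            utility = sum(weights[a] * scores[a] for a in weights)
            print(f"{name:10s} overall utility = {utility:.2f}")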

  12. Evaluating the risk-reduction benefits of wind energy

    Energy Technology Data Exchange (ETDEWEB)

    Brower, M.C. [Brower & Company, Andover, MA (United States); Bell, K. [Convergence Research, Seattle, WA (United States); Bernow, S.; Duckworth, M. [Tellus Inst., Boston, MA (United States); Spinney P. [Charles River Associates, Boston, MA (United States)

    1996-12-31

    This paper presents preliminary results of a study to evaluate the risk-reduction benefits of wind power for a case study utility system using decision analysis techniques. The costs and risks of two alternative decisions (whether to build a 400 MW gas-fired combined cycle plant or a 1600 MW wind plant in 2003) were compared through computer simulations as fuel prices, environmental regulatory costs, wind and conventional power plant availability, and load growth were allowed to vary. Three different market scenarios were examined: traditional regulation, a short-term power pool, and fixed-price contracts of varying duration. The study concludes that, from the perspective of ratepayers, wind energy provides a net levelized risk-reduction benefit of $3.4 to $7.8/MWh under traditional regulation, and less in the other scenarios. From the perspective of the utility plant owners, wind provides a significant risk benefit in the unregulated market scenarios but none in a regulated market. The methodology and findings should help inform utility resource planning and industry restructuring efforts. 2 figs., 3 tabs.
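
    As a hedged illustration of the decision-analysis idea (not the study's model), the sketch below simulates uncertain gas prices and compares the resulting cost variability of a gas-fired plant with a zero-fuel-cost wind plant; all cost assumptions are invented.

        # Hedged sketch of the decision-analysis idea: simulate uncertain gas prices and
        # compare the cost variability of a gas-fired plant with a zero-fuel-cost wind
        # plant. All cost assumptions are invented and are not the study's figures.
        import random, statistics

        random.seed(7)
        HEAT_RATE_MMBTU_PER_MWH = 7.0
        GAS_FIXED, WIND_FIXED = 25.0, 55.0    # assumed non-fuel levelized costs, $/MWh

        gas_costs, wind_costs = [], []
        for _ in range(5_000):
            gas_price = random.lognormvariate(1.1, 0.35)        # $/MMBtu scenario
            gas_costs.append(GAS_FIXED + HEAT_RATE_MMBTU_PER_MWH * gas_price)
            wind_costs.append(WIND_FIXED)                        # no fuel-price exposure

        print("gas  mean/stdev $/MWh:", round(statistics.mean(gas_costs), 1),
              round(statistics.stdev(gas_costs), 1))
        print("wind mean/stdev $/MWh:", round(statistics.mean(wind_costs), 1),
              round(statistics.stdev(wind_costs), 1))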

  13. Detection and quantification of microparticles from different cellular lineages using flow cytometry. Evaluation of the impact of secreted phospholipase A2 on microparticle assessment.

    Science.gov (United States)

    Rousseau, Matthieu; Belleannee, Clemence; Duchez, Anne-Claire; Cloutier, Nathalie; Levesque, Tania; Jacques, Frederic; Perron, Jean; Nigrovic, Peter A; Dieude, Melanie; Hebert, Marie-Josee; Gelb, Michael H; Boilard, Eric

    2015-01-01

    Microparticles, also called microvesicles, are submicron extracellular vesicles produced by plasma membrane budding and shedding recognized as key actors in numerous physio(patho)logical processes. Since they can be released by virtually any cell lineages and are retrieved in biological fluids, microparticles appear as potent biomarkers. However, the small dimensions of microparticles and soluble factors present in body fluids can considerably impede their quantification. Here, flow cytometry with improved methodology for microparticle resolution was used to detect microparticles of human and mouse species generated from platelets, red blood cells, endothelial cells, apoptotic thymocytes and cells from the male reproductive tract. A family of soluble proteins, the secreted phospholipases A2 (sPLA2), comprises enzymes concomitantly expressed with microparticles in biological fluids and that catalyze the hydrolysis of membrane phospholipids. As sPLA2 can hydrolyze phosphatidylserine, a phospholipid frequently used to assess microparticles, and might even clear microparticles, we further considered the impact of relevant sPLA2 enzymes, sPLA2 group IIA, V and X, on microparticle quantification. We observed that if enriched in fluids, certain sPLA2 enzymes impair the quantification of microparticles depending on the species studied, the source of microparticles and the means of detection employed (surface phosphatidylserine or protein antigen detection). This study provides analytical considerations for appropriate interpretation of microparticle cytofluorometric measurements in biological samples containing sPLA2 enzymes.

  14. Molecular quantification of environmental DNA using microfluidics and digital PCR.

    Science.gov (United States)

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2012-09-01

    Real-time PCR has been widely used to evaluate gene abundance in natural microbial habitats. However, PCR-inhibitory substances often reduce the efficiency of PCR, leading to the underestimation of target gene copy numbers. Digital PCR using microfluidics is a new approach that allows absolute quantification of DNA molecules. In this study, digital PCR was applied to environmental samples, and the effect of PCR inhibitors on DNA quantification was tested. In the control experiment using λ DNA and humic acids, underestimation of λ DNA at 1/4400 of the theoretical value was observed with 6.58 ng μL(-1) humic acids. In contrast, digital PCR provided accurate quantification data with a concentration of humic acids up to 9.34 ng μL(-1). The inhibitory effect of paddy field soil extract on quantification of the archaeal 16S rRNA gene was also tested. By diluting the DNA extract, quantified copy numbers from real-time PCR and digital PCR became similar, indicating that dilution was a useful way to remedy PCR inhibition. The dilution strategy was, however, not applicable to all natural environmental samples. For example, when marine subsurface sediment samples were tested the copy number of archaeal 16S rRNA genes was 1.04×10(3) copies/g-sediment by digital PCR, whereas real-time PCR only resulted in 4.64×10(2) copies/g-sediment, which was most likely due to an inhibitory effect. The data from this study demonstrated that inhibitory substances had little effect on DNA quantification using microfluidics and digital PCR, and showed the great advantages of digital PCR in accurate quantifications of DNA extracted from various microbial habitats. Copyright © 2012 Elsevier GmbH. All rights reserved.
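
    As an illustration of the Poisson correction generally used in digital PCR (the record does not give the platform's exact formula), the sketch below converts the fraction of positive partitions into copies per microlitre; the partition count and volume are invented.

        # Hedged sketch of standard digital-PCR Poisson correction (generic, not the
        # authors' specific microfluidic platform): the fraction of positive partitions
        # is converted to a mean copy number per partition, then to copies per µL.
        import math

        total_partitions = 20_000
        positive_partitions = 3_150
        partition_volume_ul = 0.00085        # assumed partition volume, µL

        p = positive_partitions / total_partitions
        copies_per_partition = -math.log(1.0 - p)             # Poisson mean λ
        copies_per_ul = copies_per_partition / partition_volume_ul
        print(f"λ = {copies_per_partition:.3f} copies/partition ≈ {copies_per_ul:,.0f} copies/µL")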

  15. Methodological Considerations in Designing and Evaluating Animal-Assisted Interventions.

    Science.gov (United States)

    Stern, Cindy; Chur-Hansen, Anna

    2013-02-27

    This paper presents a discussion of the literature on animal-assisted interventions and describes limitations surrounding current methodological quality. Benefits to human physical, psychological and social health cannot be empirically confirmed due to the methodological limitations of the existing body of research, and comparisons cannot validly be made across different studies. Without a solid research base animal-assisted interventions will not receive recognition and acceptance as a credible alternative health care treatment. The paper draws on the work of four systematic reviews conducted over April-May 2009, with no date restrictions, focusing exclusively on the use of canine-assisted interventions for older people residing in long-term care. The reviews revealed a lack of good quality studies. Although the literature base has grown in volume since its inception, it predominantly consists of anecdotal accounts and reports. Experimental studies undertaken are often flawed in aspects of design, conduct and reporting. There are few qualitative studies available leading to the inability to draw definitive conclusions. It is clear that due to the complexities associated with these interventions not all weaknesses can be eliminated. However, there are basic methodological weaknesses that can be addressed in future studies in the area. Checklists for quantitative and qualitative research designs to guide future research are offered to help address methodological rigour.

  16. Final Technical Report: Mathematical Foundations for Uncertainty Quantification in Materials Design

    Energy Technology Data Exchange (ETDEWEB)

    Plechac, Petr [Univ. of Delaware, Newark, DE (United States); Vlachos, Dionisios G. [Univ. of Delaware, Newark, DE (United States)

    2018-01-23

    We developed path-wise information theory-based and goal-oriented sensitivity analysis and parameter identification methods for complex high-dimensional dynamics, in particular non-equilibrium extended molecular systems. The combination of these novel methodologies provided the first methods in the literature capable of handling UQ questions for stochastic complex systems with some or all of the following features: (a) multi-scale stochastic models such as (bio)chemical reaction networks, with a very large number of parameters, (b) spatially distributed systems such as Kinetic Monte Carlo or Langevin Dynamics, (c) non-equilibrium processes typically associated with coupled physico-chemical mechanisms, driven boundary conditions, hybrid micro-macro systems, etc. A particular computational challenge arises in simulations of multi-scale reaction networks and molecular systems. Mathematical techniques were applied to in silico prediction of novel materials with emphasis on the effect of microstructure on model uncertainty quantification (UQ). We outline acceleration methods to make calculations of real chemistry feasible, followed by two complementary tasks on structure optimization and microstructure-induced UQ.

  17. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost

  18. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    Directory of Open Access Journals (Sweden)

    Jongbin Ko

    2014-01-01

    Full Text Available A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.
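
    The abstract does not reproduce the AVQS scoring formula, so the sketch below shows only one plausible way to combine per-node vulnerability scores and an end-to-end security score along an attack route; the nodes, scores and weighting are assumptions.

        # Heavily hedged sketch: one plausible aggregation of per-node vulnerability
        # scores and an end-to-end security score along an attack route. All values
        # and the weighting scheme are invented; this is not the published AVQS formula.
        route = ["smart meter", "data concentrator", "head-end system"]
        node_vuln = {"smart meter": 6.5, "data concentrator": 4.2, "head-end system": 7.8}  # 0-10
        end_to_end_security = 0.6   # assumed 0 (weak) .. 1 (strong) link protection

        avg_node_score = sum(node_vuln[n] for n in route) / len(route)
        route_score = avg_node_score * (1.0 - 0.5 * end_to_end_security)  # assumed weighting
        print(f"attack route {' -> '.join(route)}: score {route_score:.2f} / 10")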

  19. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    Science.gov (United States)

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.

  20. Comparison of machine learning and semi-quantification algorithms for (I123)FP-CIT classification: the beginning of the end for semi-quantification?

    Science.gov (United States)

    Taylor, Jonathan Christopher; Fenner, John Wesley

    2017-11-29

    Semi-quantification methods are well established in the clinic for assisted reporting of (I123) Ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. The machine learning algorithms were based on support vector machine classifiers with three different sets of features: (1) voxel intensities; (2) principal components of image voxel intensities; (3) striatal binding ratios from the putamen and caudate. Semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (1) minimum of age-matched controls; (2) mean minus 1/1.5/2 standard deviations from age-matched controls; (3) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); (4) selection of the optimum operating point on the receiver operating characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 and between 0.95 and 0.97 for local and PPMI data, respectively. Classification
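
    As an illustration of the two families of methods compared above (using synthetic data, not the study's pipeline), the sketch below applies one semi-quantification rule (putamen SBR below the control mean minus two standard deviations) and a support vector machine trained on striatal binding ratios.

        # Hedged sketch contrasting one semi-quantification rule (putamen SBR below
        # control mean minus 2 SD) with an SVM on striatal binding ratios. The SBR
        # data are synthetic; this is not the study's validation pipeline.
        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        controls = rng.normal([2.8, 3.0], 0.3, size=(100, 2))   # [putamen, caudate] SBR
        patients = rng.normal([1.6, 2.4], 0.4, size=(100, 2))

        # semi-quantification: abnormal if putamen SBR < mean - 2*SD of controls
        limit = controls[:, 0].mean() - 2 * controls[:, 0].std()
        semi_acc = ((patients[:, 0] < limit).mean() + (controls[:, 0] >= limit).mean()) / 2

        # machine learning: SVM trained on both SBR features
        X = np.vstack([controls, patients])
        y = np.array([0] * 100 + [1] * 100)
        svm_acc = SVC(kernel="rbf").fit(X, y).score(X, y)        # training accuracy only

        print(f"semi-quantification accuracy ≈ {semi_acc:.2f}, SVM (train) accuracy ≈ {svm_acc:.2f}")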