WorldWideScience

Sample records for benefits quantification methodology

  1. Renewable Electricity Benefits Quantification Methodology: A Request for Technical Assistance from the California Public Utilities Commission

    Energy Technology Data Exchange (ETDEWEB)

    Mosey, G.; Vimmerstedt, L.

    2009-07-01

    The California Public Utilities Commission (CPUC) requested assistance in identifying methodological alternatives for quantifying the benefits of renewable electricity. The context is the CPUC's analysis of a 33% renewable portfolio standard (RPS) in California--one element of California's Climate Change Scoping Plan. The information would be used to support development of an analytic plan to augment the cost analysis of this RPS (which recently was completed). NREL has responded to this request by developing a high-level survey of renewable electricity effects, quantification alternatives, and considerations for selection of analytic methods. This report addresses economic effects and health and environmental effects, and provides an overview of related analytic tools. Economic effects include jobs, earnings, gross state product, and electricity rate and fuel price hedging. Health and environmental effects include air quality and related public-health effects, solid and hazardous wastes, and effects on water resources.

  2. Benefit quantification of interoperability in coordinate metrology

    DEFF Research Database (Denmark)

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

    One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software...

  3. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses the financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analysis as a routine guide when making decisions on costly provisions, i.e. as one of the tools to assess whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)

  4. A Micropillar Compression Methodology for Ductile Damage Quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  5. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  6. A Project-Based Quantification of BIM Benefits

    Directory of Open Access Journals (Sweden)

    Jian Li

    2014-08-01

    In the construction industry, research is being carried out to look for feasible methods and technologies to cut down project costs and waste, and Building Information Modelling (BIM) is currently a promising technology/method for achieving this. The output of the construction industry has a considerable scale; however, the concentration of the industry and the level of informatization are still not high. There is still a large gap in terms of productivity between the construction industry and other industries. Due to the lack of first-hand data on how large an effect BIM can genuinely have in real cases, it is unrealistic to expect construction stakeholders to take the risk of widely adopting BIM. This paper focuses on the methodological quantification (through a case-study approach) of BIM's benefits in building construction resource management and real-time cost control, in contrast to traditional non-BIM technologies. Through the use of BIM technology for the dynamic querying and statistical analysis of construction schedules, engineering, resources and costs, the three implementations considered demonstrate how BIM can facilitate a comprehensive grasp of a project's implementation and progress, identify and resolve the contradictions and conflicts between construction resources and cost controls, reduce project overspend and protect the supply of resources.

  7. Development of Accident Scenarios and Quantification Methodology for RAON Accelerator

    International Nuclear Information System (INIS)

    Lee, Yongjin; Jae, Moosung

    2014-01-01

    The RISP (Rare Isotope Science Project) plans to provide neutron-rich isotope (RI) and stable heavy-ion beams. The accelerator is defined as a radiation production system according to the Nuclear Safety Law and therefore needs strict operating procedures and safety assurance to prevent radiation exposure. In order to satisfy this condition, the potential risk of the accelerator needs to be evaluated from the design stage itself. Though some PSA research has been conducted for accelerators, most of it focuses not on general accident sequences but on simple descriptions of accidents. In this paper, general accident scenarios are developed by event tree analysis and a new quantification methodology for the event trees is deduced. In this study, initiating events which may occur in the accelerator are selected. Using the selected initiating events, the accident scenarios of the accelerator facility are developed with event trees. These results can be used as basic data of the accelerator for future risk assessments. After analyzing the probability of each heading, it is possible to conduct the quantification and evaluate the significance of the accident result. If accident scenarios are also developed for external events, the risk assessment of the entire accelerator facility will be complete. To reduce the uncertainty of the event trees, reliable data can be produced via the presented quantification techniques.

  8. Towards an uncertainty quantification methodology with CASMO-5

    International Nuclear Information System (INIS)

    Wieselquist, W.; Vasiliev, A.; Ferroukhi, H.

    2011-01-01

    We present the development of an uncertainty quantification (UQ) methodology for the CASMO-5 lattice physics code, used extensively at the Paul Scherrer Institut for standalone neutronics calculations, as well as the generation of nuclear fuel segment libraries for the downstream core simulator, SIMULATE-3. We focus here on propagation of nuclear data uncertainties and describe the framework required for 'black box' UQ--in this case minor modifications of the code are necessary to allow perturbation of the CASMO-5 nuclear data library. We then implement a basic first-order UQ method, direct perturbation, which directly produces sensitivity coefficients and, when folded with the input nuclear data variance-covariance matrix (VCM), yields output uncertainties in the form of an output VCM. We discuss the implementation, including how to map the VCMs of a different group structure to the code library group structure (in our case the ENDF/B-VII-based 586-group library in CASMO-5), present some results for pin cell calculations, and conclude with future work. (author)
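
    A minimal numerical sketch of the direct-perturbation result described above, i.e. the "sandwich rule" cov_out = S · cov_in · Sᵀ (Python; the matrices are invented for scale and are not CASMO-5 data):

      import numpy as np

      # Hypothetical sensitivity matrix S (2 outputs x 3 nuclear-data inputs),
      # built column-by-column by direct perturbation: S[i, j] = d(out_i)/d(in_j).
      S = np.array([[0.8, -0.1,  0.3],
                    [0.2,  0.5, -0.4]])

      # Hypothetical input variance-covariance matrix (VCM) of the nuclear data.
      cov_in = np.array([[4e-4, 1e-4, 0.0 ],
                         [1e-4, 9e-4, 0.0 ],
                         [0.0,  0.0,  1e-4]])

      # Folding the sensitivities with the input VCM yields the output VCM.
      cov_out = S @ cov_in @ S.T
      print("output standard deviations:", np.sqrt(np.diag(cov_out)))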

  9. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    International Nuclear Information System (INIS)

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code, and it may replace the expert judgment generally used. In this study, an uncertainty quantification of the reflood model was performed using the CIRCE methodology. In this paper, the application process of the CIRCE methodology and the main results are briefly described. The application of CIRCE provided satisfactory results. This research is expected to be useful for improving the present audit calculation methodology, KINS-REM.

  10. Recommendations for benefit-risk assessment methodologies and visual representations

    DEFF Research Database (Denmark)

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul

    2016-01-01

    PURPOSE: The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. METHODS: Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. RESULTS: A general pathway through the case studies...

  11. Quantification of pelvic floor muscle strength in female urinary incontinence: A systematic review and comparison of contemporary methodologies.

    Science.gov (United States)

    Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J

    2018-01-01

    There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review was conducted via Medline, PubMed, CINAHL, and the Cochrane database; key terms for pelvic floor anatomy and function were cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full-text peer-reviewed articles in English having female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology as direct or indirect measures, and benefits and limitations of the technique. A total of 1586 articles were identified, of which 50 met the inclusion criteria. Nine methodologies of determining PFM strength were described, including: digital palpation, perineometry, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are allowing for greater sensitivity. However, the most common methods of quantification remain digital palpation and perineometry; techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by an inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.

  12. A refined methodology for modeling volume quantification performance in CT

    Science.gov (United States)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.
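
    The noise-refinement step lends itself to a short sketch: in the difference of two repeated scans the (identical) anatomical background cancels, and the quantum noise is the standard deviation of the difference divided by √2 (Python; the images and noise level are simulated stand-ins, not phantom data):

      import numpy as np

      rng = np.random.default_rng(0)
      # Two "repeated scans": identical textured background plus independent noise.
      anatomy = rng.normal(0.0, 30.0, size=(256, 256))
      scan1 = anatomy + rng.normal(0.0, 12.0, size=anatomy.shape)
      scan2 = anatomy + rng.normal(0.0, 12.0, size=anatomy.shape)

      # Subtraction cancels the background; Var(diff) = 2 * sigma_quantum^2.
      diff = scan1 - scan2
      sigma_quantum = diff.std() / np.sqrt(2.0)
      print(f"estimated quantum noise: {sigma_quantum:.2f} (simulated truth: 12)")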

  13. Costs and benefits of sulphur oxide control: a methodological study

    Energy Technology Data Exchange (ETDEWEB)

    1981-01-01

    The objective is to present for the first time a methodology for estimating the costs and benefits of SOx control strategies as an aid to policy formulation which could create the basis for further action in member countries. To illustrate the methodology, different control scenarios for Western Europe are developed and analyzed using the cost-benefit approach, and some preliminary conclusions are drawn. The next step assesses the impact of the emissions on ambient air quality, calculated with the aid of long-range and urban air quality models. Finally, the impact of the calculated concentrations of SOx in the different scenarios on a number of environmental and human assets - materials, agricultural crops, health, and aquatic ecosystems - is estimated in order to have a measure of the benefits of control.

  14. NASA Electronic Publishing System: Cost/benefit Methodology

    Science.gov (United States)

    Tuey, Richard C.

    1994-01-01

    The NASA Scientific and Technical Information Office was assigned the responsibility to examine the benefits of the utilization of electronic printing and duplicating systems throughout NASA Installations and Headquarters. The subject of this report is the documentation of the methodology used in justifying the acquisition of the most cost-beneficial solution for the printing and duplicating requirements of a duplicating facility that is contemplating the acquisition of an electronic printing and duplicating system. Four alternatives are presented, with each alternative costed out with its associated benefits. The methodology goes a step further than a plain cost-benefit analysis through its comparison of the risks associated with each alternative, the sensitivity to the number of impressions, the productivity gains of the selected alternative and, finally, the return on investment for the selected alternative. The report can be used in conjunction with the two earlier reports, NASA-TM-106242 and TM-106510, in guiding others in determining the cost-effective duplicating alternative.

  15. Quantification methodology for the French 900 MW PWR PRA

    International Nuclear Information System (INIS)

    Ducamp, F.; Lanore, J.M.; Duchemin, B.; De Villeneuve, M.J.

    1985-02-01

    This paper develops some improvements to the classical approach to risk assessment. The calculation of the contribution to the risk of one particular sequence of an event tree is composed of four stages: creation of a fault tree for each system which appears in the event trees, in terms of component faults; simplification of these fault trees into smaller ones, in terms of macrocomponents; creation of one ''super-tree'' by regrouping the fault trees of down systems (systems which fail in the sequence) under an AND gate and calculation of the minimal cut sets of this super-tree, taking into account the up systems (systems that do not fail in the sequence) and peculiarities related to the initiating event if needed; quantification of the minimal cut sets so obtained, taking into account the duration of the scenario depicted by the sequence and the possibilities of repair. Each of these steps is developed in this article.
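
    The last stage can be illustrated with the rare-event approximation, in which the sequence probability is the sum over minimal cut sets of the product of basic-event probabilities (Python sketch; the cut sets, probabilities and initiating-event frequency are invented):

      from math import prod

      # Minimal cut sets of a hypothetical super-tree: each is a set of basic
      # component faults that together fail all down systems of the sequence.
      cut_sets = [
          {"pump_A", "pump_B"},
          {"valve_1", "dg_1"},
          {"pump_A", "valve_2", "dg_2"},
      ]
      p = {"pump_A": 3e-3, "pump_B": 3e-3, "valve_1": 1e-4,
           "valve_2": 1e-4, "dg_1": 2e-2, "dg_2": 2e-2}

      # Rare-event approximation: P(sequence | IE) ~ sum of cut-set products.
      p_seq = sum(prod(p[e] for e in cs) for cs in cut_sets)
      freq = 0.1 * p_seq  # times an assumed initiating-event frequency, per year
      print(f"P(sequence | IE) = {p_seq:.2e}, sequence frequency = {freq:.2e} /yr")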

  16. Quantification of growth benefit of carnivorous plants from prey

    Czech Academy of Sciences Publication Activity Database

    Adamec, Lubomír

    2017-01-01

    Vol. 46, No. 3 (2017), pp. 1-7. ISSN 0190-9215 Institutional support: RVO:67985939 Keywords: mineral cost and benefit * stimulation of roots * growth stimulation Subject RIV: EF - Botanics; OECD field: Plant sciences, botany

  17. Human error probability quantification using fuzzy methodology in nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, Claudio Souza do

    2010-01-01

    This work obtains Human Error Probability (HEP) estimates for operators' actions in response to hypothesized emergency situations at the IEA-R1 research reactor at IPEN. An evaluation of Performance Shaping Factors (PSFs) was also carried out in order to classify them according to their level of influence on the operators' actions and to determine the actual states of these PSFs in the plant. Both the HEP estimation and the PSF evaluation were based on expert evaluation using interviews and questionnaires. The expert group was composed of selected IEA-R1 operators. The representation of the experts' knowledge as linguistic variables and the group evaluation values were obtained through fuzzy logic and fuzzy set theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative for use in Human Reliability Analysis (HRA). (author)
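
    A minimal sketch of the fuzzy aggregation and defuzzification idea (Python; the linguistic scale and panel weights are invented for illustration and are not the IEA-R1 elicitation data):

      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function with support [a, c] and peak at b."""
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      # Hypothetical linguistic terms for HEP, defined on a log10(HEP) axis.
      x = np.linspace(-4.0, -1.0, 601)
      terms = {"low": (-4.0, -3.5, -2.5),
               "medium": (-3.0, -2.5, -2.0),
               "high": (-2.5, -2.0, -1.0)}
      weights = {"low": 0.2, "medium": 0.6, "high": 0.2}  # invented panel opinion

      # Aggregate the weighted memberships and defuzzify by discrete centroid.
      mu = sum(w * tri(x, *terms[t]) for t, w in weights.items())
      hep = 10 ** ((mu * x).sum() / mu.sum())
      print(f"defuzzified HEP ~ {hep:.1e}")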

  18. [Early post-partum discharges: benefits, disadvantages and implementation methodology].

    Science.gov (United States)

    Berkane, N

    2015-02-01

    Early post-partum discharges (EPD) are a hot topic. Already widely practised in many European countries, this procedure was promoted by the government for a decade, requested by representatives of midwife organisations, desired by some patients, but also criticized by the Academy of Medicine. Well organized, and with obligatory monitoring and follow-up, EPD could help manage the shortage of maternity beds and hence increase patient satisfaction. The procedure could even be a way to effectively implement a town-hospital network, which has many other benefits. However, this procedure is not without potential dangers: lower quality of care, financial risks for the department, and especially a significant increase in the workload of hospital staff. The main objective of this paper is to detail the benefits and disadvantages of EPD for maternity units and to propose an organizational basis if EPD is the procedure of choice. A participatory methodology is essential when introducing this procedure, given the significant involvement of the different categories of staff concerned by hospital discharge (administrative, medical and paramedical staff), and to avoid complications when certain specifications are not followed or are misunderstood. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  19. Quantification of flood risk mitigation benefits: A building-scale damage assessment through the RASOR platform.

    Science.gov (United States)

    Arrighi, Chiara; Rossi, Lauro; Trasforini, Eva; Rudari, Roberto; Ferraris, Luca; Brugioni, Marcello; Franceschini, Serena; Castelli, Fabio

    2018-02-01

    Flood risk mitigation usually requires a significant investment of public resources and cost-effectiveness should be ensured. The assessment of the benefits of hydraulic works requires the quantification of (i) flood risk in the absence of measures, (ii) risk in the presence of mitigation works, and (iii) the investments needed to achieve an acceptable residual risk. In this work a building scale is adopted to estimate direct tangible flood losses to several building classes (e.g. residential, industrial, commercial, etc.) and their respective contents, exploiting various sources of public open data in a GIS environment. The impact simulations for assigned flood hazard scenarios are computed through the RASOR platform, which allows for an extensive characterization of the properties and their vulnerability through libraries of stage-damage curves. Recovery and replacement costs are estimated based on insurance data, market values and socio-economic proxies. The methodology is applied to the case study of Florence (Italy), where a system of retention basins upstream of the city is under construction to reduce flood risk. Current flood risk in the study area (70 km²) is about 170 million euros per year, without accounting for people, infrastructure, cultural heritage and vehicles at risk. The monetary investment in the retention basins is paid off in about 5 years. However, the results show that although the hydraulic works are cost-effective, a significant residual risk has to be managed and the achievement of the desired level of acceptable risk would require about 1 billion euros of investment. Copyright © 2017 Elsevier Ltd. All rights reserved.
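
    A back-of-envelope reading of the payback figure quoted above (Python; the investment cost and residual-risk fraction are assumptions for illustration, not the study's data):

      annual_risk_no_measures = 170e6   # euros/year, from the study area estimate
      residual_fraction = 0.1           # assumed risk remaining after the basins
      investment = 800e6                # hypothetical cost of the retention basins

      avoided_per_year = annual_risk_no_measures * (1.0 - residual_fraction)
      print(f"payback = {investment / avoided_per_year:.1f} years")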

  1. Quantification of the detriment and comparison of health risks. Methodological problems

    International Nuclear Information System (INIS)

    Jammet, H.

    1982-01-01

    Some of the methodological problems involved in the quantitative estimate of the health detriment of different energy sources and in risk comparison are described. First, the question of determining the detriment is discussed from the point of view of the distortions introduced in the quantification when dealing with risks for which the amount of information available varies widely. The main criteria applied to classifying types of detriment are then recalled. Finally, the problems involved in comparisons are outlined: spatial and temporal variations in the types of detriment, operation under normal and accident conditions, and the risks to the public and workers. (author)

  2. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdf's), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions for which the accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code physical models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions.
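
    A condensed sketch of the two ingredients, with a Gaussian KDE standing in for the paper's non-parametric estimator and SciPy's Kruskal-Wallis test for the clustering step (Python; the error samples are invented):

      import numpy as np
      from scipy.stats import gaussian_kde, kruskal

      rng = np.random.default_rng(1)
      # Invented "model error" samples (measured minus predicted void fraction)
      # from separate-effect tests in two regions of the physical space.
      err_low_flow = rng.normal(0.02, 0.05, 200)
      err_high_flow = rng.normal(-0.01, 0.03, 200)

      # Kruskal-Wallis: do the two regions share the same error distribution?
      H, p_value = kruskal(err_low_flow, err_high_flow)
      print(f"Kruskal-Wallis H = {H:.1f}, p = {p_value:.3g}")

      # If they differ, estimate a separate error pdf per cluster.
      pdf = gaussian_kde(err_low_flow)
      grid = np.linspace(-0.2, 0.2, 5)
      print("pdf on grid:", np.round(pdf(grid), 2))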

  3. Improved Methodology for Benefit Estimation of Preservation Projects

    Science.gov (United States)

    2018-04-01

    This research report presents an improved process for evaluating the benefits and economic tradeoffs associated with a variety of highway preservation projects. It includes a summary of results from a comprehensive phone survey concerning the use and...

  4. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program

  5. Optimization Of Methodological Support Of Application Tax Benefits In Regions: Practice Of Perm Region

    Directory of Open Access Journals (Sweden)

    Alexandr Ivanovich Tatarkin

    2015-03-01

    In this article, the problem of methodological support for the application of regional tax benefits is reviewed. The method of tax benefit assessment adopted in Perm Region was chosen as the object of analysis because the relatively long period of application of the benefits has allowed a sufficient statistical base to be built. The reliability of the budget, economic, investment and social effectiveness assessments of applying benefits, based on the Method, is investigated, and suggestions for its improvement are formulated.

  6. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Yao, Juan; He, Hua; Glantz, Clifford S.; Booth, Alexander E.

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  7. Quantification of the least limiting water range in an oxisol using two methodological strategies

    Directory of Open Access Journals (Sweden)

    Wagner Henrique Moreira

    2014-12-01

    The least limiting water range (LLWR) has been used as an indicator of soil physical quality as it represents, in a single parameter, the soil physical properties directly linked to plant growth, with the exception of temperature. The usual procedure for obtaining the LLWR involves determination of the water retention curve (WRC) and the soil resistance to penetration curve (SRC) in soil samples with undisturbed structure in the laboratory. Determination of the WRC and SRC using field measurements (in situ) is preferable, but requires appropriate instrumentation. The objective of this study was to determine the LLWR from the data collected for determination of the WRC and SRC in situ using portable electronic instruments, and to compare those determinations with the ones made in the laboratory. Samples were taken from the 0.0-0.1 m layer of a Latossolo Vermelho distrófico (Oxisol). Two methods were used for quantification of the LLWR: the traditional one, with measurements made in soil samples with undisturbed structure; and in situ, with measurements of water content (θ), soil water potential (Ψ), and soil resistance to penetration (SR) through the use of sensors. The in situ measurements of θ, Ψ and SR were taken over a period of four days of soil drying. At the same time, samples with undisturbed structure were collected for determination of bulk density (BD). Due to the limitations of measurement of Ψ by tensiometer, additional determinations of θ were made with a psychrometer (in the laboratory) at the Ψ of -1500 kPa. The results show that it is possible to determine the LLWR by the θ, Ψ and SR measurements using the suggested approach and instrumentation. The quality of fit of the SRC was similar in both strategies. In contrast, the θ and Ψ in situ measurements, associated with those measured with a psychrometer, produced a better WRC description. The estimates of the LLWR were similar in both methodological strategies. The quantification of...
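
    Once the WRC and SRC are fitted, the LLWR itself reduces to a pair of min/max operations between four critical water contents. A sketch with invented curve parameters, using the commonly adopted limits of 10% air-filled porosity and 2 MPa penetration resistance (Python):

      porosity = 0.52     # m3/m3, from bulk density
      theta_fc = 0.38     # water content at field capacity (-10 kPa)
      theta_pwp = 0.22    # water content at permanent wilting point (-1500 kPa)

      def theta_at_sr_limit(sr_limit=2.0, a=0.05, b=-2.4, bd=1.25):
          """Invert a Busscher-type SRC, SR = a * theta**b * bd**2, for the
          water content at which SR reaches sr_limit (MPa). Parameters invented."""
          return (sr_limit / (a * bd ** 2)) ** (1.0 / b)

      upper = min(theta_fc, porosity - 0.10)       # air-filled porosity >= 10%
      lower = max(theta_pwp, theta_at_sr_limit())  # SR <= 2 MPa
      print(f"LLWR = [{lower:.3f}, {upper:.3f}], width = {upper - lower:.3f}")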

  8. Costs without benefits? Methodological issues in assessing costs, benefits and effectiveness of water protection policies. Paper

    Energy Technology Data Exchange (ETDEWEB)

    Walz, R.; Schleich, J.

    2000-07-01

    In the last few years, the conditions for extending environmental policy in general, and policy dealing with the prevention of water pollution in particular, have undergone extensive changes. On the one hand, there has been indisputable and considerable success in preventing water pollution, which has led to less direct pressure for policy action. On the other hand, rising sewage levies and the lower political priority assigned to environmental policy in general (documented, e.g., in public opinion surveys) have led to water pollution control policy facing very different pressures of justification: more efficient use of funds, improved planning processes, proof of the achievable benefit, but also stopping the increase in levies or not hindering economic development; these or similar slogans are the objections brought against water pollution control. Regardless of how unambiguous these terms appear when used as slogans in this way, they become diffuse and unclear when examined more closely. This paper therefore attempts to reveal the reasons for possible misunderstandings and misinterpretations on the one hand and, on the other, to reveal the basic problems and uncertainties which are necessarily linked with an assessment of costs and benefits. In order to do this, three areas are examined: the level of actors and analysis, evaluation methods, and the assessment of costs and benefits. (orig.)

  9. Methodology for quantification of waste generated in Spanish railway construction works

    International Nuclear Information System (INIS)

    Guzmán Báez, Ana de; Villoria Sáez, Paola; Río Merino, Mercedes del; García Navarro, Justo

    2012-01-01

    Highlights: ► Two equations for C and D waste estimation in railway construction works are developed. ► Mixed C and D waste is the most generated category during railway construction works. ► Tunnel construction is essential to quantify the waste generated during the works. ► There is a relationship between C and D waste generated and railway functional units. ► The methodology proposed can be used to obtain new constants for other areas. - Abstract: In the last years, the European Union (EU) has been focused on the reduction of construction and demolition (C and D) waste. Specifically, in 2006, Spain generated roughly 47 million tons of C and D waste, of which only 13.6% was recycled. This situation has led to the drawing up of many regulations on C and D waste during the past years, forcing EU countries to include new measures for waste prevention and recycling. Among these measures is the obligation to quantify the C and D waste expected to be generated during a construction project. However, limited data is available on civil engineering projects. Therefore, the aim of this research study is to improve C and D waste management in railway projects by developing a model for C and D waste quantification. For this purpose, we develop two equations which estimate in advance the amount, both in weight and volume, of the C and D waste likely to be generated in railway construction projects, including the category of C and D waste generated for the entire project.
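
    The abstract does not give the coefficients of the two equations, so the sketch below only illustrates their general form: waste weight and volume modeled as linear functions of the railway functional units (Python; all coefficients and inputs are hypothetical):

      def cd_waste(track_km, tunnel_km, stations):
          """Hypothetical linear waste model per functional unit (not the
          paper's fitted equations)."""
          weight_t = 1200 * track_km + 9500 * tunnel_km + 4000 * stations
          volume_m3 = 800 * track_km + 6500 * tunnel_km + 2600 * stations
          return weight_t, volume_m3

      w, v = cd_waste(track_km=12.0, tunnel_km=3.5, stations=2)
      print(f"estimated C and D waste: {w:,.0f} t, {v:,.0f} m3")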

  10. Application of a Bayesian model for the quantification of the European methodology for qualification of non-destructive testing

    International Nuclear Information System (INIS)

    Gandossi, Luca; Simola, Kaisa; Shepherd, Barrie

    2010-01-01

    The European methodology for qualification of non-destructive testing is a well-established approach adopted by nuclear utilities in many European countries. According to this methodology, qualification is based on a combination of technical justification and practical trials. The methodology is qualitative in nature, and it does not give explicit guidance on how the evidence from the technical justification and results from trials should be weighted. A Bayesian model for the quantification process was presented in a previous paper, proposing a way to combine the 'soft' evidence contained in a technical justification with the 'hard' evidence obtained from practical trials. This paper describes the results of a pilot study in which such a Bayesian model was applied to two realistic Qualification Dossiers by experienced NDT qualification specialists. At the end of the study, recommendations were made and a set of guidelines was developed for the application of the Bayesian model.
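
    One simple way to combine the 'soft' and 'hard' evidence in the spirit described (a sketch, not necessarily the paper's exact model) is a Beta-binomial update, in which the technical justification sets a prior on the probability of detection and the practical trials update it (Python):

      from scipy.stats import beta

      # Prior from the technical justification, expressed as pseudo-counts of
      # detections/misses (invented strength of evidence).
      a0, b0 = 8.0, 2.0
      detected, missed = 18, 1   # invented outcome of practical trials

      posterior = beta(a0 + detected, b0 + missed)
      print(f"posterior mean POD    = {posterior.mean():.3f}")
      print(f"5% lower credible POD = {posterior.ppf(0.05):.3f}")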

  11. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    OpenAIRE

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  12. A proposed approach to backfit decision-making using risk assessment and benefit-cost methodology

    International Nuclear Information System (INIS)

    O'Donnell, E.P.; Raney, T.J.

    1984-01-01

    This paper outlines a proposed approach to backfit decision-making which utilizes quantitative risk assessment techniques, benefit-cost methodology and decision criteria. In general terms, it is structured to provide an objective framework for decision-making aimed at ensuring a positive return on backfit investment while allowing for the inclusion of subjective value judgments by the decision-maker. The distributions of the independent variables are combined to arrive at an overall probability distribution for the benefit-cost ratio. In this way, the decision-maker can explicitly establish the probability, or level of confidence, that a particular backfit will yield benefits in excess of cost. An example is presented demonstrating the application of the methodology to a specific plant backfit. (orig.)
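
    A Monte Carlo sketch of the key step, combining distributions of the independent variables into a distribution for the benefit-cost ratio (Python; the distributions and the dollars-per-person-rem value are invented for illustration):

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      averted_dose = rng.lognormal(np.log(50.0), 0.5, n)   # person-rem
      benefit = averted_dose * 5_000.0                     # assumed $/person-rem
      cost = rng.triangular(150e3, 200e3, 350e3, n)        # assumed backfit cost, $

      bc = benefit / cost
      # Confidence that the backfit yields benefits in excess of cost:
      print(f"P(B/C > 1) = {(bc > 1).mean():.2f}, median B/C = {np.median(bc):.2f}")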

  13. Socially responsible ethnobotanical surveys in the Cape Floristic Region: ethical principles, methodology and quantification of data

    Directory of Open Access Journals (Sweden)

    Ben-Erik Van Wyk

    2012-03-01

    A broad overview of published and unpublished ethnobotanical surveys in the Cape Floristic Region (the traditional home of the San and Khoi communities) shows that the data is incomplete. There is an urgent need to record the rich indigenous knowledge about plants in a systematic and socially responsible manner in order to preserve this cultural and scientific heritage for future generations. Improved methods for quantifying data are introduced, with special reference to the simplicity and benefits of the new Matrix Method. This methodology prevents or reduces the number of false negatives, and also ensures the participation of elderly people who might be immobile. It also makes it possible to compare plant uses in different local communities. This method enables the researcher to quantify the knowledge on plant use that was preserved in a community, and to determine the relative importance of a specific plant in a more objective way. Ethical considerations for such ethnobotanical surveys are discussed through the lens of current ethical codes and international conventions. This is an accessible approach, which can also be used in the life sciences classroom.

  14. A methodology for estimating health benefits of electricity generation using renewable technologies.

    Science.gov (United States)

    Partridge, Ian; Gamkhar, Shama

    2012-02-01

    At Copenhagen, the developed countries agreed to provide up to $100 bn per year to finance climate change mitigation and adaptation by developing countries. Projects aimed at cutting greenhouse gas (GHG) emissions will need to be evaluated against dual criteria: from the viewpoint of the developed countries they must cut emissions of GHGs at reasonable cost, while host countries will assess their contribution to development, or simply their overall economic benefits. Co-benefits of some types of project will also be of interest to host countries: for example, some projects will contribute to reducing air pollution, thus improving the health of the local population. This paper uses a simple damage function methodology to quantify some of the health co-benefits of replacing coal-fired generation with wind or small hydro in China. We estimate the monetary value of these co-benefits and find that it is probably small compared to the added costs. We have not made a full cost-benefit analysis of renewable energy in China as some likely co-benefits are omitted from our calculations. Our results are subject to considerable uncertainty; however, after careful consideration of their likely accuracy and comparisons with other studies, we believe that they provide a good first-cut estimate of co-benefits and are sufficiently robust to stand as a guide for policy makers. In addition to these empirical results, a key contribution made by the paper is to demonstrate a simple and reasonably accurate methodology for health benefits estimation that applies the most recent academic research in the field to the solution of an increasingly important problem. Copyright © 2011 Elsevier Ltd. All rights reserved.
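
    A minimal damage-function chain of the kind described, from displaced generation to monetized health co-benefits (Python; every number is an invented round figure, not the paper's data):

      displaced_coal_mwh = 1.0e6    # coal generation displaced by wind, MWh/yr
      so2_per_mwh = 0.005           # t SO2 emitted per MWh (hypothetical plant)
      conc_per_tonne = 2.0e-6       # ug/m3 PM2.5 per t SO2/yr (hypothetical dispersion)
      population = 50e6             # exposed population
      cr_slope = 5.0e-5             # deaths per person-year per ug/m3 (assumed CRF)
      value_per_death = 1.0e6       # $ per statistical life (assumed)

      d_conc = displaced_coal_mwh * so2_per_mwh * conc_per_tonne
      deaths_avoided = d_conc * population * cr_slope
      print(f"avoided deaths: {deaths_avoided:.0f}/yr, "
            f"co-benefit: ${deaths_avoided * value_per_death:,.0f}/yr")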

  15. A methodology for assessing the market benefits of alternative motor fuels: The Alternative Fuels Trade Model

    Energy Technology Data Exchange (ETDEWEB)

    Leiby, P.N.

    1993-09-01

    This report describes a modeling methodology for examining the prospective economic benefits of displacing motor gasoline use by alternative fuels. The approach is based on the Alternative Fuels Trade Model (AFTM). AFTM development was undertaken by the US Department of Energy (DOE) as part of a longer term study of alternative fuels issues. The AFTM is intended to assist with evaluating how alternative fuels may be promoted effectively, and what the consequences of substantial alternative fuels use might be. Such an evaluation of policies and consequences of an alternative fuels program is being undertaken by DOE as required by Section 502(b) of the Energy Policy Act of 1992. Interest in alternative fuels is based on the prospective economic, environmental and energy security benefits from the substitution of these fuels for conventional transportation fuels. The transportation sector is heavily dependent on oil. Increased oil use implies increased petroleum imports, with much of the increase coming from OPEC countries. Conversely, displacement of gasoline has the potential to reduce US petroleum imports, thereby reducing reliance on OPEC oil and possibly weakening OPEC's ability to extract monopoly profits. The magnitude of US petroleum import reduction, the attendant fuel price changes, and the resulting US benefits, depend upon the nature of oil-gas substitution and the supply and demand behavior of other world regions. The methodology applies an integrated model of fuel market interactions to characterize these effects.

  16. ALARA cost/benefit analysis at Union Electric company using the ARP/AI methodology

    International Nuclear Information System (INIS)

    Williams, M.C.

    1987-01-01

    This paper describes the development of a specific method for justification of expenditures associated with reducing occupational radiation exposure to as low as reasonably achievable (ALARA). This methodology is based on the concepts of the Apparent Reduction Potential (ARP) and Achievability Index (AI) as described in NUREG/CR-0446, Union Electric's corporate planning model, and the EPRI model for dose rate buildup with reactor operating life. The ARP provides a screening test to determine if there is a need for ALARA expenditures based on actual or predicted exposure rates and/or dose experience. The AI is a means of assessing all costs and all benefits, even though they are expressed in different units of measurement such as person-rem and dollars, to determine if ALARA expenditures are justified and their value. This method of cost/benefit analysis can be applied by any company or organization utilizing site-specific exposure and dose rate data, and incorporating consideration of administrative exposure controls which may vary from organization to organization. Specific example cases are presented and compared to other methodologies for ALARA cost/benefit analysis.

  17. Comparison of DNA quantification methodology used in the DNA extraction protocol for the UK Biobank cohort.

    Science.gov (United States)

    Welsh, Samantha; Peakman, Tim; Sheard, Simon; Almond, Rachael

    2017-01-05

    UK Biobank is a large prospective cohort study in the UK established by the Medical Research Council (MRC) and the Wellcome Trust to enable approved researchers to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. A wide range of phenotypic data has been collected at recruitment and has recently been enhanced by the UK Biobank Genotyping Project. All UK Biobank participants (500,000) have been genotyped on either the UK Biobank Axiom® Array or the Affymetrix UK BiLEVE Axiom® Array, and the workflow for preparing samples for genotyping is described. The genetic data is hoped to provide further insight into the genetics of disease. All data, including the genetic data, is available for access to approved researchers. Data for two methods of DNA quantification (ultraviolet-visible spectroscopy [UV/Vis] measured on the Trinean DropSense™ 96, and PicoGreen®) were compared by two laboratories (UK Biobank and Affymetrix). The sample processing workflow established at UK Biobank, for genotyping on the custom Affymetrix Axiom® array, resulted in high quality DNA (average DNA concentration 38.13 ng/μL, average 260/280 absorbance 1.91). The DNA generated high quality genotype data (average call rate 99.48% and pass rate 99.45%). The DNA concentration measured on the Trinean DropSense™ 96 at UK Biobank correlated well with DNA concentration measured by PicoGreen® at Affymetrix (r = 0.85). The UK Biobank Genotyping Project demonstrated that the high throughput DNA extraction protocol described generates high quality DNA suitable for genotyping on the Affymetrix Axiom array. The correlation between DNA concentration derived from UV/Vis and PicoGreen® quantification methods suggests, in large-scale genetic studies involving two laboratories, it may be possible to remove the DNA quantification step in one laboratory without affecting downstream analyses. This would result in...

  18. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    Science.gov (United States)

    Akram, Muhammad Farooq Bin

    The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is very critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that in case of lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, when experts are forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced by using deterministic or traditional probabilistic approaches for...
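
    Dempster's rule of combination, one of the two techniques proposed above, is compact enough to state directly; the sketch combines two invented expert opinions on a technology attribute (Python):

      from itertools import product

      def dempster_combine(m1, m2):
          """Dempster's rule for mass functions over frozenset focal elements;
          mass assigned to conflicting (empty) intersections is renormalized away."""
          combined, conflict = {}, 0.0
          for (a, w1), (b, w2) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + w1 * w2
              else:
                  conflict += w1 * w2
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      LOW, HIGH = frozenset({"low"}), frozenset({"high"})
      EITHER = LOW | HIGH  # ignorance: mass on the whole frame
      expert1 = {LOW: 0.6, EITHER: 0.4}
      expert2 = {HIGH: 0.5, EITHER: 0.5}
      print(dempster_combine(expert1, expert2))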

  19. Standardization of a PIGE methodology for simultaneous quantification of low Z elements in barium borosilicate glass samples

    International Nuclear Information System (INIS)

    Chhillar, S.; Acharya, R.; Dasari, K.B.; Pujari, P.K.; Mishra, R.K.; Kaushik, C.P.

    2013-01-01

    In order to standardize the particle induced gamma-ray emission (PIGE) methodology for simultaneous quantification of light elements, the analytical sensitivities of Li, F, B, Na, Al and Si were evaluated using a 4 MeV proton beam (∼10 nA current) from the 3 MV Pelletron at IOP, Bhubaneswar. The PIGE method was validated by determining all six elements in a synthetic sample in a graphite matrix and was applied to two barium borosilicate glass (BaBSG) samples. The prompt γ-rays emitted from inelastic scattering or nuclear reactions of the corresponding isotopes were measured using a 60% HPGe detector coupled to an MCA, and the current-normalized count rates were used for concentration calculation. (author)

  19. Importance of the lipid peroxidation biomarkers and methodological aspects for malondialdehyde quantification

    Directory of Open Access Journals (Sweden)

    Denise Grotto

    2009-01-01

    Free radicals induce lipid peroxidation, playing an important role in pathological processes. The injury mediated by free radicals can be measured by conjugated dienes, malondialdehyde, 4-hydroxynonenal, and others. However, malondialdehyde has been pointed out as the main product for evaluating lipid peroxidation. Most assays determine malondialdehyde by its reaction with thiobarbituric acid, which can be measured by indirect (spectrometry) or direct (chromatography) methodologies. Though there is some controversy among the methodologies, the selective HPLC-based assays provide a more reliable lipid peroxidation measure. This review describes significant aspects of MDA determination, its importance in pathologies, and the treatment of biological samples.

  1. Methodology for quantification of waste generated in Spanish railway construction works.

    Science.gov (United States)

    de Guzmán Báez, Ana; Villoria Sáez, Paola; del Río Merino, Mercedes; García Navarro, Justo

    2012-05-01

    In the last years, the European Union (EU) has been focused on the reduction of construction and demolition (C&D) waste. Specifically, in 2006, Spain generated roughly 47 million tons of C&D waste, of which only 13.6% was recycled. This situation has led to the drawing up of many regulations on C&D waste during the past years, forcing EU countries to include new measures for waste prevention and recycling. Among these measures is the obligation to quantify the C&D waste expected to be generated during a construction project. However, limited data is available on civil engineering projects. Therefore, the aim of this research study is to improve C&D waste management in railway projects, by developing a model for C&D waste quantification. For this purpose, we develop two equations which estimate in advance the amount, both in weight and volume, of the C&D waste likely to be generated in railway construction projects, including the category of C&D waste generated for the entire project. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    Science.gov (United States)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

    Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from National Institute of Health (R01 EB018302) is greatly appreciated.
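
    A small example of the solver/preconditioner pairing discussed, using GMRES with an incomplete-LU preconditioner (Python with a recent SciPy; the random sparse system is a stand-in for a linearized flow problem):

      import numpy as np
      from scipy.sparse import random as sprandom, eye
      from scipy.sparse.linalg import gmres, spilu, LinearOperator

      n = 500
      # Diagonally dominated random sparse matrix standing in for the linear
      # systems arising in implicit Navier-Stokes time steps.
      A = (sprandom(n, n, density=0.01, random_state=0) + 10.0 * eye(n)).tocsc()
      b = np.random.default_rng(0).standard_normal(n)

      # Wrap an incomplete-LU factorization as a preconditioner for GMRES.
      ilu = spilu(A, drop_tol=1e-4)
      M = LinearOperator((n, n), matvec=ilu.solve)

      x, info = gmres(A, b, M=M, rtol=1e-8)
      print("converged" if info == 0 else f"info={info}",
            "| residual:", np.linalg.norm(A @ x - b))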

  3. Validation of a spectrophotometric methodology for the quantification of polysaccharides from roots of Operculina macrocarpa (jalapa)

    Directory of Open Access Journals (Sweden)

    Marcos A.M. Galvão

    The roots of Operculina macrocarpa (L.) Urb., Convolvulaceae, are widely used in Brazilian traditional medicine as a laxative and purgative. The biological properties of this drug material have been attributed to its polysaccharide content. Thus, the aim of this study was to evaluate the polysaccharide content in drug material from O. macrocarpa by spectrophotometric quantitative analysis. The root was used as plant material and botanical identification was performed by macro- and microscopic analysis. The plant material was used to validate the spectrophotometric procedure at 490 nm for the quantification of the reaction product of the drug polysaccharides with phenol-sulfuric acid solution. The analytical procedure was evaluated for compliance with the necessary legal requirements by determination of the following parameters: specificity, linearity, selectivity, precision, accuracy and robustness. This study provides a simple and valid analytical procedure (linear, precise, accurate and reproducible), which can be satisfactorily used for quality control and standardization of the herbal drug from O. macrocarpa.

  4. Quantification of Benefits and Cost from Applying a Product Configuration System

    DEFF Research Database (Denmark)

    Kristjansdottir, Katrin; Shafiee, Sara; Hvam, Lars

    ...of generating the products' specifications. In addition, the lead-time for generating the products' specifications has been reduced, and indications of improved quality of the products' specifications and of additional sales are identified. The research verifies the benefits described in the current literature...

  5. Methodology for quantification of radionuclides used in therapy by bioanalysis 'in vitro'

    International Nuclear Information System (INIS)

    Juliao, Ligia M.Q.C.; Sousa, Wanderson O.; Mesquita, Sueli A.; Santos, Maristela S.; Oliveira, S.M. Velasques de

    2008-01-01

    In Brazil, the radionuclides used for therapy are 131 I, 153 Sm, 90 Y and 177 Lu, either routinely or experimentally. The quantification of the radiopharmaceutical activity excreted by the patient through bioassay methods can be an important tool for individualized dosimetry, aiming at the planning of subsequent therapies. The In Vitro Bioanalysis Laboratory (LBIOVT) of the Service of Individual Monitoring (SEMIN) of the Institute for Radiation Protection and Dosimetry (IRD/CNEN-RJ), Brazil, has equipment and procedures for gamma and beta spectrometry. These detection systems are calibrated in energy and efficiency using standard reference sources provided by the National Laboratory of Metrology of Ionizing Radiation (LMNRI/IRD/CNEN-RJ). The LBIOVT quality system follows the guidelines of the ISO-ABNT-17025 standard, and the laboratory participates annually in national (PNI) and international (PROCORAD) intercomparison programs. With respect to the excreta samples from patients, these are collected immediately after administration of the radiopharmaceutical. During the first 24 hours they are collected with the patient hospitalized and, depending upon the physical half-life of the radionuclide, they can also be collected at the patient's home. Both in hospital and at home, the excreta are handled, stored and transported in accordance with standards for clinical research, radiation protection and transport of radioactive and biological materials. The radionuclide specific activity is referenced to the date and time of collection, allowing further evaluation of the individual biological half-life. Care with the recording of excreted volumes, as well as possible losses of excreta during collection, is essential, since either may interfere with the interpretation of the measurements, given that the results are provided as specific activity (Bq/L). Regarding the bioassay laboratory, these results are reliable when the laboratory is certified and participates in intercomparison programs of measurements and methods. The laboratory
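
    Referencing the measured specific activity to the date and time of collection amounts to a radioactive decay correction; a minimal sketch follows, with illustrative activities and textbook half-lives.

```python
# Minimal sketch: referencing a measured specific activity (Bq/L) back to
# the date/time of sample collection (illustrative values).
import math
from datetime import datetime

HALF_LIFE_DAYS = {"Lu-177": 6.65, "Sm-153": 1.93, "Y-90": 2.67, "I-131": 8.02}

def decay_correct(a_measured_bq_per_l, nuclide, collected, measured):
    """Correct a measured specific activity back to collection time."""
    dt_days = (measured - collected).total_seconds() / 86400.0
    lam = math.log(2) / HALF_LIFE_DAYS[nuclide]
    return a_measured_bq_per_l * math.exp(lam * dt_days)

a0 = decay_correct(1.2e5, "Lu-177",
                   collected=datetime(2008, 3, 10, 9, 0),
                   measured=datetime(2008, 3, 14, 15, 30))
print(f"specific activity at collection: {a0:.3e} Bq/L")
```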

  6. Quantification and Management of Manifest Occlusal Caries Lesions in Adults: A Methodological and a Clinical Study

    DEFF Research Database (Denmark)

    Bakhshandeh, Azam

    2010-01-01

    teeth with primary occlusal lesions. Randomization was performed in cases of more than one lesion in the same patient, so that the final material consisted of 60 resin-sealed and 12 restored lesions. After 2-3 years, there was a drop-out of 15%; 2 patients did not show up for the control and 9...... extension of the lesions from baseline and the last control radiograph, caries progression was scored beneath 5 (10%) of 49 sealants, caries regression beneath 1 (2%) sealant, and unchanged lesion depth beneath 43 (88%) sealants and all restorations (p = 0.64). The methodological study included 110...

  7. Experience and benefits from using the EPRI MOV Performance Prediction Methodology in nuclear power plants

    International Nuclear Information System (INIS)

    Walker, T.; Damerell, P.S.

    1999-01-01

    The EPRI MOV Performance Prediction Methodology (PPM) is an effective tool for evaluating design basis thrust and torque requirements for MOVs. Use of the PPM has become more widespread in US nuclear power plants as they close out their Generic Letter (GL) 89-10 programs and address MOV periodic verification per GL 96-05. The PPM has also been used at plants outside the US, many of which are implementing programs similar to US plants' GL 89-10 programs. The USNRC Safety Evaluation of the PPM and the USNRC's discussion of the PPM in GL 96-05 make the PPM an attractive alternative to differential pressure (DP) testing, which can be costly and time-consuming. Significant experience and benefits, which are summarized in this paper, have been gained using the PPM. Although use of the PPM requires a commitment of resources, the benefits of a solidly justified approach and a reduced need for DP testing provide a substantial safety and economic benefit. (author)

  8. A systematic methodology for the robust quantification of energy efficiency at wastewater treatment plants featuring Data Envelopment Analysis.

    Science.gov (United States)

    Longo, S; Hospido, A; Lema, J M; Mauricio-Iglesias, M

    2018-05-10

    This article examines the potential benefits of using Data Envelopment Analysis (DEA) for conducting energy-efficiency assessments of wastewater treatment plants (WWTPs). WWTPs are characteristically heterogeneous (in size, technology, climate, function, ...), which limits the correct application of DEA. This paper proposes and describes the Robust Energy Efficiency DEA (REED) in its various stages, a systematic state-of-the-art methodology aimed at including exogenous variables in nonparametric frontier models and especially designed for WWTP operation. In particular, the methodology systematizes the modelling process by presenting an integrated framework for selecting the correct variables and appropriate models, possibly tackling the effect of exogenous factors. As a result, the application of REED improves the quality of the efficiency estimates and hence the significance of benchmarking. For the reader's convenience, the article is presented as a step-by-step guideline that takes the user through the determination of WWTP energy efficiency from beginning to end. The application and benefits of the developed methodology are demonstrated by a case study comparing the energy efficiency of a set of 399 WWTPs operating in different countries and under heterogeneous environmental conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
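
    As an illustration of the DEA building block that REED systematizes, the sketch below solves the classical input-oriented CCR model in multiplier form, one linear program per plant; the toy single-input/single-output data are assumptions, not the study's 399-plant dataset.

```python
# Minimal sketch of an input-oriented CCR DEA model in multiplier form,
# solved per plant with an LP. Toy data (energy use as input, treated load
# as output); this is NOT the REED methodology itself.
import numpy as np
from scipy.optimize import linprog

X = np.array([[120.0], [150.0], [90.0], [200.0]])      # inputs: kWh/day
Y = np.array([[1000.0], [1100.0], [800.0], [1500.0]])  # outputs: kg COD removed

n, m = X.shape          # plants, inputs
_, s = Y.shape          # outputs

for o in range(n):
    # variables z = [u (output weights, s), v (input weights, m)]
    c = np.concatenate([-Y[o], np.zeros(m)])          # maximize u'y_o
    A_ub = np.hstack([Y, -X])                         # u'y_j - v'x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # normalization v'x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    print(f"plant {o}: efficiency = {-res.fun:.3f}")  # 1.0 means frontier plant
```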

  9. Methodology to estimate the cost of the severe accidents risk / maximum benefit

    International Nuclear Information System (INIS)

    Mendoza, G.; Flores, R. M.; Vega, E.

    2016-09-01

    For programs and activities to manage aging effects, any changes to plant operations, inspections, maintenance activities, systems and administrative control procedures during the renewal period that are required by 10 CFR Part 54 to manage the effects of aging and that could impact the environment should be characterized. Environmental impacts significantly different from those described in the final environmental statement for the current operating license should be described in detail. To comply with the requirements of a license renewal application, the Severe Accident Mitigation Alternatives (SAMA) analysis is contained in a supplement to the environmental report of the plant that meets the requirements of 10 CFR Part 51. In this paper, the methodology for estimating the cost of severe accident risk is established and discussed. It is then used to identify and select alternatives for severe accident mitigation, which are analyzed to estimate the maximum benefit an alternative could achieve if it eliminated all risk. The cost of severe accident risk is estimated using the regulatory analysis techniques of the US Nuclear Regulatory Commission (NRC). The ultimate goal of implementing the methodology is to identify SAMA candidates that have the potential to reduce severe accident risk and to determine whether the implementation of each candidate is cost-effective. (Author)
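
    A minimal sketch of the "maximum benefit" calculation: the present value of the entire severe accident cost-risk over the remaining licensed life, which is what a SAMA would avert if it eliminated all risk. Frequencies, consequence costs and plant life are invented; the discount rate echoes common NRC regulatory analysis practice.

```python
# Minimal sketch of the maximum averted cost-risk. All frequencies and
# consequence costs are illustrative assumptions, not plant-specific values.
annual_risk = [
    # (accident frequency per reactor-year, consequence cost in USD)
    (2.0e-6, 4.0e9),   # large early release (assumed)
    (1.5e-5, 8.0e8),   # late containment failure (assumed)
]
discount_rate = 0.07   # a rate commonly used in NRC regulatory analyses
years_remaining = 20   # remaining licensed life (assumed)

expected_annual_cost = sum(f * c for f, c in annual_risk)
max_benefit = sum(expected_annual_cost / (1 + discount_rate) ** t
                  for t in range(1, years_remaining + 1))
print(f"expected annual cost-risk: ${expected_annual_cost:,.0f}")
print(f"maximum averted cost-risk: ${max_benefit:,.0f}")
# A candidate SAMA is potentially cost-effective only if its implementation
# cost falls below the benefit it can actually avert (a fraction of max_benefit).
```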

  10. A High-Performance Embedded Hybrid Methodology for Uncertainty Quantification With Applications

    Energy Technology Data Exchange (ETDEWEB)

    Iaccarino, Gianluca

    2014-04-01

    Multiphysics processes modeled by a system of unsteady differential equations are naturally suited for partitioned (modular) solution strategies. We consider such a model where probabilistic uncertainties are present in each module of the system and represented as a set of random input parameters. A straightforward approach to quantifying uncertainties in the predicted solution would be to sample all the input parameters into a single set, and treat the full system as a black box. Although this method is easily parallelizable and requires minimal modifications to the deterministic solver, it is blind to the modular structure of the underlying multiphysical model. On the other hand, using spectral representations such as polynomial chaos expansions (PCE) can provide richer structural information regarding the dynamics of these uncertainties as they propagate from the inputs to the predicted output, but can be prohibitively expensive to implement in the high-dimensional global space of uncertain parameters. Therefore, we investigated hybrid methodologies wherein each module has the flexibility of using sampling- or PCE-based methods of capturing local uncertainties while maintaining accuracy in the global uncertainty analysis. For the latter case, we use a conditional PCE model which mitigates the curse of dimension associated with intrusive Galerkin or semi-intrusive pseudospectral methods. After formalizing the theoretical framework, we demonstrate our proposed method using a numerical viscous flow simulation and benchmark the performance against a solely Monte Carlo method and a solely spectral method.
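
    A one-dimensional toy comparison of the two ingredients being hybridized, sampling and spectral projection, may help fix ideas; the sketch below builds a pseudospectral PCE of f(x) = exp(x) for a standard normal input via Gauss-Hermite quadrature and benchmarks it against plain Monte Carlo. It is illustrative only and is not the report's conditional PCE model.

```python
# Minimal 1-D illustration of the sampling-vs-spectral trade-off: a
# pseudospectral PCE of f(x) = exp(x), x ~ N(0,1), versus Monte Carlo.
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi, e

f = np.exp
order = 8

# Gauss-Hermite(e) nodes/weights integrate against exp(-x^2/2); dividing the
# weights by sqrt(2*pi) turns sums into expectations under N(0,1).
x, w = He.hermegauss(order + 1)
w = w / sqrt(2 * pi)

# Pseudospectral projection: c_k = E[f(X) He_k(X)] / k!  (probabilists' Hermite)
coeffs = [np.sum(w * f(x) * He.hermeval(x, [0] * k + [1])) / factorial(k)
          for k in range(order + 1)]
pce_mean = coeffs[0]
pce_var = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)

rng = np.random.default_rng(1)
mc = f(rng.standard_normal(10_000))

print(f"exact    mean {sqrt(e):.5f}  var {e*e - e:.5f}")
print(f"PCE      mean {pce_mean:.5f}  var {pce_var:.5f}")
print(f"MC(10k)  mean {mc.mean():.5f}  var {mc.var(ddof=1):.5f}")
```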

  11. A methodology for direct quantification of over-ranging length in helical computed tomography with real-time dosimetry.

    Science.gov (United States)

    Tien, Christopher J; Winslow, James F; Hintenlang, David E

    2011-01-31

    In helical computed tomography (CT), reconstruction information from volumes adjacent to the clinical volume of interest (VOI) is required for proper reconstruction. Previous studies have relied upon either operator console readings or indirect extrapolation of measurements in order to determine the over-ranging length of a scan. This paper presents a methodology for the direct quantification of over-ranging dose contributions using real-time dosimetry. A Siemens SOMATOM Sensation 16 multislice helical CT scanner is used with a novel real-time "point" fiber-optic dosimeter system with 10 ms temporal resolution to measure over-ranging length, which is also expressed as dose-length product (DLP). Film was used to benchmark the exact length of over-ranging. Over-ranging length varied from 4.38 cm at a pitch of 0.5 to 6.72 cm at a pitch of 1.5, which corresponds to DLPs of 131 to 202 mGy-cm. The dose-extrapolation method of Van der Molen et al. yielded results within 3%, while the console reading method of Tzedakis et al. yielded consistently larger over-ranging lengths. From film measurements, it was determined that Tzedakis et al. overestimated over-ranging lengths by one-half of the beam collimation width. Over-ranging length measured as a function of reconstruction slice thickness produced two linear regions similar to previous publications. Over-ranging is quantified in terms of both absolute length and DLP, contributing about 60 mGy-cm or about 10% of the DLP for a routine abdominal scan. This paper presents a direct physical measurement of over-ranging length within 10% of previous methodologies. Current uncertainties are less than 1%, in comparison with 5% in other methodologies. Clinical implementation can be simplified by using only one dosimeter if codependence with console readings is acceptable, with an uncertainty of 1.1%. This methodology will be applied to different vendors, models, and postprocessing methods--which have been shown to produce over-ranging lengths
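
    A minimal sketch of the direct-measurement idea: detect the beam-on interval in a time-resolved dosimeter trace, convert it to irradiated length through the table speed, and subtract the planned scan length; the trace and scanner parameters are synthetic assumptions, not the paper's measurements.

```python
# Minimal sketch: over-ranging length from a time-resolved point-dosimeter
# trace. All trace values and scanner parameters are invented.
import numpy as np

dt = 0.010                       # 10 ms sampling, as in the paper
t = np.arange(0, 12, dt)
signal = np.where((t > 2.0) & (t < 9.4), 1.0, 0.02)  # synthetic dose-rate trace

beam_on = signal > 0.5 * signal.max()     # simple threshold detection
exposure_time = beam_on.sum() * dt        # seconds of beam-on time

pitch, collimation, rotation_time = 1.0, 2.4, 0.5    # cm and s (assumed)
table_speed = pitch * collimation / rotation_time    # cm/s

irradiated_length = exposure_time * table_speed
planned_length = 30.0                                # cm (assumed)
over_ranging = irradiated_length - planned_length

ctdi_vol = 12.0                                      # mGy (assumed)
print(f"over-ranging = {over_ranging:.2f} cm, "
      f"extra DLP = {ctdi_vol * over_ranging:.1f} mGy-cm")
```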

  12. In vivo quantification of lead in bone with a portable x-ray fluorescence system--methodology and feasibility.

    Science.gov (United States)

    Nie, L H; Sanchez, S; Newton, K; Grodzins, L; Cleveland, R O; Weisskopf, M G

    2011-02-07

    This study was conducted to investigate the methodology and feasibility of developing a portable x-ray fluorescence (XRF) technology to quantify lead (Pb) in bone in vivo. A portable XRF device was set up and optimal settings of voltage, current, and filter combination for bone lead quantification were selected to achieve the lowest detection limit. The minimum radiation dose delivered to the subject was calculated by Monte Carlo simulations. An ultrasound device was used to measure soft tissue thickness to account for signal attenuation, and an alternative method to obtain soft tissue thickness from the XRF spectrum was developed and shown to be equivalent to the ultrasound measurements (intraclass correlation coefficient, ICC = 0.82). We tested the correlation of in vivo bone lead concentrations between the standard KXRF technology and the portable XRF technology. There was a significant correlation between the bone lead concentrations obtained from the standard KXRF technology and those obtained from the portable XRF technology (ICC = 0.65). The detection limit for the portable XRF device was about 8.4 ppm with 2 mm soft tissue thickness. The entrance skin dose delivered to the human subject was about 13 mSv and the total body effective dose was about 1.5 µSv and should pose minimal radiation risk. In conclusion, portable XRF technology can be used for in vivo bone lead measurement with sensitivity comparable to the KXRF technology and good correlation with KXRF measurements.
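
    The agreement statistic reported above can be computed as follows; the sketch uses the Shrout-Fleiss ICC(2,1) form (two-way random effects, absolute agreement, single measurement) on invented paired bone-lead values, and the paper's exact ICC variant may differ.

```python
# Minimal sketch: ICC(2,1) between paired measurements from two devices.
# The data are invented for illustration.
import numpy as np

kxrf     = np.array([12.1,  5.3, 20.4,  8.8, 15.0,  3.2, 25.7, 10.9])
portable = np.array([10.8,  6.1, 18.9,  9.9, 13.2,  4.5, 22.4, 12.3])

X = np.column_stack([kxrf, portable])     # n subjects x k raters
n, k = X.shape
grand = X.mean()
row_means, col_means = X.mean(axis=1), X.mean(axis=0)

msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between-subject
msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between-device
sse = np.sum((X - row_means[:, None] - col_means[None, :] + grand) ** 2)
mse = sse / ((n - 1) * (k - 1))                        # residual

icc21 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
print(f"ICC(2,1) = {icc21:.2f}")
```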

  13. In Vivo Quantification of Lead in Bone with a Portable X-ray Fluorescence (XRF) System – Methodology and Feasibility

    Science.gov (United States)

    Nie, LH; Sanchez, S; Newton, K; Grodzins, L; Cleveland, RO; Weisskopf, MG

    2013-01-01

    This study was conducted to investigate the methodology and feasibility of developing a portable XRF technology to quantify lead (Pb) in bone in vivo. A portable XRF device was set up and optimal setting of voltage, current, and filter combination for bone lead quantification were selected to achieve the lowest detection limit. The minimum radiation dose delivered to the subject was calculated by Monte Carlo simulations. An ultrasound device was used to measure soft tissue thickness to account for signal attenuation, and an alternative method to obtain soft tissue thickness from the XRF spectrum was developed and shown to be equivalent to the ultrasound measurements (Intraclass Correlation Coefficient, ICC=0.82). We tested the correlation of in vivo bone lead concentrations between the standard KXRF technology and the portable XRF technology. There was a significant correlation between the bone lead concentrations obtained from the standard KXRF technology and those obtained from the portable XRF technology (ICC=0.65). The detection limit for the portable XRF device was about 8.4 ppm with 2 mm soft tissue thickness. The entrance skin dose delivered to the human subject was about 13 mSv and the total body effective dose was about 1.5 μSv and should pose a minimal radiation risk. In conclusion, portable XRF technology can be used for in vivo bone lead measurement with sensitivity comparable to the KXRF technology and good correlation with KXRF measurements. PMID:21242629

  14. In vivo quantification of lead in bone with a portable x-ray fluorescence system-methodology and feasibility

    International Nuclear Information System (INIS)

    Nie, L H; Sanchez, S; Newton, K; Weisskopf, M G; Grodzins, L; Cleveland, R O

    2011-01-01

    This study was conducted to investigate the methodology and feasibility of developing a portable x-ray fluorescence (XRF) technology to quantify lead (Pb) in bone in vivo. A portable XRF device was set up and optimal settings of voltage, current, and filter combination for bone lead quantification were selected to achieve the lowest detection limit. The minimum radiation dose delivered to the subject was calculated by Monte Carlo simulations. An ultrasound device was used to measure soft tissue thickness to account for signal attenuation, and an alternative method to obtain soft tissue thickness from the XRF spectrum was developed and shown to be equivalent to the ultrasound measurements (intraclass correlation coefficient, ICC = 0.82). We tested the correlation of in vivo bone lead concentrations between the standard KXRF technology and the portable XRF technology. There was a significant correlation between the bone lead concentrations obtained from the standard KXRF technology and those obtained from the portable XRF technology (ICC = 0.65). The detection limit for the portable XRF device was about 8.4 ppm with 2 mm soft tissue thickness. The entrance skin dose delivered to the human subject was about 13 mSv and the total body effective dose was about 1.5 μSv and should pose minimal radiation risk. In conclusion, portable XRF technology can be used for in vivo bone lead measurement with sensitivity comparable to the KXRF technology and good correlation with KXRF measurements. (note)

  15. Quantification of human motion: gait analysis-benefits and limitations to its application to clinical problems.

    Science.gov (United States)

    Simon, Sheldon R

    2004-12-01

    The technology supporting the analysis of human motion has advanced dramatically. Past decades of locomotion research have provided us with significant knowledge about the accuracy of tests performed, the understanding of the process of human locomotion, and how clinical testing can be used to evaluate medical disorders and affect their treatment. Gait analysis is now recognized as clinically useful and financially reimbursable for some medical conditions. Yet, the routine clinical use of gait analysis has seen very limited growth. The issue of its clinical value is related to many factors, including the applicability of existing technology to addressing clinical problems; the limited use of such tests to address a wide variety of medical disorders; the manner in which gait laboratories are organized, tests are performed, and reports generated; and the clinical understanding and expectations of laboratory results. Clinical use is most hampered by the length of time and costs required for performing a study and interpreting it. A "gait" report is lengthy, its data are not well understood, and it includes a clinical interpretation, all of which do not occur with other clinical tests. Current biotechnology research is seeking to address these problems by creating techniques to capture data rapidly, accurately, and efficiently, and to interpret such data by an assortment of modeling, statistical, wave interpretation, and artificial intelligence methodologies. The success of such efforts rests on both our technical abilities and communication between engineers and clinicians.

  16. The need for spatially explicit quantification of benefits in invasive-species management.

    Science.gov (United States)

    Januchowski-Hartley, Stephanie R; Adams, Vanessa M; Hermoso, Virgilio

    2018-04-01

    Worldwide, invasive species are a leading driver of environmental change across terrestrial, marine, and freshwater environments and cost billions of dollars annually in ecological damages and economic losses. Resources limit invasive-species control, and planning processes are needed to identify cost-effective solutions. Thus, studies are increasingly considering spatially variable natural and socioeconomic assets (e.g., species persistence, recreational fishing) when planning the allocation of actions for invasive-species management. There is a need to improve understanding of how such assets are considered in invasive-species management. We reviewed over 1600 studies focused on management of invasive species, including flora and fauna. Eighty-four of these studies were included in our final analysis because they focused on the prioritization of actions for invasive species management. Forty-five percent (n = 38) of these studies were based on spatial optimization methods, and 35% (n = 13) accounted for spatially variable assets. Across all 84 optimization studies considered, 27% (n = 23) explicitly accounted for spatially variable assets. Based on our findings, we further explored the potential costs and benefits to invasive species management when spatially variable assets are explicitly considered or not. To include spatially variable assets in decision-making processes that guide invasive-species management there is a need to quantify environmental responses to invasive species and to enhance understanding of potential impacts of invasive species on different natural or socioeconomic assets. We suggest these gaps could be filled by systematic reviews, quantifying invasive species impacts on native species at different periods, and broadening sources and enhancing sharing of knowledge. © 2017 Society for Conservation Biology.

  17. Absolute quantification of olive oil DNA by droplet digital-PCR (ddPCR): Comparison of isolation and amplification methodologies.

    Science.gov (United States)

    Scollo, Francesco; Egea, Leticia A; Gentile, Alessandra; La Malfa, Stefano; Dorado, Gabriel; Hernandez, Pilar

    2016-12-15

    Olive oil is considered a premium product for its nutritional value and health benefits, and the ability to define its origin and varietal composition is a key step towards ensuring the traceability of the product. However, isolating the DNA from such a matrix is a difficult task. In this study, the quality and quantity of olive oil DNA, isolated using four different DNA isolation protocols, was evaluated using the qRT-PCR and ddPCR techniques. The results indicate that CTAB-based extraction methods were the best for unfiltered oil, while Nucleo Spin-based extraction protocols showed greater overall reproducibility. The use of both qRT-PCR and ddPCR led to the absolute quantification of the DNA copy number. The results clearly demonstrate the importance of the choice of DNA-isolation protocol, which should take into consideration the qualitative aspects of DNA and the evaluation of the amplified DNA copy number. Copyright © 2016 Elsevier Ltd. All rights reserved.
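
    Absolute quantification in ddPCR rests on the Poisson statistics of droplet partitioning; a minimal sketch with invented droplet counts and a typical droplet volume follows.

```python
# Minimal sketch of the Poisson correction behind absolute quantification in
# ddPCR: the fraction of negative droplets gives copies per droplet, hence
# copies per microliter. Droplet counts and volume are illustrative.
import math

total_droplets = 15000
positive_droplets = 4200
droplet_volume_ul = 0.85e-3          # ~0.85 nL per droplet (typical ddPCR)

p_negative = (total_droplets - positive_droplets) / total_droplets
lam = -math.log(p_negative)          # mean copies per droplet (Poisson)
copies_per_ul = lam / droplet_volume_ul

print(f"{lam:.3f} copies/droplet -> {copies_per_ul:,.0f} copies/uL of reaction")
```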

  18. How Much Can Non-industry Standard Measurement Methodologies Benefit Methane Reduction Programs?

    Science.gov (United States)

    Risk, D. A.; O'Connell, L.; Atherton, E.

    2017-12-01

    In recent years, energy sector methane emissions have been recorded in large part by applying modern non-industry-standard techniques. Industry may lack the regulatory flexibility to use such techniques, or in some cases may not understand the possible associated economic advantage. As progressive jurisdictions move from estimation towards routine measurement, the research community should provide guidance to help regulators and companies measure more effectively, and economically if possible. In this study, we outline a modelling experiment in which we explore the integration of non-industry-standard measurement techniques as part of a generalized compliance measurement program. The study was not intended to be exhaustive, or to recommend particular combinations, but instead to explore the inter-relationships between methodologies, development type, and compliance practice. We first defined the role, applicable scale, detection limits, working distances, and approximate deployment cost of several measurement methodologies. We then considered a variety of development types differing mainly in footprint, density, and emissions "profile". Using a Monte Carlo approach, we evaluated the effect of these various factors on the cost and confidence of the compliance measurement program. We found that, when added individually, some of the research techniques were indeed able to deliver an improvement in cost and/or confidence when used alongside industry-standard Optical Gas Imaging. When applied in combination, the ideal fraction of each measurement technique depended on development type, emission profile, and whether confidence or cost was more important. Results suggest that measurement cost and confidence could be improved if energy companies exploited a wider range of measurement techniques, and in a manner tailored to each development. In the short-term, combining clear scientific guidance with economic information could benefit immediate mitigation efforts over

  19. Methodology and applications for the benefit cost analysis of the seismic risk reduction in building portfolios at broadscale

    OpenAIRE

    Valcarcel, Jairo A.; Mora, Miguel G.; Cardona, Omar D.; Pujades, Lluis G.; Barbat, Alex H.; Bernal, Gabriel A.

    2013-01-01

    This article presents a methodology for estimating the benefit-cost ratio of seismic risk reduction in building portfolios at broad scale, for a world region, allowing comparison of the results obtained for the countries belonging to that region. The methodology encompasses (1) the generation of a set of random seismic events and the evaluation of the spectral accelerations at the building locations; (2) the estimation of the buildings' built area, the economic value, as well as the cla...
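
    A minimal sketch of the benefit-cost ratio at the core of such an analysis: the benefit is the discounted stream of reductions in expected annual loss achieved by retrofit; all figures are invented, not the article's regional results.

```python
# Minimal sketch of a seismic retrofit benefit-cost ratio. All values are
# illustrative assumptions.
eal_before = 2.0e6     # expected annual loss without retrofit, USD/yr (assumed)
eal_after = 0.6e6      # expected annual loss after retrofit, USD/yr (assumed)
retrofit_cost = 12.0e6 # up-front retrofit cost, USD (assumed)
horizon_years = 50
discount_rate = 0.05

benefit = sum((eal_before - eal_after) / (1 + discount_rate) ** t
              for t in range(1, horizon_years + 1))
bcr = benefit / retrofit_cost
print(f"PV of avoided losses = ${benefit:,.0f}; BCR = {bcr:.2f}")
# BCR > 1 indicates the intervention pays for itself in expected terms.
```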

  20. Cost analysis and ecological benefits of environmental recovery methodologies in bauxite mining

    Directory of Open Access Journals (Sweden)

    João Carlos Costa Guimarães

    2013-03-01

    This work analyzed and compared three methods of environmental recovery in bauxite mining commonly used in the Poços de Caldas Plateau, MG, by means of recovery costs and ecological benefits. Earnings and costs data of environmental recovery activities were obtained for areas that belonged to the Companhia Geral de Minas – CGM, on properties sited in the city of Poços de Caldas, MG. The costs of these activities were used to compare the recovery methods by updating them monetarily to a reference date, namely the present moment. It is concluded that the difference between the present value of costs for simple restoration and rehabilitation activities is less than 1%, and that between complete restoration and rehabilitation is about 15.12%, suggesting that the choice of the method to be used must be based on the ecological benefits associated with each of them. The environmental restoration methodology for the mining areas emphasizes ecological variables in the process of establishment of the community, to the detriment of more complex ecological aspects, which are difficult to measure at the current stage of development of the ecosystem considered.

  1. The Optimal Time for Claiming Social Security Benefits: A Methodological Note

    OpenAIRE

    Joseph Friedman

    2014-01-01

    The optimal age for initiating Social Security benefits and the initiation versus postponement of benefits decision are the subjects of a number of recent papers. It is generally agreed that an initiation versus postponement of benefits decision may have significant consequences, but there is less agreement about how to model the problem or measure its financial implications. By law benefits are paid only to live beneficiaries. Thus, the anticipated future benefits should be weighted by the r...
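
    The survival-weighting idea in the note can be sketched as an expected-present-value comparison between claiming ages; the benefit amounts, discount rate and survival curve below are crude illustrative assumptions, not actuarial values.

```python
# Minimal sketch: expected present value of benefits for two claiming ages,
# weighted by survival probability. All inputs are illustrative assumptions.
def epv(claim_age, monthly_benefit, p_survive, r=0.03, max_age=100):
    """Sum of annual benefits, survival-weighted and discounted to age 62."""
    total = 0.0
    for age in range(claim_age, max_age + 1):
        t = age - 62
        total += 12 * monthly_benefit * p_survive(age) / (1 + r) ** t
    return total

# Crude survival curve from age 62 (assumed flat 2%/yr mortality)
p = lambda age: 0.98 ** (age - 62)

early = epv(62, 1500.0, p)    # claim at 62: smaller benefit, longer stream
late = epv(70, 2640.0, p)     # claim at 70: larger benefit (assumed ~76% more)
print(f"EPV claim@62: ${early:,.0f}   EPV claim@70: ${late:,.0f}")
```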

  2. A benefit-cost methodology for developing environmental standards for uranium mill tailings disposal

    International Nuclear Information System (INIS)

    Leiter, A.J.

    1982-01-01

    This paper describes a method for using benefit-cost analysis in developing generally applicable environmental standards for uranium mill tailings disposal. Several disposal alternatives were selected which consist of different combinations of control measures. The resulting cost and benefit estimates allow the calculation of the incremental cost of obtaining incremental benefits of radiation protection. The overall benefit of a disposal alternative is expressed in terms of an index based on weighting factors assigned to individual benefits. The results show that some disposal alternatives have higher costs while providing no additional benefit relative to other alternatives. These alternatives should be eliminated from consideration in developing standards.
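
    A minimal sketch of the screening logic described above: score each alternative with a weighted benefit index, eliminate dominated alternatives (higher cost, no added benefit), and report the incremental cost per unit of incremental benefit. Weights, scores and costs are invented for illustration.

```python
# Minimal sketch: weighted benefit index, dominance screening, and
# incremental cost-effectiveness. All numbers are illustrative.
weights = {"groundwater": 0.4, "radon": 0.35, "longevity": 0.25}
alternatives = {                    # benefit scores per criterion, cost (M$)
    "A (thin cover)":  ({"groundwater": 2, "radon": 1, "longevity": 1},  5.0),
    "B (thick cover)": ({"groundwater": 3, "radon": 3, "longevity": 2}, 12.0),
    "C (liner+cover)": ({"groundwater": 3, "radon": 3, "longevity": 2}, 15.0),
}

scored = sorted(
    ((sum(weights[k] * b[k] for k in weights), cost, name)
     for name, (b, cost) in alternatives.items()),
    key=lambda x: x[1])             # order by increasing cost

best_so_far = -1.0
prev = None
for index, cost, name in scored:
    if index <= best_so_far:        # costs more, no added benefit: dominated
        print(f"{name}: dominated, eliminate")
        continue
    if prev:
        d_cost, d_benefit = cost - prev[1], index - prev[0]
        print(f"{name}: index {index:.2f}, incremental cost/benefit "
              f"= {d_cost / d_benefit:.1f} M$/unit")
    else:
        print(f"{name}: index {index:.2f}, baseline")
    best_so_far, prev = index, (index, cost)
```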

  3. Validation and evaluation of an HPLC methodology for the quantification of the potent antimitotic compound (+)-discodermolide in the Caribbean marine sponge Discodermia dissoluta.

    Science.gov (United States)

    Valderrama, Katherine; Castellanos, Leonardo; Zea, Sven

    2010-08-01

    The sponge Discodermia dissoluta is the source of the potent antimitotic compound (+)-discodermolide. The relatively abundant and shallow populations of this sponge in Santa Marta, Colombia, allow for studies to evaluate the natural and biotechnological supply options for (+)-discodermolide. In this work, an RP-HPLC-UV methodology for the quantification of (+)-discodermolide in sponge samples was tested and validated. Our protocol for extracting this compound from the sponge included lyophilization, exhaustive methanol extraction, partitioning using water and dichloromethane, purification of the organic fraction on RP-18 cartridges and finally retrieving the (+)-discodermolide in the methanol-water (80:20 v/v) fraction. This fraction was injected into an HPLC system with an Xterra RP-18 column and a detection wavelength of 235 nm. The calibration curve was linear, making it possible to calculate the limits of detection and quantification in these experiments. The intra-day and inter-day precision showed relative standard deviations lower than 5%. The accuracy, determined as the percentage recovery, was 99.4%. Nine samples of the sponge from the Bahamas, Bonaire, Curaçao and Santa Marta had concentrations of (+)-discodermolide ranging from 5.3 to 29.3 microg g(-1) of wet sponge. This methodology is quick and simple, allowing for quantification in sponges from natural environments, in situ cultures or dissociated cells.
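
    Two of the reported figures of merit, percentage recovery and intra-day precision, reduce to short computations; the sketch below uses invented spiking and replicate-injection data, not the study's raw measurements.

```python
# Minimal sketch of two validation figures of merit: percentage recovery
# from spiked samples and intra-day precision as RSD (illustrative data).
import numpy as np

# Recovery: spiked amount vs amount found (ug)
spiked = np.array([10.0, 20.0, 30.0])
found = np.array([9.9, 19.7, 30.2])
recovery = 100.0 * found / spiked
print(f"mean recovery = {recovery.mean():.1f}%")

# Intra-day precision: replicate injections of one sample (peak areas)
areas = np.array([10520, 10467, 10610, 10388, 10555, 10499])
rsd = 100.0 * areas.std(ddof=1) / areas.mean()
print(f"intra-day RSD = {rsd:.2f}%  (acceptance often set below 5%)")
```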

  4. Appropriate methodologies for assessing the societal cost and benefits of conservation programs

    International Nuclear Information System (INIS)

    Power, J.M.; Gill, G.S.; Harvey, K.M.

    1983-01-01

    The use of cost-benefit analysis for assessing the societal cost and benefits of conservation programmes is discussed. It is concluded that it should not be the sole criterion for project choice. (U.K.)

  5. HPCE quantification of 5-methyl-2'-deoxycytidine in genomic DNA: methodological optimization for chestnut and other woody species.

    Science.gov (United States)

    Hasbún, Rodrigo; Valledor, Luís; Rodríguez, José L; Santamaria, Estrella; Ríos, Darcy; Sanchez, Manuel; Cañal, María J; Rodríguez, Roberto

    2008-01-01

    Quantification of deoxynucleosides using micellar high-performance capillary electrophoresis (HPCE) is an efficient, fast and inexpensive method for evaluating genomic DNA methylation. This approach has been demonstrated to be more sensitive and specific than other methods for the quantification of DNA methylation content. However, effective detection and quantification of 5-methyl-2'-deoxycytidine depend on the sample characteristics. Previous work has revealed that in most woody species, the quality and quantity of extracted RNA-free DNA suitable for analysis by HPCE vary among species of the same genus, among tissues taken from the same tree, and within the same tissue across the seasons of the year. The aim of this work is to establish a method for quantification of genomic DNA methylation that lends itself to use in different Castanea sativa Mill. materials, and in other angiosperm and gymnosperm woody species. Using a DNA extraction kit based on a silica membrane has increased the resolving capacity of the method. Under these conditions, different organs or tissues of angiosperms and gymnosperms can be analyzed, regardless of their state of development. We emphasize the importance of samples free of nucleosides although, if they are present, the method still ensures the effective separation of deoxynucleosides and identification of 5-methyl-2'-deoxycytidine.

  6. Loss-of-benefits analysis for nuclear power plant shutdowns: methodology and illustrative case study

    International Nuclear Information System (INIS)

    Peerenboom, J.P.; Buehring, W.A.; Guziel, K.A.

    1983-11-01

    A framework for loss-of-benefits analysis and a taxonomy for identifying and categorizing the effects of nuclear power plant shutdowns or accidents are presented. The framework consists of three fundamental steps: (1) characterizing the shutdown; (2) identifying benefits lost as a result of the shutdown; and (3) quantifying effects. A decision analysis approach to regulatory decision making is presented that explicitly considers the loss of benefits. A case study of a hypothetical reactor shutdown illustrates one key loss of benefits: net replacement energy costs (i.e., the change in production costs). Sensitivity studies investigate the responsiveness of case study results to changes in nuclear capacity factor, load growth, fuel price escalation, and discount rate. The effects of multiple reactor shutdowns on production costs are also described.

  7. Comparative analysis of cost benefit division methodologies in a hydrothermal generation system

    International Nuclear Information System (INIS)

    Pereira, M.V.F.; Gorenstin, B.G.; Campodonico, N.M.; Costa, J.P. da; Kelman, J.

    1989-01-01

    The development and operation planning of the Brazilian generation system has been carried out in a coordinated way for several years by organizations in which the country's main generating companies take part. The sharing of the system's benefits among the participants of the integrated planning and operation has aroused interest. This paper describes alternative forms of cost-benefit allocation among the companies participating in a coordinated operation, in order to achieve adequate remuneration and incentives. Two proposals for benefit allocation in energy export/import contracts were analysed: sharing by generation value and sharing by marginal benefit. It is concluded that the second one best reflects the contributions of the several factors comprising a hydroelectric power plant (storage capacity, effective storage and turbine capacity). (C.G.C.). 1 tab

  8. Costs of disarmament - Rethinking the price tag: A methodological inquiry into the costs and benefits of arms control

    International Nuclear Information System (INIS)

    Willett, S.

    2002-06-01

    The growing number of arms control and disarmament treaties agreed on over the past decades as well as rising concerns about harmful environmental and public health effects of weapons disposal, have understandably led to an increase in the cost of implementing arms control agreements. As a result, the expenses associated with treaty compliance have emerged as a contentious issue within the realm of arms control and disarmament discussions. In particular, opponents of arms control and disarmament point to perceived rising costs of meeting current and proposed treaty obligations in an attempt to limit and undermine such activities. Yet determining just how much arms control and disarmament cost remains very much an ambiguous task. In Costs of Disarmament - Rethinking the Price Tag: A Methodological Inquiry into the Costs and Benefits of Arms Control, Susan Willett addresses the question of how the cost of arms control ought to be measured. Emphasizing the proper allocation of costs associated with arms control treaty implementation to the life cycle costs of weapon systems and their correct weighing against the benefits they procure in terms of averted arms races and increased international security, Willett argues for a revised methodology of costing arms control and disarmament that gives a more accurate - and significantly lower - estimate of the latter. Adopting such a revised methodology concludes the author, might dispel considerable misunderstanding and help point decisions over arms control and disarmament in the right direction

  9. METHODOLOGICAL APPROACHES IN REALIZING AND APPLYING COST-BENEFIT ANALYSIS FOR THE INVESTMENT PROJECTS

    Directory of Open Access Journals (Sweden)

    Pelin Andrei

    2009-05-01

    Cost-benefit analysis is the technique most frequently used for a rational allocation of resources. This modality of evaluating expenditure programs is an attempt to measure the costs and gains of a community as a result of running the evaluated

  10. Solar thermal technologies benefits assessment: Objectives, methodologies and results for 1981

    Science.gov (United States)

    Gates, W. R.

    1982-07-01

    The economic and social benefits of developing cost competitive solar thermal technologies (STT) were assessed. The analysis was restricted to STT in electric applications for 16 high insolation/high energy price states. Three fuel price scenarios and three 1990 STT system costs were considered, reflecting uncertainty over fuel prices and STT cost projections. After considering the numerous benefits of introducing STT into the energy market, three primary benefits were identified and evaluated: (1) direct energy cost savings were estimated to range from zero to $50 billion; (2) oil imports may be reduced by up to 9 percent, improving national security; and (3) significant environmental benefits can be realized in air basins where electric power plant emissions create substantial air pollution problems. STT research and development was found to be unacceptably risky for private industry in the absence of federal support. The normal risks associated with investments in research and development are accentuated because the OPEC cartel can artificially manipulate oil prices and undercut the growth of alternative energy sources.

  11. Accounting for between-study variation in incremental net benefit in value of information methodology.

    Science.gov (United States)

    Willan, Andrew R; Eckermann, Simon

    2012-10-01

    Previous applications of value of information methods for determining optimal sample size in randomized clinical trials have assumed no between-study variation in mean incremental net benefit. By adopting a hierarchical model, we provide a solution for determining optimal sample size with this assumption relaxed. The solution is illustrated with two examples from the literature. Expected net gain increases with increasing between-study variation, reflecting the increased uncertainty in incremental net benefit and reduced extent to which data are borrowed from previous evidence. Hence, a trial can become optimal where current evidence is sufficient assuming no between-study variation. However, despite the expected net gain increasing, the optimal sample size in the illustrated examples is relatively insensitive to the amount of between-study variation. Further percentage losses in expected net gain were small even when choosing sample sizes that reflected widely different between-study variation. Copyright © 2011 John Wiley & Sons, Ltd.

  12. An Analysis of Information Asset Valuation (IAV) Quantification Methodology for Application with Cyber Information Mission Impact Assessment (CIMIA)

    National Research Council Canada - National Science Library

    Hellesen, Denzil L

    2008-01-01

    .... The IAV methodology proposes that accurate valuation for an Information Asset (InfoA) is the convergence of information tangible, intangible, and flow attributes to form a functional entity that enhances mission capability...

  13. Exploring the Benefits of Respite Services to Family Caregivers: Methodological Issues and Current Findings

    Science.gov (United States)

    Zarit, Steven H.; Liu, Yin; Bangerter, Lauren R.; Rovine, Michael J.

    2017-01-01

    Objectives: There is growing emphasis on empirical validation of the efficacy of community-based services for older people and their families, but research on services such as respite care faces methodological challenges that have limited the growth of outcome studies. We identify problems associated with the usual research approaches for studying respite care, with the goal of stimulating use of novel and more appropriate research designs that can lead to improved studies of community-based services. Method: Using the concept of research validity, we evaluate the methodological approaches in the current literature on respite services, including adult day services, in-home respite and overnight respite. Results: Although randomized control trials (RCTs) are possible in community settings, validity is compromised by practical limitations of randomization and other problems. Quasi-experimental and interrupted time series designs offer comparable validity to RCTs and can be implemented effectively in community settings. Conclusion: An emphasis on RCTs by funders and researchers is not supported by scientific evidence. Alternative designs can lead to development of a valid body of research on community services such as respite. PMID:26729467

  14. Comparative risk-benefit-cost effectiveness in nuclear and alternate power sources: methodology, perspective, limitations

    International Nuclear Information System (INIS)

    Vinck, W.; Van Reijen, G.; Maurer, H.; Volta, G.

    1980-01-01

    A critical survey is given of the use of quantitative risk assessment in defining acceptable limits of safety, and of its use together with cost-benefit analyses for decision making. The paper indicates uncertainties and even unknowns in risk assessment, in particular if the whole fuel cycle for energy production is considered. It is made clear that decisions on the acceptance of risk must also consider the risk perception factor. A difficult issue here is the potential for low-probability/large-consequence accidents. Examples are given, suggestions for improvement are made and perspectives are outlined.

  15. Cost analysis and ecological benefits of environmental recovery methodologies in bauxite mining

    OpenAIRE

    Guimarães,João Carlos Costa; Barros,Dalmo Arantes de; Pereira,José Aldo Alves; Silva,Rossi Allan; Oliveira,Antonio Donizette de; Borges,Luís Antônio Coimbra

    2013-01-01

    This work analyzed and compared three methods of environmental recovery in bauxite mining commonly used in Poços de Caldas Plateau, MG, by means of recovery costs and ecological benefits. Earnings and costs data of environmental recovery activities were obtained for the areas that belonged to the Companhia Geral de Minas – CGM, on properties sited in the city of Poços de Caldas, MG. The amount of costs of these activities was used to compare the recovery methods by updating them monetarily to...

  16. External Costs and Benefits of Energy. Methodologies, Results and Effects on Renewable Energies Competitivity

    International Nuclear Information System (INIS)

    Saez, R.; Cabal, H.; Varela, M.

    1999-01-01

    This study attempts to give a summarised view of the concept of externality in energy production, and of the social and economic usefulness of its evaluation and consideration as support for political decision-making in environmental regulation, technology selection for new plants, the establishment of priorities in energy plans, etc. The most relevant environmental externalities are described, such as the effects on health, ecosystems, materials and climate, as well as some of the socioeconomic externalities such as employment, increase of the GDP and the reduction and depletion of energy resources. The different methodologies used during recent years are reviewed, as well as the principal results obtained in the most relevant studies carried out internationally on this topic. The European study 'National Implementation of the ExternE Methodology in the EU' deserves special mention. The results obtained are presented in Table 2 of this study. The results obtained in the evaluation of the environmental externalities of the Spanish electrical system as a function of the fuel cycle are also summarised. In this last case the results are more approximate, since they were obtained by extrapolation from those for ten representative plants geographically distributed throughout the Peninsula. Finally, the influence that the internalisation of the external costs of conventional energies can have on the competitiveness and the market of renewable energies, which cause fewer environmental effects and therefore produce much smaller external costs, is analysed. The mechanisms of internalisation, and whether or not it is advisable to incorporate them in the price of energy, are also discussed. (Author) 30 refs

  17. A methodology for spacecraft technology insertion analysis balancing benefit, cost, and risk

    Science.gov (United States)

    Bearden, David Allen

    Emerging technologies are changing the way space missions are developed and implemented. Technology development programs are proceeding with the goal of enhancing spacecraft performance and reducing mass and cost. However, it is often the case that technology insertion assessment activities, in the interest of maximizing performance and/or mass reduction, do not consider synergistic system-level effects. Furthermore, even though technical risks are often identified as a large cost and schedule driver, many design processes ignore the effects of cost and schedule uncertainty. This research is based on the hypothesis that technology selection is a problem of balancing interrelated (and potentially competing) objectives. Current spacecraft technology selection approaches are summarized, and a Methodology for Evaluating and Ranking Insertion of Technology (MERIT) that expands on these practices to attack otherwise unsolved problems is demonstrated. MERIT combines the modern techniques of technology maturity measures, parametric models, genetic algorithms, and risk assessment (cost and schedule) in a unique manner to resolve very difficult issues including: user-generated uncertainty, relationships between cost/schedule and complexity, and technology "portfolio" management. While the methodology is sufficiently generic that it may in theory be applied to a number of technology insertion problems, this research focuses on application to the specific case of small (<500 kg) satellite design. Small satellite missions are of particular interest because they are often developed under rigid programmatic (cost and schedule) constraints and are motivated to introduce advanced technologies into the design. MERIT is demonstrated for programs procured under varying conditions and constraints such as stringent performance goals, not-to-exceed costs, or hard schedule requirements. MERIT's contributions to the engineering community are its unique coupling of the aspects of performance

  18. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for the quantification of chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is often not feasible because of the number of CT scans required. The purpose of this work is to develop a methodology for quantification of lung damage caused by TB through chest radiographs. An algorithm for computational processing of the examinations was developed in Matlab, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients using CT scans. The measurements from the two methods were compared, resulting in strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit ratio for the patient and cost-benefit ratio for the institution. (author)
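
    The Bland-Altman check used above reduces to a mean bias and 95% limits of agreement on paired differences; a minimal sketch with invented paired lesion volumes follows.

```python
# Minimal sketch of a Bland-Altman agreement check between radiograph- and
# CT-based lesion quantification. The paired values are invented.
import numpy as np

ct = np.array([110.0, 85.0, 240.0, 60.0, 150.0, 200.0, 95.0])   # cm^3
xr = np.array([125.0, 78.0, 215.0, 70.0, 160.0, 185.0, 108.0])  # cm^3

diff = xr - ct
bias = diff.mean()
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)

print(f"mean difference (bias) = {bias:.1f} cm^3")
print(f"95% limits of agreement = [{loa[0]:.1f}, {loa[1]:.1f}] cm^3")
inside = np.mean((diff >= loa[0]) & (diff <= loa[1]))
print(f"{100 * inside:.0f}% of samples within the limits")
```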

  19. Comparison of registry methodologies for reporting carbon benefits for afforestation projects in the United States

    International Nuclear Information System (INIS)

    Pearson, Timothy R.H.; Brown, Sandra; Andrasko, Kenneth

    2008-01-01

    No mandatory national program currently exists to mitigate climate change in the US. Consequently, voluntary programs and mandatory state-level programs are multiplying to allow users to register emission-offset activities, creating multiple, often contradictory, measurement and recording standards. For the land use sector we examined a hypothetical project: tree planting on rangelands in California. We apply four sets of protocols from the following registries - the California Climate Action Registry, the Chicago Climate Exchange (CCX), the Regional Greenhouse Gas Initiative and the USDOE 1605(b) program - and compare the results to the 'actual' net sequestration, and also briefly compare them to international protocols such as the relevant Clean Development Mechanism methodology. Carbon in land use can be estimated accurately, precisely and cost-effectively, but achieving this requires good protocols. As predicted, the consequence of applying different protocols for reportable carbon was significant. The choice of measurement pools, the handling of the baseline and the issue of uncertainty led to a baseline estimate of 0-66,690 t CO2-e, and final sequestered carbon totals (after 60 years) that varied between 118,044 and 312,685 t CO2-e, a factor of 2.5 difference. The amount reported under 1605(b) is the closest to 'actual', with CCX entity reporting the most divergent.

  20. Development and validation of methodologies for the quantification of phytosterols and phytosterol oxidation products in cooked and baked food products

    NARCIS (Netherlands)

    Menéndez-Carreño, M.; Knol, D.; Janssen, H.G.

    2015-01-01

    Gas chromatography-mass spectrometry (GC-MS) methodologies for the analysis of the main phytosterols (PS) and phytosterol oxidation products (POPs) present in 19 different foodstuffs cooked or baked using margarines with or without added plant sterols are presented. Various methods for fat extraction were

  1. Benefit-Risk Monitoring of Vaccines Using an Interactive Dashboard: A Methodological Proposal from the ADVANCE Project.

    Science.gov (United States)

    Bollaerts, Kaatje; De Smedt, Tom; Donegan, Katherine; Titievsky, Lina; Bauchau, Vincent

    2018-03-26

    New vaccines are launched based on their benefit-risk (B/R) profile anticipated from clinical development. Proactive post-marketing surveillance is necessary to assess whether the vaccination uptake and the B/R profile are as expected and, ultimately, whether further public health or regulatory actions are needed. There are several, typically not integrated, facets of post-marketing vaccine surveillance: the surveillance of vaccination coverage, vaccine safety, effectiveness and impact. With this work, we aim to assess the feasibility and added value of using an interactive dashboard as a potential methodology for near real-time monitoring of vaccine coverage and pre-specified health benefits and risks of vaccines. We developed a web application with an interactive dashboard for B/R monitoring. The dashboard is demonstrated using simulated electronic healthcare record data mimicking the introduction of rotavirus vaccination in the UK. The interactive dashboard allows end users to select certain parameters, including expected vaccine effectiveness, age groups, and time periods, and allows calculation of the incremental net health benefit (INHB) as well as the incremental benefit-risk ratio (IBRR) for different sets of preference weights. We assessed the potential added value of the dashboard by user testing amongst a range of stakeholders experienced in the post-marketing monitoring of vaccines. The dashboard was successfully implemented and demonstrated. The feedback from the potential end users was generally positive, although reluctance to use composite B/R measures was expressed. The use of interactive dashboards for B/R monitoring is promising and received support from various stakeholders. In future research, the use of such an interactive dashboard will be further tested with real-life data as opposed to simulated data.
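
    A minimal sketch of the two composite measures the dashboard exposes, with an assumed weighting form for the ratio; the event rates and preference weights are illustrative, not the ADVANCE project's figures.

```python
# Minimal sketch of INHB and an IBRR under user-chosen preference weights.
# Rates, endpoints and the exact weighting form are illustrative assumptions.
cases_averted_per_100k = 350.0     # benefit endpoint (assumed)
adverse_events_per_100k = 2.1      # risk endpoint (assumed)

def inhb(benefit, risk, w):
    """Net benefit with the risk weighted by its relative (dis)utility w."""
    return benefit - w * risk

def ibrr(benefit, risk, w):
    """Benefit-risk ratio with the risk scaled by preference weight w (assumed form)."""
    return benefit / (w * risk)

for w in (1.0, 5.0, 20.0):         # alternative preference-weight sets
    b, r = cases_averted_per_100k, adverse_events_per_100k
    print(f"w = {w:4.1f}:  INHB = {inhb(b, r, w):6.1f} per 100k"
          f"   IBRR = {ibrr(b, r, w):6.1f}")
```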

  2. Methodological modifications on quantification of phosphatidylethanol in blood from humans abusing alcohol, using high-performance liquid chromatography and evaporative light scattering detection

    Directory of Open Access Journals (Sweden)

    Aradottir Steina

    2005-09-01

    Background: Phosphatidylethanol (PEth) is an abnormal phospholipid formed slowly in cell membranes by a transphosphatidylation reaction from phosphatidylcholine in the presence of ethanol, catalyzed by the enzyme phospholipase D. PEth in blood is a promising new marker of ethanol abuse, owing to the high specificity and sensitivity of this marker. None of the biological markers currently used in clinical routine are sensitive and specific enough for the diagnosis of alcohol abuse. The method for PEth analysis includes lipid extraction of whole blood, a one-hour HPLC separation of lipids and evaporative light scattering detection (ELSD) of PEth. Results: Methodological improvements are presented which comprise a simpler extraction procedure, the use of phosphatidylbutanol as internal standard and a new algorithm for evaluation of unknown samples. It is further demonstrated that equal test results are obtained with blood collected in standard test tubes with EDTA as with the previously used heparinized test tubes. The PEth content in blood samples is stable for three weeks in the refrigerator. Conclusion: The methodological changes make the method more suitable for routine laboratory use, lower the limit of quantification (LOQ) and improve precision.

  3. Application of analytic methodologies for image quantification in neuroendocrine tumor therapy with {sup 177}Lu-DOTA

    Energy Technology Data Exchange (ETDEWEB)

    Kubo, T.T.A.; Oliveira, S.M.V. [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Marco, L.; Mamede, M., E-mail: tadeukubo@gmail.com [Instituto Nacional do Cancer, Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Neuroendocrine tumors have an annual incidence of 1 to 2 cases per one hundred thousand inhabitants. Treatment with {sup 177}Lu-DOTA-octreotate in 3 or 4 cycles has been effective in controlling disease progression and, in some cases, promotes tumor remission. To estimate radiation side effects in healthy organs, image quantification techniques have been disseminated for individualized patient dosimetry. In this paper, image data processing methods are presented that allow comparisons between different conjugate-view quantification approaches, combined with attenuation correction and system sensitivity. Images were acquired 24, 72 and 192 h after administration of 74 GBq of {sup 177}Lu-DOTA using a dual-head gamma camera detection system, and they were evaluated with ImageJ software. Four female patients underwent two cycles of treatment. The kidneys, liver and whole-body regions of interest were separately assessed by 4 variants of the counts method and 12 variants of the pixel intensity method, considering the main photopeak alone and aided by the attenuation correction map and energy windows adjacent to the photopeak. The pixel intensity method was combined with a mathematical correction for pixels with null value. The results obtained by the two methods were strongly correlated (r>0.9) (p<0.001). The paired t-test accepted the null hypothesis of compatibility between the two methods (with and without the attenuation correction map) (p<0.05), but rejected it when the adjacent windows were combined. No significant tumor reduction (p>0.05) was found between the treatment cycles. In conclusion, the pixel intensity method is faster and allows the use of macros, minimizing operator error, and may optimize dosimetry in tumor therapies with {sup 177}Lu-DOTA-octreotate. (author)
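
    The count-based quantification compared above builds on the standard conjugate-view method; a minimal sketch follows, with illustrative counts, an assumed effective attenuation coefficient near the 208 keV photopeak, and an assumed camera sensitivity. It is not the authors' processing pipeline.

```python
# Minimal sketch of conjugate-view activity quantification: geometric mean
# of anterior/posterior counts, corrected for body attenuation and camera
# sensitivity. All inputs are illustrative, not patient data.
import math

i_ant = 185000.0      # anterior-view counts in the kidney ROI (assumed)
i_post = 142000.0     # posterior-view counts, same ROI (assumed)
t_acq = 300.0         # acquisition time per view, s (assumed)
mu = 0.11             # effective attenuation coefficient near 208 keV, 1/cm (assumed)
body_thickness = 22.0 # patient thickness along the ROI, cm (assumed)
sensitivity = 12.0    # system sensitivity, cps per MBq (assumed)

geo_mean_rate = math.sqrt(i_ant * i_post) / t_acq          # cps
transmission = math.exp(-mu * body_thickness)
activity_mbq = geo_mean_rate / (sensitivity * math.sqrt(transmission))

print(f"estimated ROI activity = {activity_mbq:.1f} MBq")
```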

  4. Towards a common disability assessment framework: theoretical and methodological issues for providing public services and benefits using ICF.

    Science.gov (United States)

    Francescutti, Carlo; Frattura, Lucilla; Troiano, Raffaella; Gongolo, Francesco; Martinuzzi, Andrea; Sala, Marina; Meucci, Paolo; Raggi, Alberto; Russo, Emanuela; Buffoni, Mara; Gorini, Giovanna; Conclave, Mario; Petrangeli, Agostino; Solipaca, Alessandro; Leonardi, Matilde

    2009-01-01

    To report on the preliminary results of an Italian project on the implementation of an ICF-based protocol for providing public services and benefits to persons with disabilities. The UN Convention on the Rights of Persons with Disabilities (UNC) was mapped to the ICF, and core elements were implemented in an ICF-based evaluation protocol. A person-environment interaction classification (PEIC) tree was also developed for defining evaluation outputs. The PEIC and the ICF-based protocol are the guideline and the data interpretation source, respectively, for providing public services and benefits. They make it possible to assign persons to different services, from surveillance and monitoring, to facilitator provision or support over time, to barrier removal or to the reorganisation of environmental factor provision. A detailed description of the target intervention is made available through the implementation of the protocol, which points out the effect of personal support and other environmental factors. The detailed description of functioning and disability provided by our methodology can help policy makers and administrators in decision making, on the basis of a description of real needs, and in targeting person-tailored interventions.

  5. Development and validation of methodologies for the quantification of phytosterols and phytosterol oxidation products in cooked and baked food products.

    Science.gov (United States)

    Menéndez-Carreño, María; Knol, Diny; Janssen, Hans-Gerd

    2016-01-08

Gas chromatography-mass spectrometry (GC-MS) methodologies for the analysis of the main phytosterols (PS) and phytosterol oxidation products (POPs) present in 19 different foodstuffs cooked or baked using margarines with or without added plant sterols are presented. Various methods for fat extraction were evaluated to allow the GC-MS analysis of large numbers of prepared vegetable, fish and meat products, egg and bakery items in a practically feasible manner. The optimized methods resulted in good sensitivity and allowed the analysis of both PS and POPs in the broad selection of foods over a wide range of concentrations. Calibration curves for both PS and POPs showed correlation coefficients (R(2)) better than 0.99. Detection limits were below 0.24 mg kg(-1) for PS and 0.02 mg kg(-1) for POPs. Average recoveries were between 81% and 105.1% for PS and between 65.5% and 121.8% for POPs. Good results were obtained for within- and between-day repeatability, with most values below 10%. Entire sample servings were analyzed, avoiding problems with inhomogeneity and making the method an exact representation of the typical use of the food by the consumer.

  6. Quantification of Greenhouse Gas Emission Rates from strong Point Sources by Airborne IPDA-Lidar Measurements: Methodology and Experimental Results

    Science.gov (United States)

    Ehret, G.; Amediek, A.; Wirth, M.; Fix, A.; Kiemle, C.; Quatrevalet, M.

    2016-12-01

We report on a new method and on the first demonstration to quantify emission rates from strong greenhouse gas (GHG) point sources using airborne Integrated Path Differential Absorption (IPDA) Lidar measurements. In order to build trust in the emission rates self-reported by countries, verification against independent monitoring systems is a prerequisite for checking the reported budgets. A significant fraction of the total anthropogenic emissions of CO2 and CH4 originates from localized strong point sources such as large energy production sites or landfills. Neither is monitored with sufficient accuracy by the current observation system. There is debate as to whether airborne remote sensing could fill this gap by inferring emission rates from budgeting or from Gaussian plume inversion approaches, whereby measurements of the GHG column abundance beneath the aircraft are used to constrain inverse models. In contrast to passive sensors, the use of an active instrument like CHARM-F for such emission verification measurements is new. CHARM-F is a new airborne IPDA Lidar devised for the German research aircraft HALO for the simultaneous measurement of the column-integrated dry-air mixing ratios of CO2 and CH4, commonly denoted as XCO2 and XCH4, respectively. It has been successfully tested in a series of flights over Central Europe to assess its performance under various reflectivity conditions and in strongly varying topography such as the Alps. The analysis of a methane plume measured in the crosswind direction of a coal mine ventilation shaft revealed an instantaneous emission rate of 9.9 ± 1.7 kt CH4 yr-1. We discuss the methodology of our point source estimation approach and give an outlook on the CoMet field experiment scheduled in 2017 for the measurement of anthropogenic and natural GHG emissions by a combination of active and passive remote sensing instruments on research aircraft.
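    For orientation, the cross-plume mass-budget estimate underlying such plume analyses is commonly written as the line integral of the measured column enhancement across the plume, advected by the wind. This is a generic textbook form, not necessarily the exact estimator used with CHARM-F:

    ```latex
    Q \;\approx\; u \int_{-\infty}^{+\infty} \Delta\Omega(y)\, \mathrm{d}y
    ```

    where Q is the source emission rate, u the mean wind speed perpendicular to the flight track, y the cross-plume coordinate, and ΔΩ(y) the column enhancement (e.g., the measured XCH4 excess converted to mass per unit area).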

  7. Methodology on quantification of sonication duration for safe application of MR guided focused ultrasound for liver tumour ablation.

    Science.gov (United States)

    Mihcin, Senay; Karakitsios, Ioannis; Le, Nhan; Strehlow, Jan; Demedts, Daniel; Schwenke, Michael; Haase, Sabrina; Preusser, Tobias; Melzer, Andreas

    2017-12-01

Magnetic Resonance Guided Focused Ultrasound (MRgFUS) for liver tumour ablation is a challenging task due to motion caused by breathing and occlusion by the ribcage between the transducer and the tumour. To overcome these challenges, a novel system for liver tumour ablation during free breathing has been designed. The novel TRANS-FUSIMO Treatment System (TTS, EU FP7) interacts with a Magnetic Resonance (MR) scanner and a focused ultrasound transducer to sonicate a moving target in the liver. To meet the requirements of ISO 13485, a quality management standard for medical device design, the system needs to be tested for certain process parameters. The duration of sonication, and the delay after the sonication button is activated, are among the parameters that need to be quantified for efficient and safe ablation of tumour tissue. A novel methodology was developed to quantify these process parameters. A computerised scope was programmed in LabVIEW to collect data via a hydrophone; the coordinates of the fiber-optic sensor assembly, embedded in a degassed water tank via a sensor assembly holder, were fed into the TRANS-FUSIMO treatment software via Magnetic Resonance Imaging (MRI) so that the system sonicated the tip of the sensor, synchronised with the clock of the scope. The sonications were executed at 50 W, 100 W and 150 W for 10 s, thirty times by two independent operators, to quantify the actual sonication duration and the delay after the emergency stop. The deviation of the system from the predefined specifications was calculated. Student's t-test was used to investigate user dependency. The duration of sonication and the delay after the sonication were quantified successfully with the developed method. TTS can sonicate with a maximum deviation of 0.16 s (Std 0.32) from the planned duration and with a delay of 14 ms (Std 0.14) for the emergency stop. Student's t-tests indicate that the results do not depend on the operators (p > .05). The evidence obtained via this
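    As a minimal sketch of how sonication duration and stop delay might be extracted from a digitised hydrophone trace, consider the following; the threshold, sampling rate, and synthetic signal are illustrative assumptions, not parameters reported by the study:

    ```python
    import numpy as np

    def sonication_timing(trace, fs, t_stop_cmd, threshold=0.05):
        """Estimate sonication duration and stop delay from a hydrophone trace.

        trace      : 1-D array of hydrophone samples (a.u.)
        fs         : sampling rate in Hz
        t_stop_cmd : time (s) at which the stop command was issued,
                     on the same clock as the trace
        threshold  : illustrative amplitude threshold separating
                     "ultrasound on" from background noise
        """
        active = np.abs(trace) > threshold       # samples with ultrasound present
        idx = np.flatnonzero(active)
        if idx.size == 0:
            return 0.0, None
        t_on, t_off = idx[0] / fs, idx[-1] / fs  # first/last active sample
        duration = t_off - t_on
        stop_delay = t_off - t_stop_cmd          # lag between command and silence
        return duration, stop_delay

    # Illustrative use with a synthetic 10 s burst sampled at 1 kHz:
    fs = 1000
    t = np.arange(0, 12, 1 / fs)
    trace = np.where((t >= 1.0) & (t <= 11.02),
                     0.2 * np.sin(2 * np.pi * 50 * t), 0.001)
    print(sonication_timing(trace, fs, t_stop_cmd=11.0))  # ~10 s, ~0.02 s delay
    ```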

  8. Theoretical and methodological aspects of assessing economic effectiveness of nuclear power plant construction using cost-benefit analysis

    International Nuclear Information System (INIS)

    Moravcik, A.

    1984-01-01

The cost benefit of investments is divided into social and economic benefits. The postulates for assessing the cost benefit of the capital costs of nuclear power plants are discussed. Relations are given for the total cost benefit of capital costs, expressed by the total profit rate of capital costs, and for the absolute effectiveness, expressed by the socio-economic benefit of capital costs. The absolute cost benefit of capital costs is characterized by several complex indexes. Comparable capital cost benefit is used for assessing the effectiveness of interchangeable solution variants. The minimum calculated costs serve as the criterion for selecting the optimal variant. (E.S.)

  9. Environmental costs and benefits case study: nuclear power plant. Quantification and economic valuation of selected environmental impacts/effects. Final report

    International Nuclear Information System (INIS)

    1984-02-01

This case study is an application, to a nuclear power plant, of the methodology for quantifying environmental costs and benefits contained in the regional energy plan adopted in April 1983 by the Northwest Power Planning Council, pursuant to Public Law 96-501. The study is based on plant number 2 of the Washington Public Power Supply System (WNP-2), currently nearing completion on the Hanford Nuclear Reservation in eastern Washington State. This report describes and documents efforts to quantify and estimate monetary values for the following seven areas of environmental effects: radiation/health effects, socioeconomic/infrastructure effects, consumptive use of water, psychological/health effects (fear/stress), waste management, nuclear power plant accidents, and decommissioning costs. 103 references

  10. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    Energy Technology Data Exchange (ETDEWEB)

    Laborda, Francisco, E-mail: flaborda@unizar.es; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-21

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  11. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    International Nuclear Information System (INIS)

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-01

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  12. Cost-Benefit Analysis of Rail-Noise Mitigation Programmes at European Level: Methodological Innovations from EURANO to STAIRRS

    OpenAIRE

Aude Lenders; Nancy Da Silva; Walter Hecq; Thomas Baumgartner

    2001-01-01

    The STAIRRS project (2000-2002) is a follow-up of EURANO [1] and a Swiss study [2], in which the authors evaluated the efficiency of noise reduction measures in two European freight corridors. STAIRRS includes a cost-benefit analysis based on about 10,000 km of track modelled in seven countries. The benefits are defined in terms of the dB(A) experienced by those living in the rail corridors modelled. They are to be weighted by the number of persons benefiting each year from a noise reduction ...

  13. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty-A pulp mill example

    International Nuclear Information System (INIS)

    Svensson, Elin; Berntsson, Thore; Stroemberg, Ann-Brith

    2009-01-01

This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO{sub 2} emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, (doi:10.1016/j.enpol.2008.10.023)] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO{sub 2} emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives

  14. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty-A pulp mill example

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden)], E-mail: elin.svensson@chalmers.se; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden)

    2009-03-15

    This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO{sub 2} emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, (doi:10.1016/j.enpol.2008.10.023)] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO{sub 2} emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives.

  15. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty. A pulp mill example

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden)

    2009-03-15

    This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO{sub 2} emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, doi:10.1016/j.enpol.2008.10.023] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO{sub 2} emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives. (author)

  16. Use of cost benefit analysis methodology in the meaning of motorization level from small and medium hydroelectric power plants

    International Nuclear Information System (INIS)

    Mazzon, J.G.; Simoes, N.S.; Ramos, D.S.; Ishida, S.

    1989-01-01

The technical and economic justifications for the reformulation of the waterfall division between the Lucas Nogueira Garcez and Capivara plants on the Paranapanema River (Brazil) are described, including an economic comparison of the Canoas (Alta), Canoas I and Canoas II alternatives, a motorization study and the energetic benefits. The reasons for the choice of bulb turbines and for the definition of the installed power according to the new reference economic parameters are also presented. (C.G.C.). 5 refs, 11 tabs

  17. Is law enforcement of drug-impaired driving cost-efficient? An explorative study of a methodology for cost-benefit analysis.

    Science.gov (United States)

    Veisten, Knut; Houwing, Sjoerd; Mathijssen, M P M René; Akhtar, Juned

    2013-03-01

Road users driving under the influence of psychoactive substances may be at much higher relative risk (RR) in road traffic than the average driver. Legislation banning blood alcohol concentrations above certain threshold levels, combined with roadside breath-testing for alcohol, has been in place for decades in many countries, but new legislation and testing of drivers for drug use have recently been implemented in some countries. In this article we present a methodology for cost-benefit analysis (CBA) of increased law enforcement of roadside drug screening. This is an analysis of profitability for society, where the costs of control are weighed against the reduction in injuries expected from fewer drugged drivers on the roads. We specify assumptions regarding costs and the effect of the specificity of the drug screening device, and quantify a deterrence effect related to the sensitivity of the device, yielding the benefit estimates. Three European countries with different current enforcement levels were studied, yielding benefit-cost ratios in the approximate range of 0.5-5 for a tripling of current levels of enforcement, with costs of about 4000 EUR per conviction and in the range of 1.5 to 13 million EUR per prevented fatality. The applied CBA methodology involved a simplified behavioural response to increased enforcement and control efficiency. Although this methodology should be developed further, it is clearly indicated that the cost-efficiency of increased law enforcement of drug driving offences is dependent on the baseline situation of drug use in traffic and on the current level of enforcement, as well as on the RR and prevalence of drugs in road traffic.
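    As a toy illustration of the benefit-cost structure described above, the sketch below monetizes prevented casualties against enforcement costs; all input numbers are hypothetical placeholders, not values from the study:

    ```python
    # Hypothetical benefit-cost ratio for scaled-up roadside drug screening.
    # All inputs are illustrative placeholders, not figures from the study.
    tests_per_year = 100_000
    cost_per_test_eur = 20.0            # device, staff, sample processing
    deterred_fatalities = 2.0           # fatalities prevented via deterrence
    deterred_serious_injuries = 25.0
    value_per_fatality_eur = 2_500_000  # value of a statistical life
    value_per_serious_injury_eur = 350_000

    costs = tests_per_year * cost_per_test_eur
    benefits = (deterred_fatalities * value_per_fatality_eur
                + deterred_serious_injuries * value_per_serious_injury_eur)
    bcr = benefits / costs
    print(f"Costs: {costs:,.0f} EUR, benefits: {benefits:,.0f} EUR, BCR = {bcr:.2f}")
    ```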

  18. Wedding Rigorous Scientific Methodology and Ancient Herbal Wisdom to Benefit Cancer Patients: The Development of PHY906.

    Science.gov (United States)

    Chu, Edward

    2018-02-15

    Our research group has extensively characterized the preclinical and clinical activities of PHY906, a traditional Chinese herbal medicine, as a modulator of irinotecan-based chemotherapy for the treatment of colorectal cancer. This article reviews the critical issues of quality control and standardization of PHY906 and highlights the importance of high-quality material for the conduct of preclinical and clinical studies. Studies to investigate the potential biological mechanisms of action using a systems biology approach play a pivotal role in providing the preclinical rationale to move forward with clinical studies. For early-phase clinical studies, translational biomarkers should be incorporated to characterize the biological effects of the herbal medicine. These biomarkers include tumor mutational load, cytokine/chemokine expression, metabolomic profiling, and the presence of key herbal metabolites. Sophisticated bioinformatic approaches are critical for mining the data and identifying those biomarkers that can define the subset of patients who will benefit from PHY906 or any other herbal medicine, in terms of reduced treatment toxicity, improved quality of life, and/or enhanced clinical activity of treatment.

  19. Improving the spatial and temporal resolution with quantification of uncertainty and errors in earth observation data sets using Data Interpolating Empirical Orthogonal Functions methodology

    Science.gov (United States)

    El Serafy, Ghada; Gaytan Aguilar, Sandra; Ziemba, Alexander

    2016-04-01

There is an increasing use of process-based models in the investigation of ecological systems and scenario predictions. The accuracy and quality of these models improve when they are run with high spatial and temporal resolution data sets. However, ecological data can often be difficult to collect, which manifests itself through irregularities in the spatial and temporal domains of these data sets. Through the use of the Data INterpolating Empirical Orthogonal Functions (DINEOF) methodology, earth observation products can be improved to have full spatial coverage within the desired domain as well as increased temporal resolution, at the daily and weekly time steps frequently required by process-based models [1]. The DINEOF methodology results in a degree of error being affixed to the refined data product. In order to determine the degree of error introduced through this process, suspended particulate matter and chlorophyll-a data from MERIS are used with DINEOF to produce high resolution products for the Wadden Sea. These new data sets are then compared with in-situ and other data sources to determine the error. Artificial cloud cover scenarios are also conducted in order to substantiate the findings from the MERIS data experiments. Secondly, the accuracy of DINEOF is explored to evaluate the variance of the methodology. The degree of accuracy is combined with the overall error produced by the methodology and reported in an assessment of the quality of DINEOF when applied to resolution refinement of chlorophyll-a and suspended particulate matter in the Wadden Sea. References [1] Sirjacobs, D.; Alvera-Azcárate, A.; Barth, A.; Lacroix, G.; Park, Y.; Nechad, B.; Ruddick, K.G.; Beckers, J.-M. (2011). Cloud filling of ocean colour and sea surface temperature remote sensing products over the Southern North Sea by the Data Interpolating Empirical Orthogonal Functions methodology. J. Sea Res. 65(1): 114-130. dx.doi.org/10.1016/j.seares.2010.08.002
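    A minimal sketch of the iterative EOF gap-filling idea behind DINEOF follows, assuming a data matrix X (space × time) with NaNs marking cloudy points; the truncation rank and tolerance are illustrative choices, and the real DINEOF package selects the number of modes by cross-validation:

    ```python
    import numpy as np

    def dineof_fill(X, rank=3, tol=1e-6, max_iter=200):
        """Iterative EOF (truncated SVD) reconstruction of missing values.

        X    : 2-D array (space x time) with np.nan at cloudy/missing points
        rank : number of EOF modes retained (illustrative; DINEOF picks this
               by cross-validation)
        """
        missing = np.isnan(X)
        if not missing.any():
            return X.copy()
        filled = np.where(missing, np.nanmean(X), X)  # initial guess: global mean
        for _ in range(max_iter):
            U, s, Vt = np.linalg.svd(filled, full_matrices=False)
            recon = (U[:, :rank] * s[:rank]) @ Vt[:rank]
            delta = np.max(np.abs(recon[missing] - filled[missing]))
            filled[missing] = recon[missing]          # update only the gaps
            if delta < tol:
                break
        return filled
    ```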

  20. Methodology to estimate the cost of the severe accidents risk / maximum benefit; Metodologia para estimar el costo del riesgo de accidentes severos / beneficio maximo

    Energy Technology Data Exchange (ETDEWEB)

    Mendoza, G.; Flores, R. M.; Vega, E., E-mail: gozalo.mendoza@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2016-09-15

For programs and activities to manage aging effects, any changes to plant operations, inspections, maintenance activities, systems and administrative control procedures during the renewal period that are designed to manage the effects of aging, as required by 10 CFR Part 54, and that could impact the environment should be characterized. Environmental impacts significantly different from those described in the final environmental statement for the current operating license should be described in detail. To comply with the requirements of a license renewal application, the Severe Accident Mitigation Alternatives (SAMA) analysis is contained in a supplement to the environmental report of the plant that meets the requirements of 10 CFR Part 51. In this paper, the methodology for estimating the cost of severe accident risk is established and discussed. It is then used to identify and select severe accident mitigation alternatives, which are analyzed to estimate the maximum benefit an alternative could achieve if it eliminated all risk. The cost of severe accident risk is estimated using the regulatory analysis techniques of the US Nuclear Regulatory Commission (NRC). The ultimate goal of implementing the methodology is to identify SAMA candidates that have the potential to reduce severe accident risk and to determine whether the implementation of each candidate is cost-effective. (Author)

  1. Ultrasensitive liquid chromatography-tandem mass spectrometric methodologies for quantification of five HIV-1 integrase inhibitors in plasma for a microdose clinical trial.

    Science.gov (United States)

    Sun, Li; Li, Hankun; Willson, Kenneth; Breidinger, Sheila; Rizk, Matthew L; Wenning, Larissa; Woolf, Eric J

    2012-10-16

HIV-1 integrase strand transfer inhibitors are an important class of compounds targeted for the treatment of HIV-1 infection. Microdosing has emerged as an attractive tool to assist in drug candidate screening for clinical development, but it necessitates extremely sensitive bioanalytical assays, typically in the pg/mL concentration range. Currently, accelerator mass spectrometry is the predominant tool for microdosing support, which requires a specialized facility and the synthesis of radiolabeled compounds. Few studies have attempted to comprehensively assess a liquid chromatography-tandem mass spectrometry (LC-MS/MS) approach in the context of microdosing applications. Herein, we describe the development of automated LC-MS/MS methods to quantify five integrase inhibitors in plasma, with limits of quantification of 1 pg/mL for raltegravir and 2 pg/mL for four proprietary compounds. The assays involved double extractions followed by UPLC coupled with negative ion electrospray MS/MS analysis. All methods were fully validated to the rigor of regulated bioanalysis requirements, with intraday precision between 1.20 and 14.1% and accuracy between 93.8 and 107% over the standard curve concentration range. These methods were successfully applied to a human microdose study and demonstrated to be accurate, reproducible, and cost-effective. Results of the study indicate that raltegravir displayed linear pharmacokinetics between a microdose and a pharmacologically active dose.

  2. Identification and quantification of anthocyanins in fruits from Neomitranthes obscura (DC.) N. Silveira an endemic specie from Brazil by comparison of chromatographic methodologies.

    Science.gov (United States)

    Gouvêa, Ana Cristina M S; Melo, Armindo; Santiago, Manuela C P A; Peixoto, Fernanda M; Freitas, Vitor; Godoy, Ronoel L O; Ferreira, Isabel M P L V O

    2015-10-15

Neomitranthes obscura (DC.) N. Silveira is a Brazilian fruit belonging to the Myrtaceae family that contains anthocyanins in the peel and was studied for the first time in this work. Delphinidin-3-O-galactoside, delphinidin-3-O-glucoside, cyanidin-3-O-galactoside, cyanidin-3-O-glucoside, cyanidin-3-O-arabinoside, petunidin-3-O-glucoside, pelargonidin-3-O-glucoside, peonidin-3-O-galactoside, peonidin-3-O-glucoside and cyanidin-3-O-xyloside were separated and identified by LC/DAD/MS and by co-elution with standards. Reliable quantification of anthocyanins in the mature fruits was performed by HPLC/DAD using a weighted linear regression model from 0.05 to 50 mg of cyanidin-3-O-glucoside L(-1), because it gave better fit quality than least-squares linear regression. Good precision and accuracy were obtained. The total anthocyanin content of mature fruits was 263.6 ± 8.2 mg of cyanidin-3-O-glucoside equivalents 100 g(-1) fresh weight, which is in the range found in the literature for anthocyanin-rich fruits.
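    A minimal sketch of a weighted linear calibration of the kind described above, assuming 1/x² weights (a common choice for wide concentration ranges; the weighting scheme actually used by the authors is not specified in the abstract) and hypothetical calibration data:

    ```python
    import numpy as np

    # Hypothetical calibration data: concentration (mg/L) vs. peak area
    conc = np.array([0.05, 0.5, 5.0, 25.0, 50.0])
    area = np.array([0.9, 10.2, 98.0, 505.0, 1012.0])

    w = 1.0 / conc**2                                # 1/x^2 weights favour the low end
    W = np.diag(w)
    X = np.column_stack([np.ones_like(conc), conc])  # [intercept, slope] design

    # Weighted least squares: beta = (X^T W X)^{-1} X^T W y
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ area)
    intercept, slope = beta
    print(f"area = {intercept:.3f} + {slope:.3f} * conc")

    # Back-calculate an unknown sample from its measured peak area
    unknown_area = 260.0
    print("estimated conc (mg/L):", (unknown_area - intercept) / slope)
    ```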

  3. Rapid quantification of imidazolium-based ionic liquids by hydrophilic interaction liquid chromatography: Methodology and an investigation of the retention mechanisms.

    Science.gov (United States)

    Hawkins, Cory A; Rud, Anna; Guthrie, Margaret L; Dietz, Mark L

    2015-06-26

The separation of nine N,N'-dialkylimidazolium-based ionic liquids (ILs) by an isocratic hydrophilic interaction high-performance liquid chromatographic method using an unmodified silica column was investigated. The chosen analytical conditions using a 90:10 acetonitrile-ammonium formate buffer mobile phase on a high-purity, unmodified silica column were found to be efficient, robust, and sensitive for the determination of ILs in a variety of solutions. The retention window (k' = 2-11) was narrower than that of previous methods, resulting in a 7-min runtime for the nine IL homologues. The lower limit of quantification of the method, 2-3 μmol L(-1), was significantly lower than those reported previously for HPLC-UV methods. The effects of systematically modifying the IL cation alkyl chain length, column temperature, and mobile-phase water and buffer content on solute retention were examined. Cation exchange was identified as the dominant retention mechanism for most of the solutes, with a distinct (single methylene group) transition to a dominant partitioning mode at the highest solute polarity.
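    For reference, the retention window quoted above is expressed in terms of the retention factor, defined in the usual chromatographic way (t_R: analyte retention time, t_0: column dead time):

    ```latex
    k' = \frac{t_R - t_0}{t_0}
    ```

    so k' = 2-11 means the IL homologues elute between roughly 3 and 12 dead-time units after injection.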

  4. Lung involvement quantification in chest radiographs; Quantificacao de comprometimento pulmonar em radiografias de torax

    Energy Technology Data Exchange (ETDEWEB)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M., E-mail: giacomini@ibb.unesp.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2014-12-15

Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess TB evolution. Methods for quantifying chest abnormalities are usually based on computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for quantifying lung damage caused by TB through chest radiographs. An algorithm for computational processing of the exams was developed in Matlab, creating a 3D representation of the lungs with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients through CT scans. The measurements from the two methods were compared, resulting in strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the developed method, providing a better risk-benefit ratio for the patient and cost-benefit ratio for the institution. (author)
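    A minimal sketch of the Bland-Altman agreement analysis mentioned above, assuming paired lesion-volume measurements from the two modalities; the arrays are hypothetical:

    ```python
    import numpy as np

    # Hypothetical paired lesion volumes (cm^3): radiograph- vs. CT-based method
    xr = np.array([120.0, 85.0, 210.0, 150.0, 95.0])
    ct = np.array([131.0, 90.0, 198.0, 160.0, 104.0])

    diff = xr - ct
    bias = diff.mean()                          # systematic difference between methods
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

    print(f"bias = {bias:.1f} cm^3, limits of agreement = [{loa[0]:.1f}, {loa[1]:.1f}]")
    # The methods agree if the differences fall within limits that are
    # narrow enough to be clinically acceptable.
    ```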

  5. Human error probability quantification using fuzzy methodology in nuclear plants; Aplicacao da metodologia fuzzy na quantificacao da probabilidade de erro humano em instalacoes nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Claudio Souza do

    2010-07-01

This work obtains Human Error Probability (HEP) estimates for operator actions in response to hypothetical emergency situations at the IEA-R1 research reactor at IPEN. An evaluation of Performance Shaping Factors (PSF) was also carried out in order to classify them according to their level of influence on the operator's actions and to determine the actual states of these PSFs in the plant. Both the HEP estimation and the PSF evaluation were based on specialist evaluation using interviews and questionnaires. The specialist group was composed of selected IEA-R1 operators. The specialists' knowledge was represented as linguistic variables, and group evaluation values were obtained through fuzzy logic and fuzzy set theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative for use in Human Reliability Analysis (HRA). (author)
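    A minimal sketch of the fuzzy machinery involved: triangular membership functions for a linguistic HEP variable and centroid defuzzification of an aggregated group opinion. The term labels, supports, and weights are illustrative assumptions, not the values elicited in the study:

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    # Illustrative linguistic terms for HEP on a log10 scale (-4 .. -1)
    x = np.linspace(-4, -1, 301)
    terms = {
        "low":    tri(x, -4.0, -3.5, -2.5),
        "medium": tri(x, -3.0, -2.5, -2.0),
        "high":   tri(x, -2.5, -1.5, -1.0),
    }

    # Suppose the specialist group endorses each term with these weights
    weights = {"low": 0.2, "medium": 0.7, "high": 0.1}

    # Aggregate (weighted max) and defuzzify by centroid
    agg = np.max([w * terms[t] for t, w in weights.items()], axis=0)
    centroid = np.trapz(agg * x, x) / np.trapz(agg, x)
    print(f"defuzzified log10(HEP) = {centroid:.2f} -> HEP ~ {10**centroid:.1e}")
    ```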

  6. An overview to CERSSO's self evaluation of the cost-benefit on the investment in occupational safety and health in the textile factories: "a step by step methodology".

    Science.gov (United States)

    Amador-Rodezno, Rafael

    2005-01-01

The Pan American Health Organization (PAHO) and CERSSO collaborated to develop a new Tool Kit (TK), which became available in May 2002. PAHO already had a TK in place, and CERSSO requested that one be developed for its needs. CERSSO wanted to enable managers and line workers in garment factories to self-diagnose plant and workstation hazards and to estimate the costs and benefits of investing in occupational safety and health (OSH) as a way to improve productivity and competitiveness. For consistency, the collaborating organizations agreed to construct the TK according to PAHO's methodology. The instrument was developed to be comprehensive enough that any user can collect the data easily. It integrates epidemiologic, risk assessment, clinical, engineering, and accountability issues, organized to include step-by-step training in: (a) performing risk assessments in the workplaces (risk factors); (b) establishing cause-effect relationships; (c) improving decision making on OSH interventions; (d) calculating direct and indirect costs and savings; and (e) calculating the overall cost-benefit of OSH interventions. Since July 2002, about 2,400 employees and officials from 736 garment factories, Ministries of Labor, Health, Social Security Institutes, and Technical Training Institutions of Central America and the Dominican Republic have used this instrument. They have systematically calculated a positive return on the investment (3 to 33 times). Employers are now aware of the financial rewards of investing in OSH. The TK is available in Spanish, Korean, and English. In July 2003, a software program in Spanish and English was developed (180 persons have been trained in the region), which requires less time to execute and offers better reliability.

  7. Grid of the Future: Quantification of Benefits from Flexible Energy Resources in Scenarios With Extra-High Penetration of Renewable Energy

    Energy Technology Data Exchange (ETDEWEB)

    Bebic, Jovan [General Electric International, Inc., Schenectady, NY (United States). Energy Consulting; Hinkle, Gene [General Electric International, Inc., Schenectady, NY (United States). Energy Consulting; Matic, Slobodan [General Electric International, Inc., Schenectady, NY (United States). Energy Consulting; Schmitt, William [General Electric International, Inc., Schenectady, NY (United States). Energy Consulting

    2015-01-15

The main objective of this study is to quantify the entitlement for system benefits attainable by pervasive application of flexible energy resources in scenarios with extra-high penetration of renewable energy. The quantified benefits include savings in thermal energy and reduction of CO2 emissions. Both are primarily a result of the displacement of conventional thermal generation by renewable energy production, but there are secondary improvements that arise from lowering operating reserves, removing transmission constraints, and partially removing energy-delivery losses due to energy production by distributed solar. The flexible energy resources in the context of this study include energy storage and adjustable loads. The flexibility of both was constrained to a time horizon of one day. In the case of energy storage this means that the state of charge is restored to its starting value at the end of each day, while for loads it means that the daily energy consumed is kept constant. Extra-high penetration of renewable energy in the context of this study means a level of penetration resulting in a significant number of hours where the instantaneous power output from renewable resources, added to the power output from the baseload nuclear fleet, surpasses the instantaneous power consumption by the load.

  8. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  9. Methodological Aspects of the IAEA State Level Concept and Acquisition Path Analysis: A State’s Nuclear Fuel Cycle, Related Capabilities, and the Quantification of Acquisition Paths

    International Nuclear Information System (INIS)

    Lance, K. Kim; Renda, Guido; Cojazzi, Giacomo G. M.

    2015-01-01

Within its State Level Concept (SLC), the International Atomic Energy Agency (IAEA) envisions a State Level Approach (SLA) for safeguards implementation that considers, inter alia, a State's nuclear and nuclear-related activities and capabilities as a whole when developing an annual implementation plan. Based on the assessed nuclear fuel cycle and related capabilities of a State, Acquisition Path Analysis (APA) identifies, characterizes, and prioritizes plausible routes for acquiring weapons-usable material to aid in safeguards implementation planning. A review of proposed APA methods and historical evidence indicates that assessments of pathway completion time can be fraught with uncertainty and subject to bias, potentially undermining safeguards effectiveness and efficiency. Based on considerations of theory and evidence, a number of methodological insights are identified to support consistent implementation and ongoing APA development. The use of algorithms to support APA and SLA processes in lieu of human judgement is a contentious issue requiring an evidence-based assessment and is also briefly discussed. This paper captures concepts derived primarily from open sources of information, including publications, presentations, and workshops on ongoing APA development by the IAEA and various Member States Support Programs (MSSP), as well as relevant work found in the open literature. While implementation of the SLA has begun for a number of States, SLAs are being updated and developed for other States. In light of these ongoing developments, the topics covered here should be considered a snapshot in time that does not reflect finished products and does not necessarily reflect official views.

  10. 76 FR 34270 - Federal-State Extended Benefits Program-Methodology for Calculating “on” or “off” Total...

    Science.gov (United States)

    2011-06-13

... requirement. The Department plans to promulgate regulations about this methodology in the near future. In the...--Methodology for Calculating "on" or "off" Total Unemployment Rate Indicators for Purposes of Determining..., Labor. ACTION: Notice. SUMMARY: UIPL 16-11 informs states of the methodology used to calculate the "on...

  11. Quantification of the volumetric benefit of image-guided radiotherapy (I.G.R.T.) in prostate cancer: Margins and presence probability map

    International Nuclear Information System (INIS)

    Cazoulat, G.; Crevoisier, R. de; Simon, A.; Louvel, G.; Manens, J.P.; Haigron, P.; Crevoisier, R. de; Louvel, G.; Manens, J.P.; Lafond, C.

    2009-01-01

Purpose: To quantify prostate and seminal vesicle (S.V.) anatomic variations in order to choose appropriate margins that include intrapelvic anatomic variations, and to quantify the volumetric benefit of image-guided radiotherapy (I.G.R.T.). Patients and methods: Twenty patients, receiving a total dose of 70 Gy to the prostate, had a planning CT scan and eight weekly CT scans during treatment. The prostate and S.V. were manually contoured. Each weekly CT scan was registered to the planning CT scan according to three modalities: radiopaque skin marks, pelvic bone, or prostate. For each patient, prostate and S.V. displacements were quantified, and 3-dimensional maps of prostate and S.V. presence probability were established. Volumes including minimal presence probabilities were compared between the three registration modalities. Results: For the intrapelvic prostate displacements, systematic and random variations and maximal displacements for the entire population were: 5 mm, 2.7 mm and 16.5 mm in the anteroposterior axis; 2.7 mm, 2.4 mm and 11.4 mm in the superoinferior axis; and 0.5 mm, 0.8 mm and 3.3 mm laterally. Margins according to the van Herk recipe (to cover the prostate for 90% of the patients with the 95% isodose) were 8 mm, 8.3 mm and 1.9 mm, respectively. The 100% prostate presence probability volumes correspond to 37%, 50% and 61% according to the registration modality. For the S.V., these volumes correspond to 8%, 14% and 18% of the S.V. volume. Conclusions: Without I.G.R.T., 5 mm prostate posterior margins are insufficient and should be at least 8 mm to account for intrapelvic anatomic variations. Prostate registration almost doubles the 100% presence probability volume compared to skin registration. Deformation of the S.V. will require either dramatically increased margins (simple) or new planning (not realistic). (authors)
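    For context, the van Herk margin recipe referred to above is commonly quoted as M = 2.5Σ + 0.7σ, where Σ and σ are the standard deviations of the systematic and random errors. As a rough check under this assumption, the lateral margin reported above is approximately reproduced:

    ```latex
    M = 2.5\,\Sigma + 0.7\,\sigma
    \qquad
    M_{\mathrm{lat}} \approx 2.5(0.5\,\mathrm{mm}) + 0.7(0.8\,\mathrm{mm}) \approx 1.8\,\mathrm{mm}
    ```

    which is close to the 1.9 mm quoted; the published margins may differ slightly from this textbook form due to rounding or additional terms.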

  12. Superposition Quantification

    Science.gov (United States)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002, the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182

  13. Comparative analysis of cost benefit division methodologies in a hydrothermal generation system; Analise comparativa de metodologias de reparticao de custos e beneficios num sistema de geracao hidrotermico

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, M V.F. [Pontificia Univ. Catolica do Rio de Janeiro, RJ (Brazil); Gorenstin, B G; Campodonico, N M; Costa, J.P. da; Kelman, J [Centro de Pesquisas de Energia Eletrica, Rio de Janeiro, RJ (Brazil)

    1990-12-31

The development and operation planning of the Brazilian generation system has been carried out in a coordinated way for several years through organizations in which the country's main generating companies take part. The sharing of system benefits among the participants in integrated planning and operation has aroused interest. This paper describes alternative forms of cost-benefit allocation between the companies participating in a coordinated operation, in order to achieve adequate remuneration and incentives. Two proposals for benefit allocation in energy export/import contracts were analysed: sharing by generation value and sharing by marginal benefit. It is concluded that the second best represents the contribution of the several factors comprising a hydroelectric power plant (storage capacity, effective storage and turbine capacity). (C.G.C.). 1 tab.

  14. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    International Nuclear Information System (INIS)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

Highlights: ► Survey of bio-analytical approaches utilizing biomolecule labelling. ► Detailed discussion of the methodology and chemistry of elemental labelling. ► Biomedical and bio-analytical applications of elemental labelling. ► FI-ICP-MS and LC–ICP-MS for quantification of elementally labelled biomolecules. ► Review of selected applications. - Abstract: This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS) and employed for the quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. The first sections give an overview of general aspects of biomolecule quantification and of labelling, emphasizing the potential of such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). The fundamental methodology of elemental labelling will be highlighted, and analytical as well as biomedical applications will be presented. A special focus will lie on established applications underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research in this field will be summarized and a perspective for future developments, including sophisticated and innovative applications, will be given.
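    As background, the isotope dilution (IDMS) principle mentioned above can be written in a textbook form. Assuming R denotes the amount ratio of the spike-enriched isotope to a reference isotope (R_x in the sample, R_s in the spike, R_m measured in the blend), and n_x, n_s are the amounts of the reference isotope contributed by the sample and the spike, a mass balance over the two isotopes gives:

    ```latex
    n_x = n_s \,\frac{R_s - R_m}{R_m - R_x}
    ```

    from which the analyte concentration follows via the isotopic abundances and the sample mass. This is a generic illustrative form; the review itself covers more elaborate species-specific and species-unspecific variants.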

  15. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

, multiple development goals can be reinforced by specific climate funding granted on the basis of multiple benefits and synergies, for instance through currently negotiated mechanisms such as Nationally Appropriate Mitigation Actions (NAMAs) (REDD+, Kissinger et al 2012). 3. Challenges to quantifying GHG information for the agricultural sector The quantification of GHG emissions from agriculture is fundamental to identifying mitigation solutions that are consistent with the goals of achieving greater resilience in production systems, food security, and rural welfare. GHG emissions data are already needed for such varied purposes as guiding national planning for low-emissions development, generating and trading carbon credits, certifying sustainable agriculture practices, informing consumers' choices with regard to reducing their carbon footprints, assessing product supply chains, and supporting farmers in adopting less carbon-intensive farming practices. Demonstrating the robustness, feasibility, and cost effectiveness of agricultural GHG inventories and monitoring is a necessary technical foundation for including agriculture in the international negotiations under the United Nations Framework Convention on Climate Change (UNFCCC), and is needed to provide robust data and methodology platforms for global corporate supply-chain initiatives (e.g., SAFA, FAO 2012). Given such varied drivers for GHG reductions, there are a number of uses for agricultural GHG information, including (1) reporting and accounting at the national or company level, (2) land-use planning and management to achieve specific objectives, (3) monitoring and evaluating impact of management, (4) developing a credible and thus tradable offset credit, and (5) research and capacity development. The information needs for these uses are likely to differ in the required level of certainty, scale of analysis, and need for comparability across systems or repeatability over time, and they may depend on whether

  16. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

Full Text Available Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP), but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction, extrapolation and forecasting are conducted largely in the context of a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry with reference to South Africa. The theory and methodology of quantification are briefly reviewed, and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience - often personal - of the researcher, and this scholarly endeavour is, therefore, not inclusive of all possible identified indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourism industry.

  17. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  18. Can distributed generation offer substantial benefits in a Northeastern American context? A case study of small-scale renewable technologies using a life cycle methodology

    International Nuclear Information System (INIS)

    Amor, Mourad Ben; Samson, Rejean; Lesage, Pascal; Pineau, Pierre-Olivier

    2010-01-01

Renewable distributed electricity generation can play a significant role in meeting today's energy policy goals, such as reducing greenhouse gas emissions and improving energy security, while adding supply to meet increasing energy demand. However, the exact potential benefits are still a matter of debate. The objective of this study is to evaluate the life cycle implications (environmental, economic and energy) of distributed generation (DG) technologies. A complementary objective is to compare the life cycle implications of DG technologies with those of the centralized electricity production representing the Northeastern American context. Environmental and energy implications are modeled according to the recommendations of the ISO 14040 standard, using different indicators: Human Health; Ecosystem Quality; Climate Change; Resources and Non-Renewable Energy Payback Ratio. Separately, economic implications are modeled using conventional life cycle costing. DG technologies include two types of grid-connected photovoltaic panels (3 kWp mono-crystalline and poly-crystalline) and three types of micro-wind turbines (1, 10 and 30 kW) modeled for average, below average and above average climatic conditions in the province of Quebec (Canada). A sensitivity analysis was also performed using different scenarios of centralized energy systems based on average and marginal (short- and long-term) technology approaches. Results show the following. First, climatic conditions (i.e., geographic location) have a significant effect on the results for the environmental, economic and energy indicators. More specifically, it was shown that the 30 kW micro-wind turbine is the best technology for above average conditions, while 3 kWp poly-crystalline photovoltaic panels are preferable for below average conditions. Second, the assessed DG technologies do not show benefits in comparison to the centralized Quebec grid mix (average technology approach). On the other hand, the 30 kW micro

  19. Can distributed generation offer substantial benefits in a Northeastern American context? A case study of small-scale renewable technologies using a life cycle methodology

    Energy Technology Data Exchange (ETDEWEB)

    Amor, Mourad Ben; Samson, Rejean [CIRAIG, Department of Chemical Engineering, P.O. Box 6079, Ecole Polytechnique de Montreal (Qc) (Canada); Lesage, Pascal [CIRAIG, Department of Chemical Engineering, P.O. Box 6079, Ecole Polytechnique de Montreal (Qc) (Canada); Sylvatica, 7379 St-Hubert, Montreal (Qc) (Canada); Pineau, Pierre-Olivier [HEC Montreal, 3000 Chemin de la Cote-Sainte-Catherine, Montreal (Qc) (Canada)

    2010-12-15

Renewable distributed electricity generation can play a significant role in meeting today's energy policy goals, such as reducing greenhouse gas emissions and improving energy security, while adding supply to meet increasing energy demand. However, the exact potential benefits are still a matter of debate. The objective of this study is to evaluate the life cycle implications (environmental, economic and energy) of distributed generation (DG) technologies. A complementary objective is to compare the life cycle implications of DG technologies with those of the centralized electricity production representing the Northeastern American context. Environmental and energy implications are modeled according to the recommendations of the ISO 14040 standard, using different indicators: Human Health; Ecosystem Quality; Climate Change; Resources and Non-Renewable Energy Payback Ratio. Separately, economic implications are modeled using conventional life cycle costing. DG technologies include two types of grid-connected photovoltaic panels (3 kWp mono-crystalline and poly-crystalline) and three types of micro-wind turbines (1, 10 and 30 kW) modeled for average, below average and above average climatic conditions in the province of Quebec (Canada). A sensitivity analysis was also performed using different scenarios of centralized energy systems based on average and marginal (short- and long-term) technology approaches. Results show the following. First, climatic conditions (i.e., geographic location) have a significant effect on the results for the environmental, economic and energy indicators. More specifically, it was shown that the 30 kW micro-wind turbine is the best technology for above average conditions, while 3 kWp poly-crystalline photovoltaic panels are preferable for below average conditions. Second, the assessed DG technologies do not show benefits in comparison to the centralized Quebec grid mix (average technology approach). On the other hand, the 30 kW micro

  20. Public health benefits of reducing air pollution in Shanghai: a proof-of-concept methodology with application to BenMAP.

    Science.gov (United States)

    Voorhees, A Scott; Wang, Jiandong; Wang, Cuicui; Zhao, Bin; Wang, Shuxiao; Kan, Haidong

    2014-07-01

In recent years, levels of particulate matter (PM) air pollution in China have been relatively high, exceeding China's Class II standards in many cities and impacting public health. This analysis takes Chinese health impact functions and underlying health incidence, applies 2010-2012 modeled and monitored PM air quality data, and estimates avoided cases of mortality and morbidity in Shanghai, assuming achievement of China's Class II air quality standards. In Shanghai, the estimated avoided all-cause mortality due to PM10 ranged from 13 to 55 cases per day and from 300 to 800 cases per year. The estimated avoided impact on hospital admissions due to PM10 ranged from 230 to 580 cases per day and from 5400 to 7900 per year. The estimated avoided impact on all-cause mortality due to PM2.5 ranged from 6 to 26 cases per day and from 39 to 1400 per year. The estimated impact on all-cause mortality of a year's exposure to an annual or monthly mean PM2.5 concentration ranged from 180 to 3500 per year. In Shanghai, the avoided cases of all-cause mortality had an estimated monetary value ranging from 170 million yuan (1 US dollar = 4.2 yuan at purchasing power parity) to 1200 million yuan. Avoided hospital admissions had an estimated value from 20 to 43 million yuan. Avoided emergency department visits had an estimated value from 5.6 million to 15 million yuan. Avoided outpatient visits had an estimated value from 21 million to 31 million yuan. In this analysis, available data were adequate to estimate avoided health impacts and assign monetary value. Sufficient supporting documentation was available to construct and format data sets for use in the United States Environmental Protection Agency's health and environmental assessment model, known as the Environmental Benefits Mapping and Analysis Program - Community Edition ("BenMAP-CE").
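    A minimal sketch of the log-linear concentration-response function commonly used in BenMAP-style analyses; the beta, baseline incidence, and population values below are illustrative placeholders, not the Shanghai inputs:

    ```python
    import numpy as np

    def avoided_cases(beta, delta_c, y0, pop):
        """Log-linear health impact function used in BenMAP-style studies.

        beta    : concentration-response coefficient (per ug/m^3)
        delta_c : reduction in pollutant concentration (ug/m^3)
        y0      : baseline incidence rate (cases per person per year)
        pop     : exposed population
        """
        return (1.0 - np.exp(-beta * delta_c)) * y0 * pop

    # Illustrative placeholders (not the study's inputs):
    beta = 0.0004        # ~0.04% mortality change per ug/m^3 PM
    delta_c = 15.0       # ug/m^3 reduction to reach the standard
    y0 = 0.006           # baseline all-cause mortality per person-year
    pop = 23_000_000     # exposed population

    cases = avoided_cases(beta, delta_c, y0, pop)
    value_per_case_yuan = 1_500_000   # hypothetical monetary value per case
    print(f"avoided deaths/yr ~ {cases:,.0f}, "
          f"value ~ {cases * value_per_case_yuan / 1e6:,.0f} M yuan")
    ```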

  1. Senior Benefits

    Science.gov (United States)

    Overview page for the Senior Benefits Program (Health and Social Services > Public Assistance), linking to the Senior Benefits Fact Sheet (June 2016).

  2. Quantification In Neurology

    Directory of Open Access Journals (Sweden)

    Netravati M

    2005-01-01

    There has been a distinct shift of emphasis in clinical neurology in the last few decades. A few years ago, it was sufficient for a clinician to precisely record history, document signs, establish a diagnosis and write a prescription. In the present context, there has been a significant intrusion of scientific culture into clinical practice. Several criteria have been proposed, refined and redefined to ascertain accurate diagnosis of many neurological disorders. Introduction of the concepts of impairment, disability, handicap and quality of life has added a new dimension to the measurement of health and disease, and neurological disorders are no exception. "Best guess" treatment modalities are no longer accepted, and evidence-based medicine has become an integral component of medical care. Traditional treatments need validation and new therapies require rigorous trials. Thus, proper quantification has become essential in both neurological practice and research methodology. While this is widely acknowledged, there is limited access to a comprehensive document on measurements in neurology. The following description is a critical appraisal of various measurements and also presents rating scales and scores commonly used in neurological practice.

  3. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic, because computer accuracy is limited. Inaccuracy can arise in different ways: for example, an error may be made when subtracting two numbers that are very close to each other, or when summing many numbers of very different magnitudes. The basic objective of this paper is to find a procedure that eliminates the errors a computer makes when calculations close to its precision limit are executed. A highly reliable system is represented by a directed acyclic graph composed of terminal nodes (the highly reliable input elements), internal nodes representing subsystems, and edges that bind these nodes together. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes is implemented in MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to the graph structure, which considers both independent and dependent (i.e., repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires the summation of many very different non-negative numbers, which may itself be a source of inaccuracy. For this reason, another algorithm, for the exact summation of such numbers, is designed in the paper. The summation procedure uses a special number system with base 2^32. The computational efficiency of the new computing methodology is compared with advanced simulation software, and various calculations on systems from the references are performed to emphasize its merits.
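    The numerical hazard the paper addresses, absorption of tiny terms when floating-point numbers of very different magnitudes are summed, is easy to reproduce. The sketch below illustrates the idea of exact summation using Python's arbitrary-precision rationals rather than the paper's base-2^32 digit system; the unavailability values are invented.

```python
from fractions import Fraction

# Unavailabilities spanning many orders of magnitude (invented values).
terms = [Fraction(1, 10**3), Fraction(1, 10**20), Fraction(1, 10**20)]

naive = 0.0
for t in terms:
    naive += float(t)            # floating point: the 1e-20 terms vanish

exact = sum(terms, Fraction(0))  # rational arithmetic: nothing is lost

print(naive == 1e-3)              # True -- the small terms were absorbed
print(exact - Fraction(1, 10**3)) # exactly 2/10**20, not zero
```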

  4. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    International Nuclear Information System (INIS)

    Seebauer, Matthias

    2014-01-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was carried out using the VCS SALM methodology, complemented with IPCC livestock emission factors, and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation both in the magnitude of the estimated GHG emissions per ha between different smallholder farm typologies and in the emissions estimated by applying the two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals; the mitigation benefits range between 4 and 6.5 tCO₂ ha⁻¹ yr⁻¹, with significantly different mitigation benefits depending on the typologies of the crop–livestock systems, their different agricultural practices, and adoption rates of improved practices. However, the inherent uncertainty in the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty in activity data, the assessment confirms the high variability both within different farm types and between the different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms. (paper)

  5. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.
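    The calibration-free quantification that makes dPCR attractive rests on Poisson statistics over the reaction partitions. A minimal sketch of that calculation follows; the droplet counts and partition volume are invented, not taken from the study.

```python
import math

def dpcr_concentration(n_total, n_negative, partition_volume_ul):
    """Estimate target concentration (copies/uL) from a digital PCR run.

    Partition occupancy follows a Poisson distribution, so the mean number
    of copies per partition is lambda = -ln(fraction of negative partitions).
    """
    frac_negative = n_negative / n_total
    lam = -math.log(frac_negative)     # mean copies per partition
    return lam / partition_volume_ul   # copies per microliter

# Example: 20,000 droplets of 0.85 nL each, 12,000 scored negative.
print(dpcr_concentration(20000, 12000, 0.85e-3))  # ~601 copies/uL
```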

  6. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    The in vivo quantification of metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra is the main subject under investigation in this work. Significant contributions based on artificial intelligence tools such as neural networks (NNs) have been presented lately with good results, but they show several drawbacks regarding quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation occurring under real conditions. Appropriate experiments on artificial MRS data have proved the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure.

  7. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)

    Papakostas, G A; Mertzios, B G; Karras, D A; Van Ormondt, D; Graveron-Demilly, D

    2011-01-01

    The in vivo quantification of metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra is the main subject under investigation in this work. Significant contributions based on artificial intelligence tools such as neural networks (NNs) have been presented lately with good results, but they show several drawbacks regarding quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation occurring under real conditions. Appropriate experiments on artificial MRS data have proved the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure
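    As a concrete illustration of casting quantification as an optimization problem, the sketch below uses a small genetic algorithm to fit a single Lorentzian lineshape to a noisy synthetic peak. It is a generic GA sketch, not the authors' algorithm; the signal, GA settings and parameter bounds are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def lorentzian(x, amp, pos, width):
    """Lorentzian lineshape (one of the models a quantifier might fit)."""
    return amp * width**2 / ((x - pos)**2 + width**2)

# Synthetic "spectrum": one metabolite peak plus noise (values invented).
x = np.linspace(0.0, 10.0, 512)
signal = lorentzian(x, 5.0, 4.2, 0.3) + rng.normal(0.0, 0.05, x.size)

def fitness(params):
    """Negative squared error between model and observed signal."""
    return -np.sum((signal - lorentzian(x, *params)) ** 2)

lo = np.array([0.0, 0.0, 0.01])    # lower bounds on (amp, pos, width)
hi = np.array([10.0, 10.0, 2.0])   # upper bounds
pop = rng.uniform(lo, hi, size=(60, 3))

for _ in range(100):
    scores = np.array([fitness(p) for p in pop])
    new_pop = [pop[scores.argmax()]]                 # elitism
    while len(new_pop) < len(pop):
        t = rng.integers(0, len(pop), 4)             # two binary tournaments
        a = pop[t[0]] if scores[t[0]] > scores[t[1]] else pop[t[1]]
        b = pop[t[2]] if scores[t[2]] > scores[t[3]] else pop[t[3]]
        w = rng.uniform(size=3)
        child = w * a + (1.0 - w) * b                # blend crossover
        child += rng.normal(0.0, 0.05, 3)            # Gaussian mutation
        new_pop.append(np.clip(child, lo, hi))
    pop = np.array(new_pop)

best = max(pop, key=fitness)
print(best)   # should approach the true (5.0, 4.2, 0.3)
```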

  8. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images, and to identify the conditions that must be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; phenomena that bias quantification. 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods and results (attenuation, scattering, partial volume effect, movement, non-stationary spatial resolution in SPECT, random coincidences in PET, standardisation in PET). 3 - Synthesis: accessible accuracy, know-how, precautions, beyond the activity measurement.
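    Among the corrections listed, attenuation is the most fundamental: photons emitted deeper in the body are exponentially more likely to be absorbed before reaching the detector. A minimal sketch of the first-order correction follows; the attenuation coefficients and path lengths are indicative textbook-style values, not calibrated data.

```python
import numpy as np

def attenuation_factor(mu_map, path_lengths):
    """Survival probability of a photon along one projection ray.

    mu_map: linear attenuation coefficients (1/cm) of the tissues crossed.
    path_lengths: distance (cm) travelled through each tissue.
    """
    return np.exp(-np.sum(np.asarray(mu_map) * np.asarray(path_lengths)))

# A 140 keV SPECT photon crossing 2 cm of bone and 8 cm of soft tissue.
f = attenuation_factor([0.25, 0.15], [2.0, 8.0])
corrected_counts = 1000 / f   # first-order correction of measured counts
print(f, corrected_counts)
```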

  9. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    Science.gov (United States)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    This article reviews novel quantification concepts in which elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS) and employed for the quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. The first sections present an overview of general aspects of biomolecule quantification, as well as of labelling, emphasizing the potential of such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples, and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). The fundamental methodology of elemental labelling is highlighted, and analytical as well as biomedical applications are presented. A special focus lies on established applications, underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research in this field is summarized and a perspective on future developments, including sophisticated and innovative applications, is given. PMID:23062431

  10. Stochastic approach for radionuclides quantification

    Science.gov (United States)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods that use empirical calibration against a standard to quantify the activity of nuclear materials, by determining a calibration coefficient, are useless on non-reproducible, complex and one-of-a-kind nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another, leading to high variability between objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as density, materials, screens, geometric shape, matrix composition, and matrix and source distributions; some of these depend strongly on knowledge of the package and on operator background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment or knowledge of the internal package configuration. The method combines a global stochastic approach, which uses, among others, surrogate models to simulate the gamma attenuation behaviour; a Bayesian approach, which considers conditional probability densities of the problem inputs; and Markov chain Monte Carlo (MCMC) algorithms, which solve the inverse problem given the gamma-ray emission spectrum of the radionuclides and the outside dimensions of the objects of interest. The methodology is being tested by quantifying actinide activity in standards of different matrix, composition and source configuration, in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
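    The Bayesian inversion at the heart of this approach can be sketched with a toy forward model: a single attenuated count-rate measurement from which the joint posterior of source activity and screen thickness is sampled by random-walk Metropolis. All constants, priors and the measurement below are invented; a real application would use the surrogate attenuation models the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Forward model: detected count rate from a source of activity A (Bq)
# behind a screen of unknown thickness t (cm). Constants are invented.
MU, EFF = 0.8, 1e-3
def forward(activity, thickness):
    return EFF * activity * np.exp(-MU * thickness)

measured, sigma = 3.2, 0.32   # counts/s, ~10% measurement uncertainty

def log_posterior(theta):
    activity, thickness = theta
    if not (0 < activity < 1e6 and 0 <= thickness < 10):
        return -np.inf                   # uniform priors act as bounds
    resid = measured - forward(activity, thickness)
    return -0.5 * (resid / sigma) ** 2   # Gaussian likelihood

# Random-walk Metropolis over the joint posterior of (A, t).
theta, samples = np.array([5000.0, 1.0]), []
for _ in range(20000):
    prop = theta + rng.normal(0, [200.0, 0.05])
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples[5000:])       # discard burn-in
print(samples.mean(axis=0), samples.std(axis=0))
```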

  11. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of the mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium … is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding …

  12. Analytical methodologies for the determination of nutraceuticals in foods

    International Nuclear Information System (INIS)

    Gatti, Rosanna; Masci, Domenica

    2015-01-01

    The term nutraceutical was coined almost thirty years ago (Stephen De Felice, 1989) from the union of the two terms nutrition and pharmaceutical. According to the definition, 'nutraceutical' refers to 'any substance that may be considered a food (or part of a food) and which provides medical or health benefits, including the prevention and/or treatment of a disease'. At the ENEA Casaccia Research Centre, analytical methodologies for the detection and quantification of nutraceutical substances are developed and validated. The aim is to characterize certain cultivars in relation to genotype, geographical production area and cultivation practices, or to assess how content is affected by preservation techniques and by the transport of raw materials and processed products.

  13. Design of an Integrated Methodology for Analytical Design of Complex Supply Chains

    Directory of Open Access Journals (Sweden)

    Shahid Rashid

    2012-01-01

    A literature review and gap analysis identifies key limitations of industry best practice in the modelling of supply chains. To address these limitations, the paper reports on the conception and development of an integrated modelling methodology designed to underpin the analytical design of complex supply chains. The methodology is based upon a systematic deployment of EM, CLD and SM techniques; their integration is achieved via common modelling concepts and decomposition principles. The methodology thereby facilitates: (i) graphical representation and description of key "processing", "resourcing" and "work flow" properties of supply chain configurations; (ii) behavioural exploration of currently configured supply chains, to facilitate reasoning about the impacts of uncertain demand on supply, make, delivery and return processes; (iii) predictive quantification of the relative performance of alternative complex supply chain configurations, including risk assessments. Guidelines for the application of each step of the methodology are described, along with recommended data collection methods and expected modelling outcomes for each step. The methodology is being extensively case tested to quantify its potential benefits and costs relative to current best industry practice. The paper reflects on preliminary benefits gained during industry-based case study modelling and identifies areas of potential improvement.

  14. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
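    Whichever spectroscopic readout is used, quantification ultimately runs through a standard curve built from known melanin concentrations. The sketch below shows that calibration-and-inversion step with invented fluorescence readings, not the paper's data.

```python
import numpy as np

# Synthetic standard curve: fluorescence intensity vs. synthetic-melanin
# concentration (ug/mL). All values are invented for illustration.
conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
fluo = np.array([2.0, 110.0, 215.0, 430.0, 862.0])

slope, intercept = np.polyfit(conc, fluo, 1)   # linear calibration fit

def melanin_content(sample_fluorescence):
    """Invert the calibration curve to estimate melanin in a sample."""
    return (sample_fluorescence - intercept) / slope

print(melanin_content(300.0))   # ~13.9 ug/mL for the invented curve
```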

  15. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of data assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any data assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
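    One workhorse for propagating both observation and model uncertainty is the stochastic ensemble Kalman filter, in which model error enters as random perturbations of the forecast ensemble. The following minimal scalar sketch (all noise levels invented) shows the mechanics; it is a generic illustration, not the author's method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal stochastic EnKF for a scalar random walk; additive perturbations
# stand in for model error. All values are invented.
n_ens, n_steps = 50, 40
truth, ensemble = 10.0, rng.normal(10.0, 2.0, n_ens)
obs_err, model_err = 1.0, 0.5

for _ in range(n_steps):
    truth += rng.normal(0, model_err)               # true state evolves
    ensemble += rng.normal(0, model_err, n_ens)     # forecast + model error
    y = truth + rng.normal(0, obs_err)              # noisy observation
    P = ensemble.var(ddof=1)                        # forecast variance
    K = P / (P + obs_err**2)                        # Kalman gain
    perturbed = y + rng.normal(0, obs_err, n_ens)   # perturbed observations
    ensemble += K * (perturbed - ensemble)          # analysis update

print(truth, ensemble.mean(), ensemble.std())
```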

  16. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  17. A posteriori uncertainty quantification of PIV-based pressure data

    NARCIS (Netherlands)

    Azijli, I.; Sciacchitano, A.; Ragni, D.; Palha Da Silva Clérigo, A.; Dwight, R.P.

    2016-01-01

    A methodology for a posteriori uncertainty quantification of pressure data retrieved from particle image velocimetry (PIV) is proposed. It relies upon the Bayesian framework, where the posterior distribution (probability distribution of the true velocity, given the PIV measurements) is obtained from

  18. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor … and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose …-scale problems, where efficient methods are necessary with today's computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix.

  19. Co-benefits of climate mitigation: Counting statistical lives or life-years?

    DEFF Research Database (Denmark)

    Andersen, Mikael Skou

    2017-01-01

    Making up for air pollution related mortality and accounting for the number of deaths has become an important environmental indicator in its own right, but differences across the Atlantic over how to account for these deaths are making it difficult to find common ground in climate policy appraisals, where … co-benefits from reducing the air pollution of fossil fuels are to be factored in. This article revisits established quantification methodologies for air pollution related mortality applied by government agencies in the USA and EU. Demographic lifetables are applied to explore uncertainties over latency … With a common OECD base value approach, the air pollution costs related to fossil fuels are found to be about 3 times lower with the EU versus the US methodology.
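    The lives-versus-life-years divide the article examines can be made concrete with a toy lifetable calculation: the same set of attributable deaths is valued once per statistical life and once per life-year lost. All numbers below (age bands, death counts, life expectancies, unit values) are invented placeholders, not the article's figures.

```python
# Deaths attributable to air pollution by age band, and the remaining
# life expectancy at each age (toy lifetable values).
deaths_by_age = {45: 10, 65: 40, 75: 80, 85: 70}
life_expectancy = {45: 36.0, 65: 19.0, 75: 11.5, 85: 6.0}

lives = sum(deaths_by_age.values())
life_years = sum(n * life_expectancy[age] for age, n in deaths_by_age.items())

VSL = 3.0e6    # assumed value of a statistical life (USD)
VOLY = 1.2e5   # assumed value of a life-year (USD)

print(lives, life_years)               # 200 deaths vs. 2460 life-years
print(lives * VSL, life_years * VOLY)  # the two conventions diverge sharply
```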

  20. Accelerating time to benefit

    DEFF Research Database (Denmark)

    Svejvig, Per; Geraldi, Joana; Grex, Sara

    Despite the ubiquitous pressure for speed, our approaches to accelerating projects remain constrained by the old-fashioned understanding of the project as a vehicle to deliver products and services, not value. This article explores an attempt to accelerate time to benefit. We describe and deconstruct … of the time. Although all cases valued speed and speed to benefit, and implemented most practices proposed by the methodology, only three of the five projects were more successful in decreasing time to benefit. Based on a multi-case study comparison between these five different projects and their respective …

  1. Image cytometry: nuclear and chromosomal DNA quantification.

    Science.gov (United States)

    Carvalho, Carlos Roberto; Clarindo, Wellington Ronildo; Abreu, Isabella Santiago

    2011-01-01

    Image cytometry (ICM) associates microscopy, digital imaging and software technologies, and has been particularly useful in spatial and densitometric cytological analyses, such as DNA ploidy and DNA content measurements. Basically, ICM integrates methodologies of optical microscope calibration, standard density filters, digital CCD cameras and image analysis software for quantitative applications. Beyond system calibration and setup, cytological protocols must provide good slide preparations for efficient and reliable ICM analysis. In this chapter, procedures for ICM applications employed in our laboratory are described. The protocols shown here for human DNA ploidy determination and for quantification of nuclear and chromosomal DNA content in plants can be used as described, or adapted for other studies.

  2. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  3. Multiple Benefits.

    Science.gov (United States)

    Kreider, Beth

    1997-01-01

    Discusses the benefits of dome architecture for a community's middle- and high-school multi-purpose facility. The dome construction is revealed as being cost effective in construction and in maintenance and energy costs. (GR)

  4. Quantification of lung fibrosis and emphysema in mice using automated micro-computed tomography.

    Directory of Open Access Journals (Sweden)

    Ellen De Langhe

    BACKGROUND: In vivo high-resolution micro-computed tomography allows for longitudinal image-based measurements in animal models of lung disease. The combination of repetitive high-resolution imaging with fully automated quantitative image analysis in mouse models of lung fibrosis benefits preclinical research. This study aimed to develop and validate such an automated micro-computed tomography analysis algorithm for quantification of aerated lung volume in mice, an indicator of pulmonary fibrosis and emphysema severity. METHODOLOGY: Mice received an intratracheal instillation of bleomycin (n = 8), elastase (0.25 U, n = 9; 0.5 U, n = 8) or saline control (n = 6 for fibrosis, n = 5 for emphysema). A subset of mice was scanned without intervention, to evaluate potential radiation-induced toxicity (n = 4). Some bleomycin-instilled mice were treated with imatinib for proof of concept (n = 8). Mice were scanned weekly until four weeks after induction, when they underwent pulmonary function testing, lung histology and collagen quantification. Aerated lung volumes were calculated with our automated algorithm. PRINCIPAL FINDINGS: Our automated image-based aerated lung volume quantification method is reproducible with low intra-subject variability. Bleomycin-treated mice had significantly lower scan-derived aerated lung volumes compared to controls. Aerated lung volume correlated with the histopathological fibrosis score and total lung collagen content. Inversely, a dose-dependent increase in lung volume was observed in elastase-treated mice. Serial scanning of individual mice is feasible and visualized dynamic disease progression. No radiation-induced toxicity was observed. Three-dimensional images provided critical topographical information. CONCLUSIONS: We report on a high-resolution in vivo micro-computed tomography image analysis algorithm that runs fully automated and allows quantification of aerated lung volume in mice. This
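    The core of such an aerated-volume measurement is a density threshold applied inside a lung segmentation, with the voxel count converted to a physical volume. The sketch below illustrates that step only; the thresholds, voxel size and synthetic volume are invented, and a real pipeline would add automated lung segmentation.

```python
import numpy as np

def aerated_lung_volume(ct_hu, lung_mask, voxel_volume_mm3,
                        lo=-1000, hi=-200):
    """Aerated lung volume from a micro-CT scan.

    Voxels inside the lung mask whose Hounsfield values fall in the
    air-dominated range [lo, hi) count as aerated; the thresholds are
    indicative, not the study's calibrated values.
    """
    aerated = (ct_hu >= lo) & (ct_hu < hi) & lung_mask
    return aerated.sum() * voxel_volume_mm3

# Synthetic 3-D volume standing in for a reconstructed scan.
rng = np.random.default_rng(3)
ct = rng.normal(-500, 250, (64, 64, 64))
mask = np.ones_like(ct, dtype=bool)
print(aerated_lung_volume(ct, mask, voxel_volume_mm3=0.035**3), "mm^3")
```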

  5. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating the overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data, and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric
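    The "uncertainty propagation equation" referred to above is, in essence, first-order Taylor propagation through the stereo reconstruction. The sketch below shows that generic mechanism on a deliberately simplified reconstruction formula (a symmetric two-camera geometry with half-angle alpha); it is not the paper's full mapping-function treatment, and all input values are invented.

```python
import numpy as np

def propagate_uncertainty(f, x, u_x, eps=1e-6):
    """First-order (Taylor) uncertainty propagation through a mapping f.

    x:   nominal input vector (e.g. planar velocities and camera angle)
    u_x: standard uncertainties of the inputs (assumed uncorrelated)
    Returns the standard uncertainties of the outputs of f.
    """
    x = np.asarray(x, float)
    y0 = np.atleast_1d(f(x))
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):      # numerical Jacobian, forward differences
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (np.atleast_1d(f(x + dx)) - y0) / eps
    return np.sqrt(J**2 @ np.asarray(u_x, float)**2)

# Toy stereo reconstruction of the out-of-plane component w from the two
# cameras' projected velocities u1, u2 and viewing half-angle alpha.
def recon(p):
    u1, u2, alpha = p
    return (u1 - u2) / (2.0 * np.tan(alpha))

print(propagate_uncertainty(recon, [5.0, 4.0, np.deg2rad(30)],
                            [0.1, 0.1, np.deg2rad(0.5)]))
```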

  6. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  7. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  8. Quantification results from an application of a new technique for human event analysis (ATHEANA) at a pressurized water reactor

    International Nuclear Information System (INIS)

    Whitehead, D.W.; Kolaczkowski, A.M.; Thompson, C.M.

    1998-05-01

    This paper presents results from the quantification of the three human failure events (HFEs) identified using the ATHEANA methodology as discussed in an earlier companion paper presented at this conference. Sections describe the quantification task, important basic events, and the results obtained from quantifying the three HFEs that were identified -- the first two of which were simulated at the Seabrook Station Simulator

  9. Benefits | NREL

    Science.gov (United States)

    Describes NREL's employee benefits package and work environment: a flexible work environment that enables and encourages a good work/life balance, and programs such as the annual Bike To Work Day, in which hundreds of NREL employees opt out of their cars and cycle to work.

  10. Fringe Benefits.

    Science.gov (United States)

    Podgursky, Michael

    2003-01-01

    Uses statistics from the National Center for Education Statistics and the Bureau of Labor Statistics to examine teacher salaries and benefits. Discusses compensation of teachers compared with nonteachers. Asserts that statistics from the American Federation of Teachers and the National Education Association underestimate teacher compensation…

  11. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in estimating the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (Nos. 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was designed essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for calculating absorbed doses in different tissues is explained.
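    In the MIRD schema, the absorbed dose to a target organ is the cumulated activity in each source organ times a tabulated S-value (dose per unit cumulated activity). A minimal sketch of that bookkeeping follows; the residence times and S-values are invented placeholders, not tabulated MIRD data.

```python
# MIRD-schema dose estimate: D(target) = sum over sources of A_tilde * S.
residence_time_h = {"liver": 2.5, "kidneys": 1.1}   # tau per source organ (h)
A0_MBq = 200.0                                       # administered activity

# Cumulated activity per source organ, in MBq*s.
A_tilde = {organ: A0_MBq * 3600.0 * tau
           for organ, tau in residence_time_h.items()}

# S-values (mGy per MBq*s) from each source organ to the liver -- invented.
S_to_liver = {"liver": 3.2e-6, "kidneys": 2.4e-7}

dose_liver = sum(A_tilde[src] * S_to_liver[src] for src in A_tilde)
print(f"{dose_liver:.1f} mGy to liver")
```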

  12. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  13. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  14. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data, and finally accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and supports importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because all event tree and fault tree models must be combined, and it requires an efficient computer code because of the long computation time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the use of KIRAP's cut set generator, and how to perform accident sequence quantification with KIRAP. (author). 6 refs

  15. Accident sequence quantification with KIRAP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data, and finally accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and supports importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because all event tree and fault tree models must be combined, and it requires an efficient computer code because of the long computation time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the use of KIRAP's cut set generator, and how to perform accident sequence quantification with KIRAP. (author). 6 refs.
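    The combinatorial core of accident sequence quantification, evaluating minimal cut sets and basic-event importances, is compact enough to sketch directly. The probabilities and cut sets below are invented, and the code is a generic illustration of the technique, not KIRAP's implementation.

```python
from itertools import combinations

# Basic-event probabilities and minimal cut sets (invented for illustration).
p = {"A": 1e-3, "B": 5e-4, "C": 2e-3, "D": 1e-4}
cut_sets = [("A", "B"), ("C",), ("B", "D")]

def prob(events):
    """Probability of all events in a cut set occurring (independence)."""
    out = 1.0
    for e in events:
        out *= p[e]
    return out

rare_event = sum(prob(cs) for cs in cut_sets)        # first-order estimate

# Second-order correction: subtract pairwise intersections.
second = sum(prob(set(a) | set(b)) for a, b in combinations(cut_sets, 2))
print(rare_event, rare_event - second)

# Fussell-Vesely importance of each basic event.
for e in p:
    fv = sum(prob(cs) for cs in cut_sets if e in cs) / rare_event
    print(e, round(fv, 4))
```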

  16. Use of cost benefit analysis methodology in the meaning of motorization level from small and medium hydroelectric power plants; Aplicacao de metodologia do tipo analise custo x beneficio na definicao do nivel de motorizacao de pequenas e medias centrais hidroeletricas

    Energy Technology Data Exchange (ETDEWEB)

    Mazzon, J.G.; Simoes, N.S.; Ramos, D.S.; Ishida, S. [Companhia Energetica de Sao Paulo, SP (Brazil)]

    1990-12-31

    The technical and economic justifications for reformulating the division of the head between the Lucas Nogueira Garcez and Capivara plants on the Paranapanema River (Brazil) are described, including an economic comparison of the Canoas (Alta), Canoas I and Canoas II sites, a motorization study and the energy benefits. The reasons for choosing bulb turbines and for defining the installed capacity according to the new reference economic parameters are also presented. (C.G.C.). 5 refs, 11 tabs.

  17. Who benefits?

    DEFF Research Database (Denmark)

    Hjorth, Frederik Georg

    2016-01-01

    Cross-border welfare rights for citizens of European Union member states are intensely contested, yet there is limited research into voter opposition to such rights, sometimes denoted 'welfare chauvinism'. We highlight an overlooked aspect in scholarly work: the role of stereotypes about beneficiaries … recipient identity. These effects are strongest among respondents high in ethnic prejudice and economic conservatism. The findings imply that stereotypes about who benefits from cross-border welfare rights condition public support for those rights.

  18. Cost-benefit

    International Nuclear Information System (INIS)

    1975-01-01

    A critical review is given of the cost-benefit analysis for the LMFBR-type reactor development program presented in an AEC environmental impact statement. Several methodological shortcomings are identified. Compared with an HTGR-type/LWR-type mix of reactors, the LMFBR-type reactor will not be competitive until U3O8 prices reach a level of $50/lb, which is not likely to happen before the year 2020. It is recommended to review the draft of the AEC document and to include timing as one of the issues. Deferral of the LMFBR-type reactor development program, if necessary, will not be intolerably costly

  19. Real-time PCR for the quantification of fungi in planta.

    Science.gov (United States)

    Klosterman, Steven J

    2012-01-01

    Methods enabling quantification of fungi in planta can be useful for a variety of applications. In combination with information on plant disease severity, indirect quantification of fungi in planta offers an additional tool in the screening of plants that are resistant to fungal diseases. In this chapter, a method is described for the quantification of DNA from a fungus in plant leaves using real-time PCR (qPCR). Although the method described entails quantification of the fungus Verticillium dahliae in lettuce leaves, the methodology described would be useful for other pathosystems as well. The method utilizes primers that are specific for amplification of a β-tubulin sequence from V. dahliae and a lettuce actin gene sequence as a reference for normalization. This approach enabled quantification of V. dahliae in the amount of 2.5 fg/ng of lettuce leaf DNA at 21 days following plant inoculation.
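    Quantification of this kind normalizes the pathogen signal to a plant reference gene. The sketch below shows the related delta-Ct normalization rather than the study's standard-curve calibration, and assumes roughly 100% amplification efficiency for both primer pairs; all Ct values are invented.

```python
def relative_abundance(ct_target, ct_reference):
    """Fungal DNA per unit plant DNA via the delta-Ct method.

    Assumes ~100% amplification efficiency for both primer pairs
    (pathogen target vs. plant reference, as in the beta-tubulin/actin
    pairing described in the chapter).
    """
    return 2.0 ** -(ct_target - ct_reference)

# Inoculated vs. mock-inoculated leaf samples (Ct values invented).
infected = relative_abundance(ct_target=27.5, ct_reference=20.0)
control = relative_abundance(ct_target=36.0, ct_reference=20.1)
print(infected, control, infected / control)
```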

  20. Digital PCR for direct quantification of viruses without DNA extraction

    OpenAIRE

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2015-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration mat...

  1. External Costs and Benefits of Energy. Methodologies, Results and Effects on Renewable Energies Competitivity; Costes y Beneficios Externos de la Energia. Metodologias, Resultados e Influencia sobre la Competitividad de las Energias Renovables

    Energy Technology Data Exchange (ETDEWEB)

    Saez, R.; Cabal, H.; Varela, M. [CIEMAT, Madrid (Spain)]

    1999-09-01

    This study attempts to give a summarised vision of the concept of externality in energy production, and of the social and economic usefulness of its evaluation and consideration as support for political decision-making in matters of environmental regulation, technology selection for new plants, establishment of priorities in energy plans, etc. The most relevant environmental externalities are described, such as the effects on health, ecosystems, materials and climate, as well as some of the socioeconomic externalities, such as employment, increase of GDP, and the reduction and depletion of energy resources. The different methodologies used during recent years are reviewed, as well as the principal results obtained in the most relevant studies carried out internationally on this topic. Special mention is given to the European study 'National Implementation of the ExternE Methodology in the EU'; the results obtained are presented in Table 2 of this study. The results obtained in the evaluation of the environmental externalities of the Spanish electricity system, as a function of the fuel cycle, are also summarised. In this last case the results are more approximate, since they were obtained by extrapolation from those for ten representative plants geographically distributed across the Peninsula. Finally, the influence that internalising the external costs of conventional energy can have on the competitiveness and market position of renewable energy, which causes fewer environmental effects and therefore much smaller external costs, is analysed. The mechanisms of internalisation, and whether or not external costs should be incorporated into the price of energy, are also discussed. (Author) 30 refs.

  2. Cost-benefit analysis of alternative LNG vapor-mitigation measures. Topical report, September 14, 1987-January 15, 1991

    International Nuclear Information System (INIS)

    Atallah, S.

    1992-01-01

    A generalized methodology is presented for comparing the costs and safety benefits of alternative hazard mitigation measures for a large LNG vapor release. The procedure involves quantification of the risk to the public before and after the application of LNG vapor mitigation measures. In the study, risk was defined as the product of the annual accident frequency, estimated from a fault tree analysis, and the severity of the accident. Severity was measured in terms of the number of people who may be exposed to a 2.5% or higher gas concentration. The ratios of the annual costs of the various mitigation measures to their safety benefits (determined as the difference between the risk before and after implementation of a mitigation measure) were then used to identify the most cost-effective approaches to vapor cloud mitigation
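    The report's ranking rule is easy to express directly: risk is annual frequency times severity, and each option is scored by its annual cost per unit of risk removed. The sketch below implements that arithmetic with invented placeholder numbers.

```python
# Cost/benefit screening of vapor-mitigation options, following the report's
# definition: risk = annual frequency x severity (people exposed >= 2.5% gas).
# All numbers are invented placeholders.
baseline = {"freq": 1e-4, "severity": 5000}   # unmitigated release

options = {
    "vapor fence":   {"freq": 1e-4, "severity": 1500, "annual_cost": 2.0e5},
    "foam system":   {"freq": 1e-4, "severity": 2500, "annual_cost": 0.8e5},
    "water curtain": {"freq": 1e-4, "severity": 2000, "annual_cost": 1.5e5},
}

risk0 = baseline["freq"] * baseline["severity"]
for name, m in options.items():
    risk = m["freq"] * m["severity"]
    benefit = risk0 - risk                     # avoided exposure per year
    print(name, m["annual_cost"] / benefit)    # cost per unit risk reduction
```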

  3. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  4. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  5. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, data on their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination for the quantification of the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the contents of FOS, GOS and GOS/FOS were in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structures (reflecting categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases also be interpreted as lexical quantifiers. Thus Bulgarian verb aspect is related, in different ways, both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that this scope remains constant in all derived alternations. The paper addresses the basic issues of these complex problems, while the detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  7. Methodology for evaluation of railroad technology research projects

    Science.gov (United States)

    1981-04-01

    This Project memorandum presents a methodology for evaluating railroad research projects. The methodology includes consideration of industry and societal benefits, with special attention given to technical risks, implementation considerations, and po...

  8. Quantification of informed opinion

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1985-01-01

    The objective of this session, Quantification of Informed Opinion, is to provide the statistician with a better understanding of this important area. The NRC uses informed opinion, sometimes called engineering judgment or subjective judgment, in many areas. Sometimes informed opinion is the only source of information that exists, especially in phenomenological areas, such as steam explosions, where experiments are costly and phenomena are very difficult to measure. There are many degrees of informed opinion. These vary from the weatherman who makes predictions concerning relatively high probability events with a large data base to the phenomenological expert who must use his intuition tempered with basic knowledge and little or no measured data to predict the behavior of events with a low probability of occurrence. The first paper in this session provides the reader with an overview of the subject area. The second paper provides some aspects that must be considered in the collection of informed opinion to improve the quality of the information. The final paper contains an example of the use of informed opinion in the area of seismic hazard characterization. These papers should be useful to researchers and statisticians who need to collect and use informed opinion in their work

  9. Estimating the Economic Benefits of Regional Ocean Observing Systems

    National Research Council Canada - National Science Library

    Kite-Powell, Hauke L; Colgan, Charles S; Wellman, Katharine F; Pelsoci, Thomas; Wieand, Kenneth; Pendleton, Linwood; Kaiser, Mark J; Pulsipher, Allan G; Luger, Michael

    2005-01-01

    We develop a methodology to estimate the potential economic benefits from new investments in regional coastal ocean observing systems in US waters, and apply this methodology to generate preliminary...

  10. An LWR design decision Methodology

    International Nuclear Information System (INIS)

    Leahy, T.J.; Rees, D.C.; Young, J.

    1982-01-01

    While all parties involved in nuclear plant regulation endeavor to make decisions which optimize the considerations of plant safety and financial impacts, these decisions are generally made without the benefit of a systematic and rigorous approach to the questions confronting the decision makers. A Design Decision Methodology has been developed which provides such a systematic approach. By employing this methodology, which makes use of currently accepted probabilistic risk assessment techniques and cost estimation, informed decisions may be made against a background of comparisons between the relative levels of safety and costs associated with various design alternatives

  11. The affect heuristic in judgments of risks and benefits

    International Nuclear Information System (INIS)

    Finucane, M.; Slovic, P.; Johnson, S.M.; Alhakami, A.

    1998-01-01

    The role of affect in judgments of risks and benefits is examined in two studies. Despite using different methodologies, the two studies suggest that risk and benefit are somehow linked in people's perception, consequently influencing their judgments. Short paper

  12. Flood Protection Through Landscape Scale Ecosystem Restoration- Quantifying the Benefits

    Science.gov (United States)

    Pinero, E.

    2017-12-01

    Hurricane Harvey illustrated the risks that storm surges pose to coastal areas, especially during severe storms. One way to address storm surges is to utilize the natural ability of offshore coastal land to dampen their severity. In addition to helping reduce storm surge intensity and related damage, restoring the land will generate numerous co-benefits such as carbon sequestration and water quality improvement. The session will discuss the analytical methodology that helps define the most resilient species to take root, and the calculation of quantified benefits. It will also address the quantification and monetization of benefits to make the business case for restoration. In 2005, Hurricanes Katrina and Rita damaged levees along the Gulf of Mexico, leading to major forest degradation, habitat deterioration and reduced wildlife use. As a result, this area lost an extensive amount of land, with contiguous sections of wetlands being converted to open water. The Restore the Earth Foundation's North American Amazon project intends to restore one million acres of forests and forested wetlands in the lower Mississippi River Valley. The area proposed for the first phase of this project was once a historic bald cypress forested wetland, which was degraded by increased salinity levels and extreme fluctuations in hydrology. The Terrebonne and Lafourche Parishes, the "bayou parishes", communities with a combined population of over 200,000, sit on thin fingers of land protected by surrounding swamps and wetlands, beyond which is the Gulf of Mexico. The Parishes depend on fishing, hunting, trapping, boat building, offshore oil and gas production and support activities. Yet these communities are highly vulnerable to risks from natural hazards and future land loss. The ground is at or near sea level and therefore easily inundated by storm surges if not protected by wetlands. While some communities are protected by a levee system, the Terrebonne and

  13. On uncertainty quantification in hydrogeology and hydrogeophysics

    Science.gov (United States)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.
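
    The record above surveys multilevel (Multi-level Monte Carlo) methods among other UQ tools. As a toy illustration (ours, not from the review), the sketch below estimates E[sin(X)] for X ~ N(0,1) with a multilevel estimator in which a cheap, coarse "solver" absorbs most of the sampling effort and finer levels only estimate corrections; the step-function approximation stands in for a real multi-resolution forward model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def f_level(x, level):
        """Stand-in for a forward solve at mesh width h = 2**-level."""
        h = 2.0 ** -level
        return np.floor(np.sin(x) / h) * h   # coarser levels are cheaper but more biased

    n_levels = 5
    samples = [4000, 2000, 1000, 500, 250]   # fewer samples on the expensive fine levels

    estimate = 0.0
    for l in range(n_levels):
        x = rng.normal(size=samples[l])
        if l == 0:
            estimate += f_level(x, 0).mean()                        # coarse baseline
        else:
            estimate += (f_level(x, l) - f_level(x, l - 1)).mean()  # level correction

    print("MLMC estimate of E[sin(X)]:", round(estimate, 3))  # true value is 0
    ```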

  14. Wider benefits of adult education

    DEFF Research Database (Denmark)

    Schuller, Tom; Desjardins, Richard

    2010-01-01

    This article discusses the measurement of the social outcomes of learning. It extends the discussion beyond employment and labor market outcomes to consider the impact of adult learning on social domains, with particular focus on health and civic engagement. It emphasizes the distinctions between public and private, and between monetary and nonmonetary, benefits. It reviews methodological issues in measuring outcomes, and identifies a number of channels through which adult learning has its effects.

  15. A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.

    MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), have demonstrated good performance, but not without drawbacks already discussed by the authors. On the other hand, a preliminary application of Genetic Algorithms (GA) to the peak detection problem encountered in MRS quantification using the Voigt line shape model has already been reported in the literature by the authors. This paper investigates a novel constrained genetic algorithm involving a generic, adaptively defined fitness function, which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experiments on artificial MRS signals interleaved with noise, regarding its signal-fitting capabilities. Although extensive experiments with real-world MRS signals are still necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
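
    As a rough illustration of the ingredients named in the abstract, the sketch below evolves the parameters of a noisy synthetic damped-oscillation signal under box constraints, with a fitness that stops rewarding fits once residuals fall below an assumed noise floor. The single-component model, bounds and noise estimate are hypothetical stand-ins, not the authors' Voigt-model implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def model(t, params):
        """Hypothetical single-component damped oscillation (FID-like)."""
        amp, freq, damp = params
        return amp * np.exp(-damp * t) * np.cos(2 * np.pi * freq * t)

    t = np.linspace(0.0, 1.0, 256)
    true_params = np.array([1.0, 12.0, 3.0])
    noise_std = 0.05
    signal = model(t, true_params) + rng.normal(0.0, noise_std, t.size)

    # Prior-knowledge box constraints on (amp, freq, damp)
    bounds = np.array([[0.1, 2.0], [5.0, 20.0], [0.5, 10.0]])

    def fitness(params):
        resid = signal - model(t, params)
        # "Adaptive" fitness: squared residuals below the noise floor carry no penalty
        return -np.mean(np.maximum(resid ** 2 - noise_std ** 2, 0.0))

    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(60, 3))
    for generation in range(200):
        scores = np.array([fitness(ind) for ind in pop])
        elite = pop[np.argsort(scores)[-20:]]                           # selection
        parents = elite[rng.integers(0, 20, size=(60, 2))]
        alpha = rng.random((60, 1))
        children = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # blend crossover
        children += rng.normal(0.0, 0.05, children.shape) * (bounds[:, 1] - bounds[:, 0])
        pop = np.clip(children, bounds[:, 0], bounds[:, 1])             # constraint enforcement

    best = max(pop, key=fitness)
    print("estimated (amp, freq, damp):", np.round(best, 2))
    ```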

  16. Validation of analytical methodology by HPLC for quantification and stability evaluation of sodium pantoprazole

    Directory of Open Access Journals (Sweden)

    Renata Platcheck Raffin

    2007-08-01

    Pantoprazole is a proton pump inhibitor used in the treatment of digestive ulcers, gastro-esophageal reflux disease and in the eradication of Helicobacter pylori. In this work, an analytical method was developed and validated for the quantification of sodium pantoprazole by HPLC. The method was specific, linear, precise and accurate. In order to verify the stability of pantoprazole during dissolution assays, a pantoprazole solution in phosphate buffer pH 7.4 was kept at room temperature, protected from light, for 22 days. Pantoprazole presented less than 5% degradation in 6 hours, and the half-life of the degradation was 124 h.
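
    As a quick consistency check on the reported figures, assuming first-order degradation kinetics (our assumption; the abstract does not state the kinetic model):

    ```python
    import math

    t_half = 124.0                    # reported degradation half-life, hours
    k = math.log(2) / t_half          # first-order rate constant, ~0.0056 1/h
    degraded_6h = 1.0 - math.exp(-k * 6.0)
    print(f"k = {k:.4f} 1/h; degradation after 6 h = {degraded_6h * 100:.1f}%")
    # ~3.3%, consistent with the reported "less than 5% degradation in 6 hours"
    ```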

  17. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical Trial Methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  18. Chromatic and anisotropic cross-recurrence quantification analysis of interpersonal behavior

    NARCIS (Netherlands)

    Cox, R.F.A; van der Steen, Stephanie; Guevara Guerrero, Marlenny; Hoekstra, Lisette; van Dijk, Marijn; Webber, Charles; Ioana, Cornel; Marwan, Norbert

    Cross-recurrence quantification analysis (CRQA) is a powerful nonlinear time-series method to study coordination and cooperation between people. This chapter concentrates on two methodological issues related to CRQA on categorical data streams, which are commonly encountered in the behavioral ...

  19. Environmental and Sustainability Education Policy Research: A Systematic Review of Methodological and Thematic Trends

    Science.gov (United States)

    Aikens, Kathleen; McKenzie, Marcia; Vaughter, Philip

    2016-01-01

    This paper reports on a systematic literature review of policy research in the area of environmental and sustainability education. We analyzed 215 research articles, spanning four decades and representing 71 countries, and which engaged a range of methodologies. Our analysis combines quantification of geographic and methodological trends with…

  20. Biomass to energy : GHG reduction quantification protocols and case study

    International Nuclear Information System (INIS)

    Reusing, G.; Taylor, C.; Nolan, W.; Kerr, G.

    2009-01-01

    With the growing concerns over greenhouse gases and their contribution to climate change, it is necessary to find ways of reducing environmental impacts by diversifying energy sources to include non-fossil fuel energy sources. Among the fastest growing green energy sources is energy from waste facilities that use biomass that would otherwise be landfilled or stockpiled. The quantification of greenhouse gas reductions through the use of biomass to energy systems can be calculated using various protocols and methodologies. This paper described each of these methodologies and presented a case study comparing some of these quantification methodologies. A summary and comparison of biomass to energy greenhouse gas reduction protocols in use or under development by the United Nations, the European Union, the Province of Alberta and Environment Canada was presented. It was concluded that regulatory and environmental pressures and public policy will continue to impact the practices associated with biomass processing or landfill operations, such as composting, or in the case of landfills, gas collection systems, thus reducing the amount of potential credit available for biomass to energy facility offset projects. 10 refs., 2 tabs., 6 figs.

  1. Biomass to energy : GHG reduction quantification protocols and case study

    Energy Technology Data Exchange (ETDEWEB)

    Reusing, G.; Taylor, C. [Conestoga - Rovers and Associates, Waterloo, ON (Canada); Nolan, W. [Liberty Energy, Hamilton, ON (Canada); Kerr, G. [Index Energy, Ajax, ON (Canada)

    2009-07-01

    With the growing concerns over greenhouse gases and their contribution to climate change, it is necessary to find ways of reducing environmental impacts by diversifying energy sources to include non-fossil fuel energy sources. Among the fastest growing green energy sources is energy from waste facilities that use biomass that would otherwise be landfilled or stockpiled. The quantification of greenhouse gas reductions through the use of biomass to energy systems can be calculated using various protocols and methodologies. This paper described each of these methodologies and presented a case study comparing some of these quantification methodologies. A summary and comparison of biomass to energy greenhouse gas reduction protocols in use or under development by the United Nations, the European Union, the Province of Alberta and Environment Canada was presented. It was concluded that regulatory and environmental pressures and public policy will continue to impact the practices associated with biomass processing or landfill operations, such as composting, or in the case of landfills, gas collection systems, thus reducing the amount of potential credit available for biomass to energy facility offset projects. 10 refs., 2 tabs., 6 figs.

  2. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    Science.gov (United States)

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies for metagenomic datasets to properly account for AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomic reads and the selected sequencing platform had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
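
    For orientation, the sketch below shows one common way to normalize AR gene abundance for gene length and sequencing depth when comparing metagenomes; it is a generic normalization with illustrative numbers, not the specific pipeline optimized in this study.

    ```python
    def ar_gene_copies_per_million_reads(reads_mapped, gene_len_bp,
                                         total_reads, read_len_bp=100):
        """Length- and depth-normalized abundance of one AR gene."""
        gene_copies = reads_mapped * read_len_bp / gene_len_bp   # mapped bases / gene length
        return gene_copies / total_reads * 1e6                   # per million sequenced reads

    # e.g., 250 reads of 100 bp mapped to a hypothetical 1.2 kb resistance gene
    # in a metagenome of 20 million reads:
    print(ar_gene_copies_per_million_reads(250, 1200, 20_000_000))  # ~1.04
    ```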

  3. Risk assessment methodology for Hanford high-level waste tanks

    International Nuclear Information System (INIS)

    Bott, T.F.; Mac Farlane, D.R.; Stack, D.W.; Kindinger, J.

    1992-01-01

    A methodology is presented for applying Probabilistic Safety Assessment techniques to quantification of the health risks posed by the high-level waste (HLW) underground tanks at the Department of Energy's Hanford reservation. This methodology includes hazard screening, development of a list of potential accident initiators, development and quantification of system fault trees, definition of source terms for various release categories, and estimation of health consequences from the releases. Both airborne and liquid-pathway releases to the environment, arising from aerosol and spill/leak releases from the tanks, are included in the release categories. The proposed methodology is intended to be applied to a representative subset of the total of 177 tanks, thereby providing a baseline risk profile for the HLW tank farm that can be used for setting clean-up/remediation priorities. Some preliminary results are presented for Tank 101-SY.

  4. Technical Report on Methodology: Cost Benefit Analysis and Policy Responses

    NARCIS (Netherlands)

    Pearce DW; Howarth A; MNV

    2001-01-01

    The economic assessment of priorities for a European environmental policy plan focuses on twelve identified Prominent European Environmental Problems such as climate change, chemical risks and biodiversity. The study, commissioned by the European Commission (DG Environment) to a European consortium ...

  5. USLE systematization of the factors in GIS for the quantification of laminar erosion in the Jirau River watershed

    Directory of Open Access Journals (Sweden)

    Elisete Guimarães

    2005-12-01

    The present paper demonstrates the use of the USLE (Universal Soil Loss Equation) in a GIS (Geographic Information System) as a tool for the quantification of soil losses by laminar erosion. The study area is the Jirau River watershed, located in the district of Dois Vizinhos, southwestern Parana. Our results present a contribution to the development and implementation of automated methodologies focused on the characterization, quantification, and control of the laminar erosion process.
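
    For reference, the USLE combines its standard factors multiplicatively, A = R × K × LS × C × P; the per-cell computation typically scripted in a GIS reduces to the sketch below, with illustrative placeholder values rather than the watershed's actual factors.

    ```python
    def usle_soil_loss(R, K, LS, C, P):
        """A = R * K * LS * C * P: mean annual soil loss, e.g. in t/ha/yr."""
        return R * K * LS * C * P

    A = usle_soil_loss(R=6000.0,  # rainfall erosivity
                       K=0.03,    # soil erodibility
                       LS=1.2,    # combined slope length/steepness factor
                       C=0.2,     # cover-management factor
                       P=1.0)     # support-practice factor
    print(f"estimated soil loss: {A:.1f} t/ha/yr")  # 43.2 with these placeholders
    ```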

  6. Nuclear Data Uncertainty Quantification: Past, Present and Future

    International Nuclear Information System (INIS)

    Smith, D.L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  7. Nuclear Data Uncertainty Quantification: Past, Present and Future

    Science.gov (United States)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  8. QUANTIFICATION OF ANGIOGENESIS IN THE CHICKEN CHORIOALLANTOIC MEMBRANE (CAM

    Directory of Open Access Journals (Sweden)

    Silvia Blacher

    2011-05-01

    The chick chorioallantoic membrane (CAM) provides a suitable in vivo model to study angiogenesis and to evaluate several pro- and anti-angiogenic factors and compounds. In the present work, new developments in image analysis are used to quantify the CAM angiogenic response from optical microscopic observations, covering all vascular components, from the large supplying and feeding vessels down to the capillary plexus. To validate our methodology, angiogenesis is quantified during two phases of CAM development (days 7 and 13) and after treatment with an anti-angiogenic modulator. Our morphometric analysis emphasizes that an accurate quantification of the CAM vasculature needs to be performed at various scales.

  9. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  10. QUANTIFYING BENEFITS FOR COST-BENEFIT ANALYSIS

    OpenAIRE

    Attila GYORGY; Nicoleta VINTILA; Florian GAMAN

    2014-01-01

    Cost Benefit Analysis is one of the most widely used financial tools for selecting future investment projects in the public and private sectors. The method is based on comparing costs and benefits in terms of constant prices. While costs are easier to predict and monetize, benefits should be identified not only in direct relation to the investment, but also by widening the sphere of analysis to indirect benefits experienced by the neighbouring community or society as a whole. During finan...
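
    A minimal sketch of the core comparison described above: discount cost and benefit streams to present values at constant prices and form the benefit-cost ratio. The cash flows and discount rate are illustrative, not taken from the paper.

    ```python
    def present_value(flows, rate):
        """Discount a list of yearly flows (year 0 first) to present value."""
        return sum(f / (1.0 + rate) ** t for t, f in enumerate(flows))

    costs    = [100.0, 10.0, 10.0, 10.0, 10.0]   # investment in year 0, then O&M
    benefits = [0.0, 40.0, 40.0, 40.0, 40.0]     # direct plus indirect benefits
    rate = 0.05

    bcr = present_value(benefits, rate) / present_value(costs, rate)
    npv = present_value(benefits, rate) - present_value(costs, rate)
    print(f"BCR = {bcr:.2f}, NPV = {npv:.1f}")   # BCR > 1 favors the project
    ```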

  11. Organic and total mercury determination in sediments by cold vapor atomic absorption spectrometry: methodology validation and uncertainty measurements

    Directory of Open Access Journals (Sweden)

    Robson L. Franklin

    2012-01-01

    The purpose of the present study was to validate a method for organic Hg determination in sediment. The procedure for organic Hg was adapted from the literature: the organomercurial compounds were extracted with dichloromethane in acid medium, with subsequent destruction of the organic compounds by bromine chloride. Total Hg determination was performed according to USEPA methodology 3051A. Mercury quantification for both methodologies was then performed by CVAAS. Methodology validation was verified by analyzing certified reference materials for total Hg and methylmercury. The uncertainties for both methodologies were calculated. A quantification limit of 3.3 µg kg-1 was found for organic Hg by CVAAS.

  12. The measurement of employment benefits

    International Nuclear Information System (INIS)

    Burtraw, D.

    1994-01-01

    The consideration of employment effects and so-called 'hidden employment benefits' is one of the most confused and contentious issues in benefit-cost analysis and applied welfare economics generally. New investments create new employment opportunities, and advocates for specific investments often cite these employment opportunities as alleged benefits associated with the project. Indeed, from the local perspective, such employment opportunities may appear to be beneficial because they appear to come for free. If there is unemployment in the local area, then new investments create valuable employment opportunities for those in the local community. Even if there is full employment in the local area, new investments create incentives for immigration from other locations, which may have pecuniary benefits locally through increased property values, business revenues, etc. The focus in this study is on net economic benefits from a broad national perspective. From this perspective, many of the alleged employment benefits at the local level are offset by lost benefits at other locales, and do not count as benefits according to economic theory. This paper outlines a methodology for testing this rebuttable presumption with empirical data pertaining to labor markets that would be affected by a specific new investment. The relevant theoretical question is whether the social opportunity cost of new employment is less than the market wage. This would be the case, for example, if one expects unemployment or underemployment to persist in a specific region of the economy or occupational category affected by the new investment. In this case, new employment opportunities produce a net increase in social wealth rather than just a transfer of income.

  13. The measurement of employment benefits

    Energy Technology Data Exchange (ETDEWEB)

    Burtraw, D

    1994-07-01

    The consideration of employment effects and so-called 'hidden employment benefits' is one of the most confused and contentious issues in benefit-cost analysis and applied welfare economics generally. New investments create new employment opportunities, and advocates for specific investments often cite these employment opportunities as alleged benefits associated with the project. Indeed, from the local perspective, such employment opportunities may appear to be beneficial because they appear to come for free. If there is unemployment in the local area, then new investments create valuable employment opportunities for those in the local community. Even if there is full employment in the local area, new investments create incentives for immigration from other locations, which may have pecuniary benefits locally through increased property values, business revenues, etc. The focus in this study is on net economic benefits from a broad national perspective. From this perspective, many of the alleged employment benefits at the local level are offset by lost benefits at other locales, and do not count as benefits according to economic theory. This paper outlines a methodology for testing this rebuttable presumption with empirical data pertaining to labor markets that would be affected by a specific new investment. The relevant theoretical question is whether the social opportunity cost of new employment is less than the market wage. This would be the case, for example, if one expects unemployment or underemployment to persist in a specific region of the economy or occupational category affected by the new investment. In this case, new employment opportunities produce a net increase in social wealth rather than just a transfer of income.

  14. Big Data Analytics Methodology in the Financial Industry

    Science.gov (United States)

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  15. Resonance self-shielding effect in uncertainty quantification of fission reactor neutronics parameters

    International Nuclear Information System (INIS)

    Chiba, Go; Tsuji, Masashi; Narabayashi, Tadashi

    2014-01-01

    In order to properly quantify fission reactor neutronics parameter uncertainties, we have to use covariance data and sensitivity profiles consistently. In the present paper, we establish two consistent methodologies for uncertainty quantification: a self-shielded cross-section-based consistent methodology and an infinitely-diluted cross-section-based consistent methodology. With these methodologies and the covariance data of the uranium-238 nuclear data given in JENDL-3.3, we quantify uncertainties of infinite neutron multiplication factors of light water reactor and fast reactor fuel cells. While an inconsistent methodology gives results which depend on the energy group structure of the neutron flux and the neutron-nuclide reaction cross section representation, both consistent methodologies give fair results with no such dependences.
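
    The first-order core of such uncertainty quantification is the "sandwich rule," which propagates a cross-section covariance matrix through sensitivity coefficients. A toy sketch follows; the three-group numbers are illustrative, not the JENDL-3.3 uranium-238 covariances.

    ```python
    import numpy as np

    # Relative sensitivities of k_inf to a cross section, by energy group
    S = np.array([0.10, -0.25, 0.05])
    # Relative covariance matrix of that cross section (toy values)
    C = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])

    rel_var = S @ C @ S                      # sandwich rule: var = S^T C S
    print(f"relative uncertainty of k_inf: {np.sqrt(rel_var):.4f}")  # ~0.074, i.e. ~7.4%
    ```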

  16. RESONANCE SELF-SHIELDING EFFECT IN UNCERTAINTY QUANTIFICATION OF FISSION REACTOR NEUTRONICS PARAMETERS

    Directory of Open Access Journals (Sweden)

    GO CHIBA

    2014-06-01

    In order to properly quantify fission reactor neutronics parameter uncertainties, we have to use covariance data and sensitivity profiles consistently. In the present paper, we establish two consistent methodologies for uncertainty quantification: a self-shielded cross-section-based consistent methodology and an infinitely-diluted cross-section-based consistent methodology. With these methodologies and the covariance data of the uranium-238 nuclear data given in JENDL-3.3, we quantify uncertainties of infinite neutron multiplication factors of light water reactor and fast reactor fuel cells. While an inconsistent methodology gives results which depend on the energy group structure of the neutron flux and the neutron-nuclide reaction cross section representation, both consistent methodologies give fair results with no such dependences.

  17. Detection and Quantification of Genetically Modified Soybean in Some Food and Feed Products. A Case Study on Products Available on Romanian Market

    Directory of Open Access Journals (Sweden)

    Elena Rosculete

    2018-04-01

    The aim of this paper is to trace genetically modified soybean in food and feed products present on the Romanian market by using molecular extraction, identification and quantification methodologies. Nine samples (three food samples, five soybean samples and one soybean meal) were analysed using the classical and real-time polymerase chain reaction (PCR) methods. GMO DNA was not detected in two of the three analysed food samples. However, it could be found in four samples at levels below the 0.9% limit, and in three samples above the 0.9% limit. The results obtained through real-time PCR quantification show that DNA-RRS was detectable in different amounts in different samples: ranging between 0.27% and 9.36% in soy beans, and reaching 50.98% in soybean meal. The current research focuses on how products containing GMO above the limit are differentiated on the market (products containing more than 0.9% genetically modified DNA must be labelled), with a view to labelling food and feed products with respect to the accidental presence of approved genetically modified plants. The benefits brought by genetic engineering in obtaining genetically modified organisms must be balanced against their public acceptance and certain known or unknown risks that they may bring.

  18. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  19. GO methodology. Volume 1. Overview manual

    International Nuclear Information System (INIS)

    1983-06-01

    The GO methodology is a success-oriented probabilistic system performance analysis technique. The methodology can be used to quantify system reliability and availability, identify and rank critical components and the contributors to system failure, construct event trees, and perform statistical uncertainty analysis. Additional capabilities of the method currently under development will enhance its use in evaluating the effects of external events and common-cause failures on system performance. This Overview Manual provides a description of the GO methodology, how it can be used, and the benefits of using it in the analysis of complex systems.

  20. Benefits of Java

    Science.gov (United States)

    Reviewed by Taylor Wolfram, MS, RDN. ... your daily cup (or three) provides some health benefits as well. Drinking moderate amounts of coffee (including ...

  1. Benefits of quitting tobacco

    Science.gov (United States)

    ... your risk of many serious health problems. THE BENEFITS OF QUITTING: You may enjoy the following when ... about $2,000 a year on cigarettes. HEALTH BENEFITS: Some health benefits begin almost immediately. Every week, ...

  2. Review of some aspects of human reliability quantification

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.; Spurgin, A.J.; Hannaman, G.W.; Lukic, Y.D.

    1986-01-01

    An area of systems reliability considered to be weak is the characterization and quantification of the role of the operations and maintenance staff in combating accidents. Several R and D programs are underway to improve the modeling of human interactions, and some progress has been made. This paper describes a specific aspect of human reliability analysis referred to as the modeling of cognitive processes. In particular, the basis for the so-called Human Cognitive Reliability (HCR) model is described, with a focus on its validation and on its benefits and limitations.

  3. Electronics Environmental Benefits Calculator

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Electronics Environmental Benefits Calculator (EEBC) was developed to assist organizations in estimating the environmental benefits of greening their purchase,...

  4. Tissue quantification for development of pediatric phantom

    International Nuclear Information System (INIS)

    Alves, A.F.F.; Miranda, J.R.A.; Pina, D.R.

    2013-01-01

    The optimization of the risk-benefit ratio is a major concern in pediatric radiology, due to the greater vulnerability of children to the late somatic and genetic effects of radiation exposure compared to adults. In Brazil, it is estimated that head trauma accounts for 18% of deaths in the 1-5 year age group, and radiography is the primary diagnostic test for the detection of skull fracture. Knowing that image quality is essential to ensure the identification of anatomical structures and to minimize diagnostic interpretation errors, this paper proposes the development and construction of homogeneous skull phantoms for the 1-5 year age group. The construction of the homogeneous phantoms was performed using the classification and quantification of the tissues present in the skulls of pediatric patients. In this procedure, computational algorithms written in Matlab were used to quantify the distinct biological tissues present in the anatomical regions studied, using retrospective CT images. Preliminary data obtained from the measurements show that, between the ages of 1 and 5 years, an average anteroposterior diameter of the pediatric skull region of 145.73 ± 2.97 mm can be represented by 92.34 ± 5.22 mm of Lucite and 1.75 ± 0.21 mm of aluminum plates in a patient-equivalent phantom (PEP) arrangement. After their construction, the phantoms will be used for image and dose optimization of pediatric computed radiography examination protocols.

  5. Methodologies for tracking learning paths

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth; Gilje, Øystein; Lindstrand, Fredrik

    2009-01-01

    ... filmmakers: what furthers their interest and/or hinders it, and what learning patterns emerge. The aim of this article is to present and discuss issues regarding the methodology and methods of the study, such as developing a relationship with interviewees when conducting interviews online (using MSN). We suggest two considerations about using online interviews: how the interviewees value the given subject of conversation, and their familiarity with being online. Online communication with the young filmmakers offers ease, because it is both practical and appropriate for arranging a meeting ...

  6. Safety in relation to risk and benefit

    International Nuclear Information System (INIS)

    Siddall, E.

    1985-01-01

    The proper definition and quantification of human safety are discussed, and from this basis the historical development of our present very high standard of safety is traced. It is shown that increased safety is closely associated with increased wealth, and the quantitative relationship between them is derived from different sources of evidence. When this factor is applied to the production of wealth by industry, a safety benefit is indicated which exceeds the asserted risks by orders of magnitude. It is concluded that present policies and attitudes with respect to the safety of industry may be diametrically wrong. (orig.) [de]

  7. Economic benefits of metrology in manufacturing

    DEFF Research Database (Denmark)

    Savio, Enrico; De Chiffre, Leonardo; Carmignato, S.

    2016-01-01

    In streamlined manufacturing systems, the added value of inspection activities is often questioned, and metrology in particular is sometimes considered only as an avoidable expense. Documented quantification of the economic benefits of metrology is generally not available. This work presents concrete examples from industrial production, in which the added value of metrology in manufacturing is discussed and quantified. Case studies include general manufacturing, forging, machining, and related metrology. The focus of the paper is on the improved effectiveness of metrology when used at the product and process design stages, as well as on the improved accuracy and efficiency of manufacturing through better measuring equipment and process chains with integrated metrology for process control.

  8. Methods for cost-benefit-risk analysis of material-accounting upgrades

    International Nuclear Information System (INIS)

    Fishbone, L.G.; Gordon, D.M.; Higinbotham, W.; Keisch, B.

    1988-01-01

    The authors have developed a cost-benefit-risk methodology for evaluating material-accounting upgrades at key measurement points in nuclear facilities. The focus of this methodology is on nuclear-material measurements and their effects on inventory differences and shipper/receiver differences. The methodology has three main components: costs, benefits, and risk factors. The fundamental outcome of the methodology is therefore a set of cost-benefit ratios characterizing the proposed upgrades, with the risk factors applied as necessary to the benefits. Examples illustrate the methodology's use.

  9. Geostatistical methodology for waste optimization of contaminated premises - 59344

    International Nuclear Information System (INIS)

    Desnoyers, Yvon; Dubot, Didier

    2012-01-01

    The methodological study presented here illustrates a geostatistical approach suitable for radiological evaluation in nuclear premises. The waste characterization is mainly focused on floor concrete surfaces. By modeling the spatial continuity of activities, geostatistics provides sound methods to estimate and map radiological activities, together with their uncertainty. The multivariate approach allows the integration of numerous surface radiation measurements in order to improve the estimation of activity levels from concrete samples. In this way, a sequential and iterative investigation strategy proves to be relevant to fulfill the different evaluation objectives. Waste characterization is performed on risk maps rather than on direct interpolation maps (due to the selection bias on kriging results). The use of several estimation supports (point, 1 m2, room) allows a relevant radiological waste categorization through cost-benefit analysis according to the risk of exceeding a given activity threshold. Global results, mainly total activity, are quantified in a similar way to guide the waste management of the dismantling and decommissioning project at an early stage. This paper recalls the principles of geostatistics and demonstrates how this methodology provides innovative tools for the radiological evaluation of contaminated premises. The relevance of this approach relies on the presence of spatial continuity in the radiological contamination. In this case, geostatistics provides reliable activity estimates, uncertainty quantification and risk analysis, which are essential decision-making tools for decommissioning and dismantling projects of nuclear installations. Waste characterization is then performed taking all relevant information into account: historical knowledge, surface measurements and samples. Thanks to the multivariate processing, the different investigation stages can be rationalized as regards quantity and positioning. Waste characterization is finally ...
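
    A minimal sketch of the step that turns kriging outputs into a risk map: under a Gaussian assumption, the kriged estimate and kriging standard deviation at each support give the probability of exceeding the activity threshold used for waste categorization. The values are toy numbers, not from the study; the sketch requires SciPy.

    ```python
    import numpy as np
    from scipy.stats import norm

    z_hat = np.array([0.8, 1.5, 2.4])    # kriged activity estimates (e.g. Bq/g) per support
    sigma = np.array([0.3, 0.4, 0.5])    # kriging standard deviations
    threshold = 1.0                      # waste-category activity threshold

    p_exceed = 1.0 - norm.cdf((threshold - z_hat) / sigma)
    for z, p in zip(z_hat, p_exceed):
        print(f"estimate {z:.1f} -> P(activity > {threshold}) = {p:.2f}")
    ```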

  10. Quantitative Assessment of Distributed Energy Resource Benefits

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, S.W.

    2003-05-22

    Distributed energy resources (DER) offer many benefits, some of which are readily quantified. Other benefits, however, are less easily quantifiable because they may require site-specific information about the DER project or analysis of the electrical system to which the DER is connected. The purpose of this study is to provide analytical insight into several of the more difficult calculations, using the PJM power pool as an example. This power pool contains most of Pennsylvania, New Jersey, Maryland, and Delaware. The techniques used here could be applied elsewhere, and the insights from this work may encourage various stakeholders to more actively pursue DER markets or to reduce obstacles that prevent the full realization of its benefits. This report describes methodologies used to quantify each of the benefits listed in Table ES-1. These methodologies include bulk power pool analyses, regional and national marginal cost evaluations, as well as a more traditional cost-benefit approach for DER owners. The methodologies cannot however determine which stakeholder will receive the benefits; that must be determined by regulators and legislators, and can vary from one location to another.

  11. CIAU methodology and BEPU applications

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2009-01-01

    Best-estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore, the use of best-estimate codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and deficiencies of those codes. Uncertainties may have different origins, ranging from the approximation of the models, to the approximation of the numerical solution, to the lack of precision of the values adopted for boundary and initial conditions. The amount of uncertainty that affects a calculation may strongly depend upon the codes and the modeling techniques (i.e., the codes' users). A consistent and robust uncertainty methodology must be developed taking all the above aspects into consideration. The CIAU (Code with the capability of Internal Assessment of Uncertainty) and the UMAE (Uncertainty Methodology based on Accuracy Evaluation) methods have been developed by the University of Pisa (UNIPI) in the framework of long-lasting research activities started in the 1980s and involving several researchers. CIAU is extensively discussed in the available technical literature, Refs. [1, 2, 3, 4, 5, 6, 7], and tens of additional relevant papers that provide comprehensive details about the method can be found in the bibliography lists of the above references. Therefore, the present paper supplies only 'spot' information about CIAU and focuses mostly on applications to some cases of industrial interest. In particular, the application of CIAU to the OECD BEMUSE (Best Estimate Methods Uncertainty and Sensitivity Evaluation, [8, 9]) project is discussed, along with a critical comparison with other uncertainty methods (in relation to items like sources of uncertainties, selection of the input parameters and quantification of ...

  12. Comparison of Greenhouse Gas Offset Quantification Protocols for Nitrogen Management in Dryland Wheat Cropping Systems of the Pacific Northwest

    Directory of Open Access Journals (Sweden)

    Tabitha T. Brown

    2017-11-01

    In the carbon market, greenhouse gas (GHG) offset protocols need to ensure that emission reductions are of high quality, quantifiable, and real. Lack of consistency across protocols in quantifying emission reductions compromises the credibility of the offsets generated. Thus, protocol quantification methodologies need to be periodically reviewed to ensure emission offsets are credited accurately, and updated to support practical climate policy solutions. Current GHG emission offset credits generated by agricultural nitrogen (N) management activities are based on reducing the annual N fertilizer application rate for a given crop without reducing yield. We performed a "road test" of agricultural N management protocols to evaluate differences among protocol components and to quantify nitrous oxide (N2O) emission reductions under sample projects relevant to N management in dryland, wheat-based cropping systems of the inland Pacific Northwest (iPNW). We evaluated five agricultural N management offset protocols applicable to North America: two methodologies of the American Carbon Registry (ACR1 and ACR2), Verified Carbon Standard (VCS), Climate Action Reserve (CAR), and the Alberta Offset Credit System (Alberta). We found that only two protocols, ACR2 and VCS, were suitable for this study, in which four sample projects were developed representing feasible N fertilizer rate reduction activities. The ACR2 and VCS protocols had identical baseline and project emission quantification methodologies, resulting in identical emission reduction values. Reducing the N fertilizer application rate by switching to variable-rate N (sample projects 1-3) or split N application (sample project 4) management resulted in N2O emission reductions ranging from 0.07 to 0.16, and 0.26 Mg CO2e ha-1, respectively. Across the range of C prices considered ($5, $10, and $50 per metric ton of CO2 equivalent), we concluded that the N2O emission offset payment alone ($0.35-$13.00 ha-1) was unlikely to ...
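
    For orientation, rate-based N2O offset protocols of this kind share the generic arithmetic sketched below: the avoided fertilizer N is converted to avoided direct N2O emissions and then to CO2-equivalents. The emission factor and GWP are illustrative IPCC-style defaults, not the coefficients of any specific protocol evaluated in the paper.

    ```python
    def n2o_offset_mg_co2e_per_ha(n_baseline, n_project, ef_direct=0.01, gwp_n2o=298):
        """Offset (Mg CO2e/ha) from reducing the N application rate (kg N/ha)."""
        delta_n = n_baseline - n_project        # kg N/ha no longer applied
        n2o_n = delta_n * ef_direct             # kg N2O-N avoided (direct emissions only)
        n2o = n2o_n * 44.0 / 28.0               # convert N2O-N to N2O mass
        return n2o * gwp_n2o / 1000.0           # kg -> Mg CO2e

    print(f"{n2o_offset_mg_co2e_per_ha(100, 85):.2f} Mg CO2e/ha")
    # a 15 kg N/ha reduction yields ~0.07 Mg CO2e/ha, the order of the values above
    ```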

  13. Strategy study of quantification harmonization of SUV in PET/CT images

    International Nuclear Information System (INIS)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-01-01

    ... and quantitative assessments in different scopes. We concluded that the harmonization strategy for SUV quantification presented in this paper was effective in reducing the variability of the quantification of small structures. However, for the comparison of SUV quantification between different scanners and institutions, it is essential that, in addition to the harmonization of quantification, the methodology of patient preparation is standardized, in order to minimize SUV variability due to biological factors. (author)

  14. Sustainable Facility Development: Perceived Benefits and Challenges

    Science.gov (United States)

    Stinnett, Brad; Gibson, Fred

    2016-01-01

    Purpose: The purpose of this paper is to assess the perceived benefits and challenges of implementing sustainable initiatives in collegiate recreational sports facilities. Additionally, this paper intends to contribute to the evolving field of facility sustainability in higher education. Design/methodology/approach: The design included qualitative…

  15. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR Green dye staining, slot blot hybridization with the probe D17Z1, the Quantifiler Human DNA Quantification kit, and RB1 rt-PCR. All methods measured higher DNA concentrations than... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125% and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use...

  16. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  17. Methodology for ranking restoration options

    International Nuclear Information System (INIS)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128), co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; and formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to optimising the protection of the populations exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included, and weighting factors for the different attributes have been determined by the use of scaling constants. (au)
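
    A minimal sketch of the multi-attribute utility ranking step, in which each restoration option's attribute scores are combined using weights derived from scaling constants; the options, scores and weights below are illustrative, not those of the RESTRAT case studies.

    ```python
    # Normalized utility scores per attribute (0 = worst, 1 = best), illustrative
    options = {
        "topsoil removal": {"health": 0.9, "economic": 0.3, "social": 0.5},
        "deep ploughing":  {"health": 0.6, "economic": 0.7, "social": 0.6},
        "no action":       {"health": 0.1, "economic": 1.0, "social": 0.4},
    }
    weights = {"health": 0.5, "economic": 0.3, "social": 0.2}  # from scaling constants

    def utility(scores):
        return sum(weights[attr] * s for attr, s in scores.items())

    for name in sorted(options, key=lambda n: -utility(options[n])):
        print(f"{name:16s} U = {utility(options[name]):.2f}")
    ```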

  18. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  19. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  20. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    Science.gov (United States)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    The Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA's National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlations with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided unprecedented sensitivity toward GBNs and allowed conventional toxicology research to be enhanced by providing a direct correlation between uptake of GBNs at a single-cell level and cell viability status.

  1. Issues connected with indirect cost quantification: a focus on the transportation system

    Science.gov (United States)

    Křivánková, Zuzana; Bíl, Michal; Kubeček, Jan; Vodák, Rostislav

    2017-04-01

    Transportation and communication networks in general are vital parts of modern society. The economy relies heavily on transportation system performance, and many people commute to work regularly. Stockpiles in many companies are being reduced, as just-in-time production relies on the transportation network to supply resources on time. Natural hazards have the potential to disturb transportation systems. Earthquakes, flooding or landslides are examples of high-energy processes capable of causing direct losses (i.e., physical damage to infrastructure). We have focused on the quantification of the indirect costs of natural hazards, which are not easy to estimate. Indirect losses can also emerge as a result of low-energy meteorological hazards which only seldom cause direct losses, e.g., glaze ice or snowfall. Whereas evidence of repair work and general direct costs usually exists or can be estimated, indirect costs are much more difficult to identify, particularly when they are not covered by insurance agencies. Delimitation of alternative routes (detours) is the most frequent response to blocked road links. Indirect costs can then be related to increased fuel consumption and additional operating costs. Detours usually result in prolonged travel times; indirect cost quantification therefore has to cover the value of time. The costs from delay are, however, a nonlinear function of travel time. The existence of an alternative transportation pattern may also result in an increased number of traffic crashes. This topic has not been studied in depth, but an increase in traffic crashes has been reported when people suddenly changed their traffic modes, e.g., when air traffic was not possible. The lost user benefit from trips that were cancelled or suppressed is also difficult to quantify. Several approaches, based on post-event questionnaire surveys, have been applied to communities and companies affected by transportation accessibility ...
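
    As a first-cut illustration, a detour's daily indirect cost can be sketched from the extra distance driven and the value of travel time; the figures are placeholders, and the linear time term is a simplification of the nonlinear delay costs noted above.

    ```python
    def detour_cost_per_day(vehicles, extra_km, extra_min,
                            cost_per_km=0.30, value_of_time_per_h=12.0):
        """Rough daily indirect cost of rerouted traffic (currency units)."""
        distance_cost = vehicles * extra_km * cost_per_km          # fuel and operating costs
        time_cost = vehicles * (extra_min / 60.0) * value_of_time_per_h
        return distance_cost + time_cost

    print(f"{detour_cost_per_day(8000, 12.0, 15.0):,.0f} per day")  # 52,800 here
    ```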

  2. Utility of radiotracer methodology in scientific research of industrial relevancy

    International Nuclear Information System (INIS)

    Kolar, Z.I.

    1990-01-01

    Utilization of radiotracer methodology in industrial research provides substantial scientific rather than directly demonstrable economic benefits. These benefits include better understanding of industrial processes and, subsequently, the development of new ones. Examples are given of the use of radiotracers in technological studies, and the significance of the results obtained is pointed out. Creative application of radiotracer methodology may contribute to the economic development and technological advancement of all countries, including the developing ones. (orig.) [de]

  3. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    Science.gov (United States)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into territorial management in a practical and effective way. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in this process, but still few researchers are investing in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims at the development of an effective methodology for the assessment of as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. It is expected to attain the proper procedures in order to assess geodiversity ...

  4. Ensuring VGI Credibility in Urban-Community Data Generation: A Methodological Research Design

    Directory of Open Access Journals (Sweden)

    Jamie O'Brien

    2016-06-01

    In this paper we outline the methodological development of current research into urban community formations based on combinations of qualitative (volunteered) and quantitative (spatial analytical and geo-statistical) data. We outline a research design that addresses problems of data quality relating to credibility in volunteered geographic information (VGI) intended for Web-enabled participatory planning. Here we have drawn on a dual notion of credibility in VGI data and propose a methodological workflow to address its criteria. We propose a ‘super-positional’ model of urban community formations and report on the combination of quantitative and participatory methods employed to underpin its integration. The objective of this methodological phase of the study is to enhance confidence in the quality of data for Web-enabled participatory planning. Our participatory method has been supported by rigorous quantification of area characteristics, including participant communities’ demographic and socio-economic contexts. This participatory method provided participants with a ready and accessible format for observing and mark-making, which allowed the investigators to rapidly iterate a system design based on participants’ responses to the workshop tasks. Participatory workshops have involved secondary-school-age children in socio-economically contrasting areas of Liverpool (Merseyside, UK), which offers a test-bed for comparing communities’ formations in comparative contexts, while bringing an under-represented section of the population into a planning domain, whose experience may stem from public and non-motorised transport modalities. Data have been gathered through one-day participatory workshops featuring questionnaire surveys, local site analysis, perception mapping and brief textual descriptions. This innovative approach will support Web-based participation among stakeholding planners, who may benefit from well-structured, community

  5. The benefit of daily photoprotection.

    Science.gov (United States)

    Seité, Sophie; Fourtanier, Anny M A

    2008-05-01

    It is now recognized that both ultraviolet (UV)-A and UVB wavelengths participate in the generation of photodamaged human skin during sun exposure. During usual daily activities, appropriate protection against solar UV exposure should prevent the clinical, cellular, and molecular changes potentially leading to photoaging. This study was designed to evaluate in human beings the protection afforded by a day cream containing a photostable combination of UVB and UVA filters against UV-induced skin alterations. In skin sites exposed to solar-simulated radiation without protection, we observed melanization. The epidermis revealed a significant increase in stratum corneum and stratum granulosum thickness. In the dermis, an enhanced expression of tenascin and a reduced expression of type I procollagen were evidenced just below the dermoepidermal junction. Although no change in elastic fibers in exposed buttock skin was seen, a slightly increased deposit of lysozyme and alpha-1 antitrypsin on elastin fibers was observed using immunofluorescence techniques. A day cream with photoprotective properties was shown to prevent all of the above-described alterations. This study was performed on a limited number of patients (n = 12) with specific characteristics (20-35 years old, skin types II and III). Two dermal alterations were evaluated by visual assessment rather than by computer-assisted image analysis quantification. Our in vivo results demonstrate the benefits of daily photoprotection using a day cream containing appropriate broad-spectrum sunscreens, which prevent solar UV-induced skin damage.

  6. Development of a methodology for the detection of Ra226 in large volumes of water by gamma spectrometry; modification and validation of the method for detection and quantification of Ra226 in small volumes of water by alpha spectrometry, used by the Centro de Investigacion en Ciencias Atomicas, Nucleares y Moleculares (CICANUM, UCR)

    International Nuclear Information System (INIS)

    Molina Porras, Arnold

    2011-01-01

    The test method for quantifying the specific activity of Ra-226 in water by alpha spectrometry has been validated. CICANUM has used this method as part of the proposed ARCAL (IAEA) harmonization of methods. The method is based on a first separation and preconcentration of Ra-226 by coprecipitation with MnO2 and subsequent microprecipitation as Ba(Ra)SO4. Samples were prepared and then counted by alpha spectrometry. In parallel, a radium sampling methodology for large volumes of water was tested, using acrylic fibers impregnated with manganese(IV) oxide to determine the amount of Ra-226 present by gamma spectrometry. Small-scale tests determined that the best way to prepare the fiber is the reference method found in the literature, using an oven at 60 degrees Celsius. (author)

  7. Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.

    Science.gov (United States)

    1979-09-01

    This last volume includes five technical appendices that document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...

  8. Quantification of in vivo oxidative damage in Caenorhabditis elegans during aging by endogenous F3-isoprostane measurement

    NARCIS (Netherlands)

    Labuschagne, C.F.; Stigter, E.C.; Hendriks, M.M.; Berger, R.; Rokach, J.; Korswagen, H.C.; Brenkman, A.B.

    2013-01-01

    Oxidative damage is thought to be a major cause in the development of pathologies and aging. However, quantification of oxidative damage is methodologically difficult. Here, we present a robust liquid chromatography-tandem mass spectrometry (LC-MS/MS) approach for accurate, sensitive, and linear in vivo

  9. Study on dynamic team performance evaluation methodology based on team situation awareness model

    International Nuclear Information System (INIS)

    Kim, Suk Chul

    2005-02-01

    The purpose of this thesis is to provide a theoretical framework and an evaluation methodology for the dynamic task performance of operating teams at nuclear power plants under dynamic and tactical environments such as a radiological accident. The thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation. Its quantification methods use a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. The model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for a 1,000 MWe pressurized water reactor, with four on-the-job operating groups and one expert group who knew the accident sequences. Simulated team dynamic task performance, together with the behavior of key plant parameters and the team-specific organizational center of gravity and cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for instance when recruiting new operating teams for new plants in a cost-effective manner. Also, this model can be utilized as a systematic analysis tool for

  11. Benefits of Exercise

    Science.gov (United States)

    ... activity into your life. To get the most benefit, you should try to get the recommended amount ... likely even live longer. What are the health benefits of exercise? Regular exercise and physical activity may ...

  12. Medicare Hospice Benefits

    Science.gov (United States)

    CENTERS for MEDICARE & MEDICAID SERVICES Medicare Hospice Benefits This official government booklet includes information about Medicare hospice benefits: Who’s eligible for hospice care What services are included in hospice care How ...

  13. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

    Optical molecular luminescence imaging is widely used for the detection and imaging of bio-photons emitted upon activation of luminescent luciferase. The photons measured in this method indicate the degree of molecular alteration or the number of cells, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method presenting a linear response of the measured light signal to measurement time. We detected the luminescence signal by using lab-made optical imaging equipment, the animal light imaging system (ALIS), and two different kinds of light sources: three bacterial light-emitting sources containing different numbers of bacteria, and three different non-bacterial sources emitting very weak light. By using the concepts of the candela and the flux, we could derive a simplified linear quantification formula. After experimentally measuring light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to measurement time presented a constant value even though different light sources were applied. The quantification function for linear response could be applicable to a standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment, with constant light-emitting sources presenting a linear response to measurement time
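    The linearity check the authors describe can be illustrated in a few lines: for a constant source, quantified photon counts should scale linearly with measurement time, so the ratio of counts to time stays constant. The counts below are synthetic, not ALIS measurements.

```python
# Sketch of the linearity check described above: for a constant light source,
# quantified photon counts grow linearly with measurement time, so counts/time
# should be (nearly) constant. All values are synthetic.
measurement_times = [10, 20, 40, 80]            # seconds (assumed)
photon_counts     = [1520, 3010, 6080, 12110]   # synthetic quantified counts

for t, n in zip(measurement_times, photon_counts):
    print(f"t = {t:3d} s  counts = {n:6d}  counts/t = {n / t:.1f}")
# A near-constant counts/t ratio indicates the linear response the record reports.
```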

  14. Employee motivation and benefits

    OpenAIRE

    Březíková, Tereza

    2009-01-01

    The topic of my bachelor's thesis is employee motivation and benefits. The thesis is divided into two parts, a theoretical one and a practical one. The theoretical part deals with the theory of motivation and individual employee benefits. The practical part describes employee benefits in ČSOB, where I did my research by questionnaires that were filled in by employees from different departments of ČSOB. These employees answered questions about their work motivation and benefits. The results...

  15. Analysis of benefits

    OpenAIRE

    Kováříková, Kamila

    2012-01-01

    This master's thesis deals with employee benefits in the current labour market, especially from the perspective of young employees. The first part focuses on the theory of motivation and employee benefits, together with their tax impact on employees' income. Employee benefits in the current labour market, employees' satisfaction and employers' attitudes to this issue are analyzed in the second part of the thesis.

  16. Introduction to LCA Methodology

    DEFF Research Database (Denmark)

    Hauschild, Michael Z.

    2018-01-01

    In order to offer the reader an overview of the LCA methodology in preparation for the more detailed description of its different phases, a brief introduction is given to the methodological framework according to the ISO 14040 standard and the main elements of each of its phases. Emphasis...

  17. Methodologies, languages and tools

    International Nuclear Information System (INIS)

    Amako, Katsuya

    1994-01-01

    This is a summary of the "Methodologies, Languages and Tools" session at the CHEP'94 conference. All the contributions to methodologies and languages are relevant to the object-oriented approach. Other topics presented are related to various software tools in the down-sized computing environment

  18. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes are reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to developing quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also includes a description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIM, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also be a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  20. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  1. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  2. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  3. Analysis of the point-counting and planimetric methods in the quantification of the biofilm of dentures: a study of methodological validation

    Directory of Open Access Journals (Sweden)

    Roseana Aparecida Gomes FERNANDES

    2002-03-01

    Two methods of quantification of denture biofilm (point-counting and planimetric) were tested and compared with the paper-weighing method and with the Prosthesis Hygiene Index. The internal surfaces of 62 complete dentures were stained and photographed, and the total area and the area covered with biofilm were projected on paper and contoured in pencil. The point-counting method (experimental 1) was carried out using a mesh of equidistant points. For the planimetric method (experimental 2), the areas of interest were measured with a digital planimeter, and for the paper-weighing method (control 1) they were cut out and weighed on a precision scale. In the determination of the Prosthesis Hygiene Index (control 2), the accumulation of biofilm was estimated by the attribution of scores. The results showed agreement between the experimental methods and control 1 of 82% (point-counting) and 95% (planimetric), as well as a high degree of correlation (r = 0.98; r = 0.99) between the obtained values. When compared with control 2, there was agreement in 55% (point-counting) and 37% (planimetric) of the cases. The experimental methods may be useful in clinical studies for evaluating the efficacy of hygiene agents.
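    The point-counting principle lends itself to a compact illustration: overlay a mesh of equidistant points on the projected surface and estimate coverage as the fraction of points landing on biofilm. The 0/1 mask below is synthetic, standing in for the stained-and-photographed surface.

```python
# Minimal sketch of the point-counting principle: points of an equidistant mesh
# are classified as landing on biofilm (1) or clean surface (0), and coverage is
# the hit fraction. The grid is a synthetic mask, not real image data.
mask = [
    [0, 0, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 0, 0],
]  # 1 = point lands on biofilm, 0 = clean surface

points_total = sum(len(row) for row in mask)
points_biofilm = sum(sum(row) for row in mask)
print(f"biofilm coverage = {100 * points_biofilm / points_total:.1f}% (point-count estimate)")
```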

  4. Urinary Cell-Free DNA Quantification as Non-Invasive Biomarker in Patients with Bladder Cancer.

    Science.gov (United States)

    Brisuda, Antonin; Pazourkova, Eva; Soukup, Viktor; Horinek, Ales; Hrbáček, Jan; Capoun, Otakar; Svobodova, Iveta; Pospisilova, Sarka; Korabecna, Marie; Mares, Jaroslav; Hanuš, Tomáš; Babjuk, Marek

    2016-01-01

    The concentration of urinary cell-free DNA (ucfDNA) belongs among potential bladder cancer markers, but reported results are inconsistent due to the use of various non-standardised methodologies. The aim of the study was to standardise the methodology for ucfDNA quantification as a potential non-invasive tumour biomarker. In total, 66 patients and 34 controls were enrolled in the study. The volume (V) of each urine portion was recorded and ucfDNA concentrations (c) were measured using real-time PCR. Total amounts (TA) of ucfDNA were calculated and compared between patients and controls, and the diagnostic accuracy of the TA of ucfDNA was determined. The calculation of the TA of ucfDNA in the second urine portion was the most appropriate approach to ucfDNA quantification, as there was a logarithmic dependence between the volume and the concentration of a urine portion (p = 0.0001). Using this methodology, we were able to discriminate between bladder cancer patients and subjects without bladder tumours (p = 0.0002), with an area under the ROC curve of 0.725. The positive and negative predictive values of the test were 90% and 45%, respectively. Quantification of ucfDNA according to our modified method could provide a potential non-invasive biomarker for the diagnosis of patients with bladder cancer. © 2015 S. Karger AG, Basel.
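    The core calculation is simply TA = c x V for each urine portion. A minimal sketch, with invented concentrations and volumes rather than study data:

```python
# Sketch of the quantification approach described above: the total amount (TA)
# of urinary cell-free DNA in a urine portion is the measured concentration
# times the recorded portion volume. Values are illustrative, not study data.
samples = [
    {"id": "patient_1", "conc_ng_per_ml": 4.2, "volume_ml": 55.0},
    {"id": "control_1", "conc_ng_per_ml": 1.1, "volume_ml": 140.0},
]

for s in samples:
    ta = s["conc_ng_per_ml"] * s["volume_ml"]  # TA = c * V, in ng
    print(f"{s['id']}: TA = {ta:.0f} ng ucfDNA")
```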

  5. A Short Review of FDTD-Based Methods for Uncertainty Quantification in Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    Theodoros T. Zygiridis

    2017-01-01

    We provide a review of selected computational methodologies that are based on the deterministic finite-difference time-domain algorithm and are suitable for the investigation of electromagnetic problems involving uncertainties. As will become apparent, several alternatives capable of performing uncertainty quantification in a variety of cases exist, each one exhibiting different qualities and ranges of applicability, which we intend to point out here. Given the numerous available approaches, the purpose of this paper is to clarify the main strengths and weaknesses of the described methodologies and help potential readers to safely select the most suitable approach for their problem under consideration.

  6. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  7. An EPGPT-based approach for uncertainty quantification

    International Nuclear Information System (INIS)

    Wang, C.; Abdel-Khalik, H. S.

    2012-01-01

    Generalized Perturbation Theory (GPT) has been widely used by many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT theory, collectively referred to as Exact-to-Precision Generalized Perturbation Theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameter distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest; it tries to capture, via the adjoint solution, the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of so-called 'active' responses, which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply the EPGPT methodology to propagate cross-section variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameter uncertainties is not valid and when nonlinear behavior must be considered. To allow this demonstration, exaggerated variations are employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)

  8. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  9. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  10. Quantification of virus syndrome in chili peppers

    African Journals Online (AJOL)

    Jane

    2011-06-15

    Jun 15, 2011 ... alternative for the quantification of the disease syndromes in regards to this crop. The result of these ... parison of treatments such as cultivars or control measures and ... Vascular discoloration and stem necrosis. 2.

  11. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques that employ matrix factorizations incur a cubic cost
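    The cubic cost referenced above is that of factorization-based approaches. A common cheaper alternative, and the flavor of approach pursued in this line of work, is stochastic estimation of the diagonal: diag(A^-1) is approximated by averaging v * (A^-1 v) over random +-1 probe vectors, with each solve done by an iterative method instead of a factorization. The sketch below is a generic Hutchinson-style estimator, not the authors' exact algorithm.

```python
# Generic Hutchinson-style sketch: estimate diag(A^{-1}) as the average of
# v * solve(A, v) over random +-1 probe vectors v, avoiding any factorization
# of the inverse. A direct solve stands in here for the iterative solver a
# large-scale code would use.
import numpy as np

rng = np.random.default_rng(0)
n = 200
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)               # symmetric positive definite test matrix

n_samples = 400
estimate = np.zeros(n)
for _ in range(n_samples):
    v = rng.choice([-1.0, 1.0], size=n)       # Rademacher probe vector
    estimate += v * np.linalg.solve(A, v)     # elementwise product accumulates the diagonal
estimate /= n_samples

exact = np.diag(np.linalg.inv(A))             # reference answer for the small test case
print("max relative error:", float(np.max(np.abs(estimate - exact) / exact)))
```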

  12. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
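    The quantification step itself rests on standard qPCR arithmetic: the instrument reports a quantification cycle (Cq), and the kit software inverts a standard curve Cq = slope * log10(quantity) + intercept fitted to DNA standards. The sketch below shows that inversion with invented curve parameters; it is a generic illustration, not the Quantifiler(®) Trio software's internal procedure.

```python
# Generic illustration of qPCR quantification arithmetic: DNA quantity is
# recovered by inverting a standard curve Cq = slope * log10(quantity) + intercept.
# The slope and intercept are invented, not Quantifiler(R) Trio values.
slope, intercept = -3.32, 34.0   # assumed standard-curve parameters (Cq vs log10 pg)

def quantity_pg(cq):
    """Invert the standard curve to estimate input DNA in picograms."""
    return 10 ** ((cq - intercept) / slope)

for cq in (27.8, 31.1, 34.4):
    print(f"Cq = {cq:.1f}  ->  ~{quantity_pg(cq):.2f} pg DNA")
```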

  13. 42 CFR 493.649 - Methodology for determining fee amount.

    Science.gov (United States)

    2010-10-01

    ... fringe benefit costs to support the required number of State inspectors, management and direct support ... full-time equivalent employee. Included in this cost are salary and fringe benefit costs, necessary ...

  14. Tannins quantification in barks of Mimosa tenuiflora and Acacia mearnsii

    Directory of Open Access Journals (Sweden)

    Leandro Calegari

    2016-03-01

    Due to their chemical complexity, there are several methodologies for the quantification of vegetable tannins. This work aims at quantifying both the tannin and non-tannin substances present in the barks of Mimosa tenuiflora and Acacia mearnsii by two different methods. From bark particles of both species, analytical solutions were produced using a steam-jacketed extractor. The solutions were analyzed by the Stiasny and hide-powder (non-chromed) methods. For both species, tannin levels were higher when analyzed by the hide-powder method, reaching 47.8% and 24.1% for A. mearnsii and M. tenuiflora, respectively. By the Stiasny method, the tannin levels were 39.0% for A. mearnsii and 15.5% for M. tenuiflora. Although A. mearnsii presented the best results, the bark of M. tenuiflora also showed great potential due to its considerable tannin content and the availability of the species in the Caatinga biome.
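    For the Stiasny method, the arithmetic commonly reduces to expressing the oven-dry mass of the tannin-formaldehyde precipitate as a percentage of the oven-dry solids in the analyzed extract. A minimal sketch with assumed masses, not the study's data:

```python
# Minimal sketch of Stiasny-style arithmetic: condensed tannins are precipitated
# with formaldehyde/HCl, and the Stiasny number is the oven-dry precipitate mass
# as a percentage of the oven-dry extract solids. Masses are assumed values.
def stiasny_number(precipitate_g, extract_solids_g):
    return 100.0 * precipitate_g / extract_solids_g

extract_solids = 2.00   # g of solids in the analyzed extract aliquot (assumed)
precipitate    = 0.78   # g of tannin-formaldehyde precipitate (assumed)
print(f"Stiasny number = {stiasny_number(precipitate, extract_solids):.1f}%")
```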

  15. Ideas underlying the Quantification of Margins and Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Pilch, Martin, E-mail: mpilch@sandia.gov [Department 1514, Sandia National Laboratories, Albuquerque, NM 87185-0828 (United States); Trucano, Timothy G. [Department 1411, Sandia National Laboratories, Albuquerque, NM 87185-0370 (United States); Helton, Jon C. [Department of Mathematics and Statistics, Arizona State University, Tempe, AZ 85287-1804 (United States)

    2011-09-15

    Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.
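    A frequently used summary quantity in QMU is the confidence ratio M/U: the margin M between a best-estimate performance value and its threshold, divided by the quantified uncertainty U in that margin. A notional sketch, with all numbers invented:

```python
# Notional QMU summary: confidence ratio M/U, where M is the margin between a
# best-estimate performance value and its failure threshold, and U is the
# quantified uncertainty in that margin. All numbers are invented.
def confidence_ratio(best_estimate, threshold, uncertainty):
    margin = abs(threshold - best_estimate)   # M
    return margin / uncertainty               # M/U

# Notional performance gate: the metric must stay below a threshold of 100 units.
cr = confidence_ratio(best_estimate=78.0, threshold=100.0, uncertainty=8.5)
print(f"M/U = {cr:.2f} ({'adequate' if cr > 1 else 'inadequate'} margin at this gate)")
```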

  16. Plasticity models of material variability based on uncertainty quantification techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-11-01

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.

  17. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    Science.gov (United States)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.

  18. Neural Networks Methodology and Applications

    CERN Document Server

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically-oriented...

  19. Methodologies of health impact assessment as part of an integrated approach to reduce effects of air pollution

    Energy Technology Data Exchange (ETDEWEB)

    Aunan, K; Seip, H M

    1995-12-01

    Quantification of the average frequencies of health effects at a population level is an essential part of an integrated assessment of pollution effects. Epidemiological studies seem to provide the best basis for such estimates. This paper gives an introduction to a methodology for health impact assessment, together with results from selected parts of a case study in Hungary. The case study is aimed at testing and improving the methodology for integrated assessment and focuses on energy production and consumption and the implications for air pollution. Using monitoring data from Budapest, the paper gives estimates of excess frequencies of respiratory illness, mortality and other health end-points. For a number of health end-points, particles may serve as a good indicator component. Stochastic simulation is used to illustrate the uncertainties embedded in the exposure-response functions applied. The paper uses the "bottom-up approach" to find cost-effective abatement strategies against pollution damages, where specific abatement measures such as emission standards for vehicles are explored in detail. It is concluded that, in spite of large uncertainties in every step of the analysis, an integrated assessment of the costs and benefits of different abatement measures is valuable, as it clarifies the main objectives of an abatement policy and explicitly describes the adverse impacts of different activities and their relative importance. 46 refs., 11 figs., 2 tabs.
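    The population-level arithmetic underlying such estimates can be sketched compactly: an exposure-response slope converts a concentration increment into a relative risk, which scales the baseline case count. All numbers below are illustrative, not the Budapest case-study values.

```python
# Sketch of population-level health impact arithmetic:
# excess annual cases = baseline rate * population * (RR - 1), with the relative
# risk RR derived from an exposure-response slope per ug/m3 of particles.
# All numbers are illustrative assumptions.
baseline_rate = 0.008        # baseline annual mortality rate (assumed)
population = 2_000_000       # exposed population (assumed)
slope_per_ugm3 = 0.0006      # excess relative risk per ug/m3 of particles (assumed)
delta_concentration = 25.0   # long-term particle increment in ug/m3 (assumed)

rr = 1.0 + slope_per_ugm3 * delta_concentration
excess_cases = baseline_rate * population * (rr - 1.0)
print(f"RR = {rr:.3f}, estimated excess deaths/year = {excess_cases:.0f}")
```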

  20. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields, as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions, and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D.Liu et al '17]. For modeling we used the TAU code, developed at DLR, Germany.

  1. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.
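    For a scalar parameter observed through a noisy measurement, the linear Bayesian update discussed above takes the Kalman-gain form q_post = q_prior + K (y_obs - y_prior). The sketch below illustrates it with samples for transparency, whereas the paper's update is sampling-free via a polynomial (spectral) approximation; all values are synthetic.

```python
# Sampling illustration of the linear Bayesian update for a scalar parameter q
# observed through y = q + noise: q_post = q_prior + K * (y_obs - y_prior).
# The paper's version is sampling-free; samples are used here only to make the
# conditional-expectation formula concrete.
import numpy as np

rng = np.random.default_rng(1)
q_prior = rng.normal(0.0, 2.0, size=10_000)      # prior ensemble of the parameter
noise = rng.normal(0.0, 0.5, size=q_prior.size)  # measurement-noise ensemble
y_prior = q_prior + noise                        # predicted observations

y_obs = 1.3                                      # assumed actual measurement
C = np.cov(q_prior, y_prior)                     # joint covariance of (q, y)
K = C[0, 1] / C[1, 1]                            # gain of the linear update
q_post = q_prior + K * (y_obs - y_prior)         # updated ensemble

print(f"prior mean {q_prior.mean():+.3f} -> posterior mean {q_post.mean():+.3f}")
```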

  4. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies find it difficult to cope with dynamic behavior (e.g. operation mode changes or sequence-dependent failure) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities are emerging for improved PSA by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has been greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take advantage of the aforementioned technologies is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, it seems less interesting from an industrial and regulatory viewpoint. The authors expect this survey can contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts; among them, the DDET (discrete dynamic event tree) seems to be a backbone for most methodologies, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  6. Financial methodology for the Brazilian market of small producers of oil and natural gas, based on Canadian and North American experiences in reserves quantification, evaluation and certification

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Enrico Brunno Zipoli de Sousa e [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Programa de Pos-Graduacao em Geologia; Coelho, Jose Mario [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Dept. de Minas

    2008-07-01

    ANP (the National Agency of Petroleum, Natural Gas and Biofuels), through its auctions of exploratory blocks in the years following the break of the PETROBRAS monopoly by Law 9.478 of 1997, had an important role in the opening of the sector and in the attainment of petroleum self-sufficiency. However, petroleum production in mature and marginal fields was left aside, since the initial interest in the first rounds was to attract the major companies - International Oil Companies (IOCs) - when ANP granted large offshore blocks. Mature fields are defined as fields in a phase of irreversible decline; marginal fields are defined by an economic concept, determined by business decisions and external economic factors (the price of oil, etc.). Canada and the USA, worldwide leaders in the petroleum and gas market, have policies that benefit small companies and keep them protected from the competition of IOCs by assuring small-company financing through the guarantee of proved oil reserves. This paper assembles the Canadian and American experiences in regulation for small-company investment and compares them with the Brazilian financing types, which are restricted due to Brazilian finance agents' disregard for oil and gas activity. (author)

  7. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    Science.gov (United States)

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify the uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits set by the national authority for waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository, but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of the produced radionuclides either by experimental methods or by statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful for generating random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach to the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
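    The Monte Carlo flavor of this uncertainty propagation is easy to sketch: draw the uncertain inputs from assumed distributions, push each draw through the activity calculation, and summarize the spread of the result. The distributions and the one-line activity model below are invented placeholders, not CERN's characterization data.

```python
# Sketch of Monte Carlo uncertainty propagation for an activity estimate:
# sample uncertain inputs (a trace-element mass fraction and an activation
# factor), evaluate the activity model for each draw, and report the spread.
# All distributions and constants are invented placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Uncertain inputs (assumed distributions, not real characterization data)
cobalt_fraction = rng.lognormal(mean=np.log(2e-5), sigma=0.8, size=n)  # kg/kg
activation = rng.normal(loc=1.0, scale=0.15, size=n).clip(min=0)       # relative factor

mass_kg = 500.0            # waste item mass (assumed)
specific_activity = 3.0e6  # Bq per kg of cobalt, placeholder constant

activity = mass_kg * cobalt_fraction * activation * specific_activity  # Bq
lo, med, hi = np.percentile(activity, [2.5, 50, 97.5])
print(f"median {med:.2e} Bq, 95% interval [{lo:.2e}, {hi:.2e}] Bq")
```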

  8. The affect heuristic in judgments of risks and benefits

    Energy Technology Data Exchange (ETDEWEB)

    Finucane, M.; Slovic, P.; Johnson, S.M. [Decision Research, 1201 Oak St, Eugene, Oregon (United States); Alhakami, A. [Imam Muhammad Ibn Saud Islamic University Psychology Dept. (Saudi Arabia)

    1998-07-01

    The role of affect in judgments of risks and benefits is examined in two studies. Despite using different methodologies, the two studies suggest that risk and benefit are somehow linked in people's perception, consequently influencing their judgments. Short paper.

  9. The Service Learning Projects: Stakeholder Benefits and Potential Class Topics

    Science.gov (United States)

    Rutti, Raina M.; LaBonte, Joanne; Helms, Marilyn Michelle; Hervani, Aref Agahei; Sarkarat, Sy

    2016-01-01

    Purpose: The purpose of this paper is to summarize the benefits of including a service learning project in college classes and focusses on benefits to all stakeholders, including students, community, and faculty. Design/methodology/approach: Using a snowball approach in academic databases as well as a nominal group technique to poll faculty, key…

  10. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching ... the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers. ... Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  11. GPS system simulation methodology

    Science.gov (United States)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  12. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  13. Nonlinear Image Denoising Methodologies

    National Research Council Canada - National Science Library

    Yufang, Bao

    2002-01-01

    In this thesis, we propose a theoretical as well as practical framework to combine geometric prior information with a statistical/probabilistic methodology in the investigation of a denoising problem...

  14. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    "Now viewed as its own scientific discipline, clinical trial methodology encompasses the methods required for the protection of participants in a clinical trial and the methods necessary to provide...

  15. Risk-based Regulatory Evaluation Program methodology

    International Nuclear Information System (INIS)

    DuCharme, A.R.; Sanders, G.A.; Carlson, D.D.; Asselin, S.V.

    1987-01-01

    The objectives of this DOE-supported Regulatory Evaluation Program are to analyze and evaluate the safety importance and economic significance of existing regulatory guidance in order to assist in the improvement of the regulatory process for current generation and future design reactors. A risk-based cost-benefit methodology was developed to evaluate the safety benefit and cost of specific regulations or Standard Review Plan sections. Risk-based methods can be used in lieu of, or in combination with, deterministic methods in developing regulatory requirements and reaching regulatory decisions.

  16. Water Footprint Symposium: where next for water footprint and water assessment methodology?

    NARCIS (Netherlands)

    Tillotson, M.R.; Liu, J.; Guan, D.; Wu, P.; Zhao, Xu; Zhang, Guoping; Pfister, S.; Pahlow, Markus

    2014-01-01

    Recognizing the need for a comprehensive review of the tools and metrics for the quantification and assessment of water footprints, and allowing for the opportunity for open discussion on the challenges and future of water footprinting methodology, an international symposium on water footprint was held.

  17. Water Footprint Symposium : where next for water footprint and water assessment methodology?

    NARCIS (Netherlands)

    Tillotson, Martin R.; Liu, Junguo; Guan, Dabo; Wu, Pute; Zhao, Xu; Zhang, Guoping; Pfister, Stephan; Pahlow, Markus

    2014-01-01

    Recognizing the need for a comprehensive review of the tools and metrics for the quantification and assessment of water footprints, and allowing for the opportunity for open discussion on the challenges and future of water footprinting methodology, an international symposium on water footprint was held.

  18. Analysis of Employee Benefits

    OpenAIRE

    Burešová, Lenka

    2013-01-01

    The aim of this bachelor thesis is to analyze employee benefits from the perspective of employees and to suggest to employers possible ways to improve their provision. The work is divided into two parts: theoretical and practical. The theoretical part describes the overall remuneration of employees, the payroll system and employee benefits. Benefits are placed within the remuneration system, broken down into categories, and some of them are defined. The practical part presents a survey among employees in the Czech Repub...

  19. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Full Text Available Modern challenges to the theory and methodology of accounting are addressed through the formation and implementation of new concepts whose purpose is to meet users' needs for both standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for accounting for sustainable development is offered in the article. The set of methods and principles of sustainable development accounting, covering both standard and non-standard provisions, is systematized. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  20. Transit Benefit Program Data -

    Data.gov (United States)

    Department of Transportation — This data set contains information about any US government agency participating in the transit benefits program, funding agreements, individual participating Federal...

  1. Overview of hybrid subspace methods for uncertainty quantification, sensitivity analysis

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Bang, Youngsuk; Wang, Congjian

    2013-01-01

    Highlights: ► We overview the state-of-the-art in uncertainty quantification and sensitivity analysis. ► We overview new developments in above areas using hybrid methods. ► We give a tutorial introduction to above areas and the new developments. ► Hybrid methods address the explosion in dimensionality in nonlinear models. ► Representative numerical experiments are given. -- Abstract: The role of modeling and simulation has been heavily promoted in recent years to improve understanding of complex engineering systems. To realize the benefits of modeling and simulation, concerted efforts in the areas of uncertainty quantification and sensitivity analysis are required. The manuscript intends to serve as a pedagogical presentation of the material to young researchers and practitioners with little background on the subjects. We believe this is important as the role of these subjects is expected to be integral to the design, safety, and operation of existing as well as next generation reactors. In addition to covering the basics, an overview of the current state-of-the-art will be given with particular emphasis on the challenges pertaining to nuclear reactor modeling. The second objective will focus on presenting our own development of hybrid subspace methods intended to address the explosion in the computational overhead required when handling real-world complex engineering systems.

  2. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
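
    As a concrete illustration of the variance-based sensitivity methods named in this record (the Sobol' method), here is a minimal Python sketch using the classic Ishigami test function; the function, sample size and pick-and-freeze estimator are standard textbook choices, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def ishigami(x, a=7.0, b=0.1):
        """Classic test function for sensitivity analysis."""
        return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
                + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

    n, d = 200_000, 3
    A = rng.uniform(-np.pi, np.pi, (n, d))   # two independent input samples
    B = rng.uniform(-np.pi, np.pi, (n, d))
    yA, yB = ishigami(A), ishigami(B)
    var_y = np.concatenate([yA, yB]).var()

    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                      # "pick and freeze" column i
        Vi = np.mean(yB * (ishigami(ABi) - yA))  # Saltelli (2010) estimator
        print(f"first-order S_{i+1} ~ {Vi / var_y:.3f}")  # expect ~0.31, ~0.44, ~0.00
    ```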

  3. Socio-economic research on fusion. SERF 1997-98. Macro Task E2: External costs and benefits. Task 2: Comparison of external costs

    International Nuclear Information System (INIS)

    Schleisner, Lotte; Korhonen, Riitta

    1998-12-01

    This report is part of the SERF (Socio-Economic Research on Fusion) project, Macro Task E2, which covers External Costs and Benefits. The report is the documentation of Task 2, Comparison of External Costs. The aim of Task 2 has been to compare the external costs of fusion energy with those of other alternative energy generation technologies. In this task, identification and quantification of the external costs of wind energy and photovoltaics have been performed by Risoe, while identification and quantification of the external costs of nuclear fission and fossil fuels have been discussed by VTT. The methodology used for the assessment of the externalities of the fuel cycles selected has been the one developed within the ExternE Project. First estimates of the externalities of fusion energy have been under examination in Macrotask E2. Externalities of fossil fuels and nuclear fission have already been evaluated in the ExternE project, and a vast amount of material for different sites in various countries is available. This material is used in the comparison. Among the renewables, wind energy and photovoltaics are assessed separately. External costs of the various alternatives may change as new technologies are developed, and costs can to a large extent be avoided (e.g. acidifying impacts, but also global warming due to carbon dioxide emissions). Fusion technology, too, can experience major progress, and some important cost components can probably be avoided already by 2050. (EG)

  4. Socio-economic research on fusion. SERF 1997-98. Macro Task E2: External costs and benefits. Task 2: Comparison of external costs

    Energy Technology Data Exchange (ETDEWEB)

    Schleisner, Lotte; Korhonen, Riitta

    1998-12-01

    This report is part of the SERF (Socio-Economic Research on Fusion) project, Macro Task E2, which covers External Costs and Benefits. The report is the documentation of Task 2, Comparison of External Costs. The aim of Task 2 has been to compare the external costs of fusion energy with those of other alternative energy generation technologies. In this task, identification and quantification of the external costs of wind energy and photovoltaics have been performed by Risoe, while identification and quantification of the external costs of nuclear fission and fossil fuels have been discussed by VTT. The methodology used for the assessment of the externalities of the fuel cycles selected has been the one developed within the ExternE Project. First estimates of the externalities of fusion energy have been under examination in Macrotask E2. Externalities of fossil fuels and nuclear fission have already been evaluated in the ExternE project, and a vast amount of material for different sites in various countries is available. This material is used in the comparison. Among the renewables, wind energy and photovoltaics are assessed separately. External costs of the various alternatives may change as new technologies are developed, and costs can to a large extent be avoided (e.g. acidifying impacts, but also global warming due to carbon dioxide emissions). Fusion technology, too, can experience major progress, and some important cost components can probably be avoided already by 2050. (EG) 36 refs.

  5. Health economic evaluation: important principles and methodology.

    Science.gov (United States)

    Rudmik, Luke; Drummond, Michael

    2013-06-01

    To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
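
    The discounting and cost-effectiveness arithmetic described in this record reduces to a few lines; the sketch below uses invented costs and utilities for two hypothetical interventions (the 3% discount rate and 5-year horizon are illustrative assumptions, not values from the article).

    ```python
    def discounted_total(values_per_year, rate=0.03):
        """Present value of a yearly stream (year 0 first), discounted annually."""
        return sum(v / (1 + rate) ** t for t, v in enumerate(values_per_year))

    # Hypothetical 5-year profiles: (costs per year, QALYs per year).
    standard_care = ([1000] * 5, [0.70] * 5)
    new_treatment = ([3500] * 5, [0.78] * 5)

    cost_std, qaly_std = (discounted_total(s) for s in standard_care)
    cost_new, qaly_new = (discounted_total(s) for s in new_treatment)

    # Incremental cost-effectiveness ratio: extra cost per extra QALY gained.
    icer = (cost_new - cost_std) / (qaly_new - qaly_std)
    print(f"Incremental cost {cost_new - cost_std:.0f}, "
          f"incremental QALYs {qaly_new - qaly_std:.2f}")
    print(f"ICER ~ {icer:.0f} per QALY gained")
    ```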

  6. Unemployment Benefit Exhaustion

    DEFF Research Database (Denmark)

    Filges, Trine; Pico Geerdsen, Lars; Knudsen, Anne-Sofie Due

    2015-01-01

    This systematic review studied the impact of exhaustion of unemployment benefits on the exit rate out of unemployment and into employment prior to benefit exhaustion or shortly thereafter. Method: We followed Campbell Collaboration guidelines to prepare this review, and ultimately located 12...

  7. Putting Paid to Benefits

    NARCIS (Netherlands)

    Stella Hoff; Gerda Jehoel-Gijsbers; J.M. Wildeboer Schut

    2003-01-01

    Original title: De uitkering van de baan. A good deal of time, money and effort is invested in the reintegration of benefit claimants. What is the result? How many recipients of disability, unemployment or social assistance benefit are in principle capable of working but are currently not

  8. Nanocosmetics: benefits and risks

    OpenAIRE

    Shokri, Javad

    2017-01-01

    Summary Various nanomaterials/nanoparticles (NPs) have been used for the development of cosmetic products - a field known as nanocosmetics. These advanced materials offer some benefits, while their use in cosmetic formulations may be associated with some risks. The main aim of this editorial is to highlight the benefits and risks of the nanomaterials used in cosmetic products.

  9. Who Benefits from Religion?

    Science.gov (United States)

    Mochon, Daniel; Norton, Michael I.; Ariely, Dan

    2011-01-01

    Many studies have documented the benefits of religious involvement. Indeed, highly religious people tend to be healthier, live longer, and have higher levels of subjective well-being. While religious involvement offers clear benefits to many, in this paper we explore whether it may also be detrimental to some. Specifically, we examine in detail…

  10. Wellbeing or welfare benefits

    DEFF Research Database (Denmark)

    Handlos, Line Neerup; Kristiansen, Maria; Nørredam, Marie Louise

    2016-01-01

    This debate article debunks the myth that migrants are driven primarily by the size of the welfare benefits in the host country, when they decide where to migrate to. We show that instead of welfare benefits, migrants are driven by a desire for safety, wellbeing, social networks and opportunities...

  11. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). Uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probability of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared according to the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For this purpose, the report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET and a methodology for quantification of APET uncertainty inputs with its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes

  12. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
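
    A minimal sketch of the core bookkeeping in such a regional shelter analysis; the building mix, population fractions and protection factors below are invented for illustration and are not LLNL values.

    ```python
    # Hypothetical population posture for one region: fraction of people in each
    # location and the dose protection factor (outdoor dose / indoor dose).
    shelter_mix = {
        "outdoors":             (0.05,  1.0),
        "wood-frame house":     (0.45,  3.0),
        "masonry, upper floor": (0.30, 10.0),
        "basement":             (0.20, 40.0),
    }

    outdoor_dose = 5.0  # arbitrary units, from a fallout prediction or measurement

    assert abs(sum(f for f, _ in shelter_mix.values()) - 1.0) < 1e-9

    # Expected dose for a person drawn at random from this population posture.
    mean_dose = sum(frac * outdoor_dose / pf for frac, pf in shelter_mix.values())
    print(f"Mean sheltered dose: {mean_dose:.2f} (vs {outdoor_dose:.2f} outdoors)")
    print(f"Effective regional protection factor: {outdoor_dose / mean_dose:.1f}")
    ```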

  13. Bone histomorphometric quantification by X-ray phase contrast and transmission 3D SR microcomputed tomography

    International Nuclear Information System (INIS)

    Nogueira, L.P.; Pinheiro, C.J.G.; Braz, D.; Oliveira, L.F.; Barroso, R.C.

    2008-01-01

    Full text: Conventional histomorphometry is an important method for the quantitative evaluation of bone microstructure. X-ray computed tomography is a noninvasive technique that can be used to evaluate histomorphometric indices. In this technique, the output 3D images are used to quantify the whole sample, in contrast to the conventional approach, in which quantification is performed on 2D slices and extrapolated to the 3D case. In the search for better resolution and visualization of soft tissues, the X-ray phase contrast imaging technique was developed. The objective of this work was to perform histomorphometric quantification of human cancellous bone using 3D synchrotron X-ray computed microtomography with two distinct techniques, transmission and phase contrast, in order to compare the results and evaluate the viability of applying the same quantification methodology to both techniques. All experiments were performed at the ELETTRA Synchrotron Light Laboratory in Trieste (Italy). MicroCT data sets were collected using the CT set-up on the SYRMEP (Synchrotron Radiation for Medical Physics) beamline. Results showed a better correlation between histomorphometric parameters of the two techniques when morphological filters were used. However, with these filters some important information given by phase contrast is lost; this should be explored with new quantification techniques
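
    For readers unfamiliar with 3D histomorphometric quantification, the sketch below computes the simplest index, bone volume fraction (BV/TV), from a binarized microCT volume; the synthetic volume, threshold and voxel size are illustrative assumptions, not values from this study.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Stand-in for a reconstructed microCT volume (grey values in [0, 1]).
    volume = rng.random((128, 128, 128))

    # Global threshold separating bone from background; in practice this step
    # (or a morphological filter) is where transmission and phase-contrast
    # reconstructions need different treatment.
    bone = volume > 0.7

    bv_tv = bone.mean()            # bone voxels / total voxels = BV/TV
    voxel_size_mm = 0.014          # e.g. 14 um isotropic voxels
    bone_volume_mm3 = bone.sum() * voxel_size_mm ** 3
    print(f"BV/TV = {bv_tv:.3f}, bone volume = {bone_volume_mm3:.1f} mm^3")
    ```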

  14. Forest Carbon Leakage Quantification Methods and Their Suitability for Assessing Leakage in REDD

    Directory of Open Access Journals (Sweden)

    Sabine Henders

    2012-01-01

    Full Text Available This paper assesses quantification methods for carbon leakage from forestry activities for their suitability in leakage accounting in a future Reducing Emissions from Deforestation and Forest Degradation (REDD mechanism. To that end, we first conducted a literature review to identify specific pre-requisites for leakage assessment in REDD. We then analyzed a total of 34 quantification methods for leakage emissions from the Clean Development Mechanism (CDM, the Verified Carbon Standard (VCS, the Climate Action Reserve (CAR, the CarbonFix Standard (CFS, and from scientific literature sources. We screened these methods for the leakage aspects they address in terms of leakage type, tools used for quantification and the geographical scale covered. Results show that leakage methods can be grouped into nine main methodological approaches, six of which could fulfill the recommended REDD leakage requirements if approaches for primary and secondary leakage are combined. The majority of methods assessed, address either primary or secondary leakage; the former mostly on a local or regional and the latter on national scale. The VCS is found to be the only carbon accounting standard at present to fulfill all leakage quantification requisites in REDD. However, a lack of accounting methods was identified for international leakage, which was addressed by only two methods, both from scientific literature.

  15. The policy trail methodology

    DEFF Research Database (Denmark)

    Holford, John; Larson, Anne; Melo, Susana

    In recent years, the “policy trail” has been proposed as a methodology appropriate to the shifting and fluid governance of lifelong learning in the late modern world (Holford et al. 2013, Holford et al. 2013, Cort 2014). The contemporary environment is marked by multi-level governance (global ...) ... of ‘policy trail’, arguing that it can overcome ‘methodological nationalism’ and link structure and agency in research on the ‘European educational space’. The ‘trail’ metaphor, she suggests, captures the intentionality and the erratic character of policy. The trail connects sites and brings about change, but – although policy may be intended to be linear, with specific outcomes – policy often has to bend, and sometimes meets insurmountable obstacles. This symposium outlines and develops the methodology, but also reports on research undertaken within a major FP7 project (LLLight’in’Europe, 2012-15) which made use ...

  16. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching, writing speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher: as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l

  17. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    Nursing research is often concerned with lived experiences in human life using phenomenological and hermeneutic approaches. These empirical studies may use different creative expressions and art-forms to describe and enhance an embodied and personalised understanding of lived experiences. Drawing on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable to gain a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss how this ... may support a respectful renewal of phenomenological research traditions in nursing research.

  18. Assessing the carbon benefit of saltmarsh restoration

    Science.gov (United States)

    Taylor, Benjamin; Paterson, David; Hanley, Nicholas

    2016-04-01

    The quantification of carbon sequestration rates in coastal ecosystems is required to better realise their potential role in climate change mitigation. Through accurate valuation this service can be fully appreciated, perhaps helping to facilitate efforts to restore vulnerable ecosystems such as saltmarshes. Vegetated coastal ecosystems are suggested to account for approximately 50% of oceanic sedimentary carbon despite their 2% areal extent. Saltmarshes, conservatively estimated to store 430 ± 30 Tg C in surface sediment deposits, have experienced extensive decline in the recent past through processes such as land use change and coastal squeeze. Saltmarsh habitats offer a range of services that benefit society and the natural world, making their conservation meaningful and beneficial. The associated costs of restoration projects could, in part, be subsidised through payment for ecosystem services, specifically Blue carbon. Additional storage is generated through the (re)vegetation of mudflat areas, leading to an altered ecosystem state and function and providing benefits similar to those of natural saltmarsh areas. The Eden Estuary, Fife, Scotland has been a site of saltmarsh restoration since 2000, providing a temporal and spatial scale on which to evaluate these additional benefits. The study is being conducted to quantify the carbon benefit of restoration efforts and provide an insight into the evolution of this benefit across sites of different ages. Seasonal sediment deposition and settlement rates are measured across the estuary in mudflat, young planted saltmarsh, old planted saltmarsh and extant high marsh areas. Carbon values are derived from loss-on-ignition organic content measurements. Samples are taken across a tidal cycle on a seasonal basis, providing data on tidal influence, vegetation condition effects and climatic factors on sedimentation and carbon sequestration rates. These data will inform on the annual characteristics of sedimentary processes in the estuary and be
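
    A back-of-envelope sketch of how such sediment measurements translate into a carbon burial rate; the accretion rate, bulk density and loss-on-ignition-to-carbon conversion factor below are illustrative assumptions (studies typically calibrate the conversion locally).

    ```python
    # Hypothetical plot measurements.
    accretion_rate_cm_yr = 0.5      # vertical sediment accretion
    dry_bulk_density_g_cm3 = 0.8    # of deposited sediment
    loi_fraction = 0.12             # loss on ignition (organic matter fraction)

    # A common approximation takes organic carbon as roughly half of organic
    # matter; the exact factor should come from a local calibration.
    oc_fraction = 0.5 * loi_fraction

    # g C per cm^2 per year, converted to t C per hectare per year.
    c_burial_g_cm2_yr = accretion_rate_cm_yr * dry_bulk_density_g_cm3 * oc_fraction
    c_burial_t_ha_yr = c_burial_g_cm2_yr * 1e8 / 1e6  # cm^2 -> ha, g -> t
    print(f"Carbon burial ~ {c_burial_t_ha_yr:.2f} t C/ha/yr")
    ```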

  19. Quantification of margins and mixed uncertainties using evidence theory and stochastic expansions

    International Nuclear Information System (INIS)

    Shah, Harsheel; Hosder, Serhat; Winter, Tyler

    2015-01-01

    The objective of this paper is to implement Dempster–Shafer Theory of Evidence (DSTE) in the presence of mixed (aleatory and multiple sources of epistemic) uncertainty to the reliability and performance assessment of complex engineering systems through the use of quantification of margins and uncertainties (QMU) methodology. This study focuses on quantifying the simulation uncertainties, both in the design condition and the performance boundaries along with the determination of margins. To address the possibility of multiple sources and intervals for epistemic uncertainty characterization, DSTE is used for uncertainty quantification. An approach to incorporate aleatory uncertainty in Dempster–Shafer structures is presented by discretizing the aleatory variable distributions into sets of intervals. In view of excessive computational costs for large scale applications and repetitive simulations needed for DSTE analysis, a stochastic response surface based on point-collocation non-intrusive polynomial chaos (NIPC) has been implemented as the surrogate for the model response. The technique is demonstrated on a model problem with non-linear analytical functions representing the outputs and performance boundaries of two coupled systems. Finally, the QMU approach is demonstrated on a multi-disciplinary analysis of a high speed civil transport (HSCT). - Highlights: • Quantification of margins and uncertainties (QMU) methodology with evidence theory. • Treatment of both inherent and epistemic uncertainties within evidence theory. • Stochastic expansions for representation of performance metrics and boundaries. • Demonstration of QMU on an analytical problem. • QMU analysis applied to an aerospace system (high speed civil transport)
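
    The Dempster-Shafer bookkeeping behind such a QMU analysis can be illustrated in a few lines; the focal intervals, masses and the margin definition below are invented for illustration, not taken from the paper. Interval arithmetic over all combinations of focal elements yields belief and plausibility that the margin stays positive.

    ```python
    from itertools import product

    # Dempster-Shafer structure per epistemic input: list of (interval, mass).
    inputs = {
        "load":     [((0.8, 1.0), 0.6), ((0.9, 1.3), 0.4)],
        "capacity": [((1.2, 1.6), 0.7), ((1.0, 1.4), 0.3)],
    }

    def margin_interval(load_iv, cap_iv):
        """Margin = capacity - load, propagated through interval arithmetic."""
        (lo_l, hi_l), (lo_c, hi_c) = load_iv, cap_iv
        return (lo_c - hi_l, hi_c - lo_l)

    belief = plausibility = 0.0
    for (l_iv, l_m), (c_iv, c_m) in product(inputs["load"], inputs["capacity"]):
        m = l_m * c_m                  # joint mass of this focal element
        lo, hi = margin_interval(l_iv, c_iv)
        if lo > 0:                     # margin certainly positive on this element
            belief += m
        if hi > 0:                     # margin possibly positive on this element
            plausibility += m

    print(f"Bel(margin > 0) = {belief:.2f}, Pl(margin > 0) = {plausibility:.2f}")
    ```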

  20. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.

  1. A risk-based sensor placement methodology

    International Nuclear Information System (INIS)

    Lee, Ronald W.; Kulesz, James J.

    2008-01-01

    A risk-based sensor placement methodology is proposed to solve the problem of optimal location of sensors to protect population against the exposure to, and effects of, known and/or postulated chemical, biological, and/or radiological threats. Risk is calculated as a quantitative value representing population at risk from exposure at standard exposure levels. Historical meteorological data are used to characterize weather conditions as the frequency of wind speed and direction pairs. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate risk values. Sensor locations are determined via an iterative dynamic programming algorithm whereby threats detected by sensors placed in prior iterations are removed from consideration in subsequent iterations. In addition to the risk-based placement algorithm, the proposed methodology provides a quantification of the marginal utility of each additional sensor. This is the fraction of the total risk accounted for by placement of the sensor. Thus, the criteria for halting the iterative process can be the number of sensors available, a threshold marginal utility value, and/or a minimum cumulative utility achieved with all sensors
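
    A minimal sketch of the iterative placement idea described in this record, using a greedy stand-in for the paper's algorithm; the risk grid and circular coverage model are invented for illustration (the paper derives risk from dispersion modeling). Each step places the sensor covering the most remaining risk, reports its marginal utility, and removes the covered risk from consideration.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Risk surface: population-at-risk per grid cell (illustrative random field).
    risk = rng.gamma(2.0, 1.0, size=(40, 40))
    total_risk = risk.sum()

    def coverage(center, shape, radius=5):
        """Boolean mask of cells within `radius` of a candidate sensor site."""
        yy, xx = np.ogrid[:shape[0], :shape[1]]
        return (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2

    remaining = risk.copy()
    for k in range(1, 6):                      # place up to 5 sensors
        best_site, best_gain = None, -1.0
        for i in range(0, 40, 2):              # coarse candidate grid
            for j in range(0, 40, 2):
                gain = remaining[coverage((i, j), remaining.shape)].sum()
                if gain > best_gain:
                    best_site, best_gain = (i, j), gain
        remaining[coverage(best_site, remaining.shape)] = 0.0
        print(f"sensor {k} at {best_site}: marginal utility "
              f"{best_gain / total_risk:.1%} of total risk")
    ```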

  2. Identification and Quantification of the Major Constituents in Egyptian Carob Extract by Liquid Chromatography–Electrospray Ionization-Tandem Mass Spectrometry

    Science.gov (United States)

    Owis, Asmaa Ibrahim; El-Naggar, El-Motaz Bellah

    2016-01-01

    Background: Carob - Ceratonia siliqua L., commonly known as St John's-bread or locust bean, family Fabaceae - is one of the most useful native Mediterranean trees. There are no published data on high performance liquid chromatography (HPLC) methods for determining polyphenols in Egyptian carob pods. Objective: To establish a sensitive and specific liquid chromatography-electrospray ionization (ESI)-tandem mass spectrometry (MSn) methodology for the identification of the major constituents in Egyptian carob extract. Materials and Methods: HPLC with diode array detection and ESI-mass spectrometry (MS) was developed for the identification and quantification of phenolic acids, flavonoid glycosides, and aglycones in the methanolic extract of Egyptian C. siliqua. The MS and MSn data together with the HPLC retention times of phenolic components allowed structural characterization of these compounds. Peak integration of ions in the MS scans was used for quantification. Results: A total of 36 compounds were tentatively identified. Twenty-six compounds were identified in the negative mode, corresponding to 85.4% of plant dry weight, while ten compounds were identified in the positive mode, representing 16.1% of plant dry weight, with a prevalence of flavonoids (75.4% of plant dry weight) predominantly represented by two methylapigenin-O-pentoside isomers (20.9 and 13.7% of plant dry weight). Conclusion: The identification of the various compounds present in carob pods opens a new door to an increased understanding of the different health benefits brought about by the consumption of carob and its products. SUMMARY This research provides a good example of the rapid identification of major constituents in complex systems such as herbs using a sensitive, accurate and specific method coupling HPLC with DAD and MS, which facilitates the clarification of the phytochemical composition of herbal medicines for better understanding of their nature and

  3. Spanish methodological approach for biosphere assessment of radioactive waste disposal

    International Nuclear Information System (INIS)

    Agueero, A.; Pinedo, P.; Cancio, D.; Simon, I.; Moraleda, M.; Perez-Sanchez, D.; Trueba, C.

    2007-01-01

    The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS 'Reference Biospheres Methodology' and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates

  4. Spanish methodological approach for biosphere assessment of radioactive waste disposal.

    Science.gov (United States)

    Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C

    2007-10-01

    The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates.

  5. Contraceptives with novel benefits.

    Science.gov (United States)

    Su, Ying; Lian, Qing-Quan; Ge, Ren-Shan

    2012-01-01

    Progesterone receptor (PR) agonists (progestins) and antagonists are developed for female contraceptives. However, non-contraceptive applications of newer progestins and PR modulators are being given more attention. The newer PR agonists including drospirenone, nomegestrol, trimegestone, dienogest and nestorone are being evaluated as contraceptives with health benefits because of their unique pharmacological properties. The selective PR modulators (SPRM; PR antagonists with PR agonistic properties) are under development not only for emergency contraception but also for other health benefits such as the treatment of endometritis and leiomyoma. After searching the literature from PubMed, clinicaltrials.gov and patent database, this review focuses on the effects and mechanisms of these progestins, and SPRMs as contraceptives with other health benefits. PR agonists and antagonists that have novel properties may generate better contraceptive effects with other health benefits.

  6. Benefits of being biased!

    Indian Academy of Sciences (India)

    Administrator

    Research commentary: Benefits of being biased! Sutirth Dey, Evolutionary Biology Laboratory, Evolutionary & Organismal Biology Unit, Jawaharlal Nehru Centre for Advanced Scientific Research. Journal of Genetics, Vol. 83, No. 2, August 2004. Keywords: codon bias; alcohol dehydrogenase; Darwinian ...

  7. Benefits of CHP Partnership

    Science.gov (United States)

    Learn about the benefits of being an EPA CHP Partner, which include expert advice and answers to questions, CHP news, marketing resources, publicity and recognition, and being associated with EPA through a demonstrated commitment to CHP.

  8. Low Cost Benefit Suggestions.

    Science.gov (United States)

    Doyel, Hoyt W.; McMillan, John D.

    1980-01-01

    Outlines eight low-cost employee benefits and summarizes their relative advantages. The eight include a stock ownership program, a sick leave pool, flexible working hours, production incentives, and group purchase plans. (IRT)

  9. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    ... means of their computer information systems. Disrupt - this type of attack focuses on disruption, as “attackers might surreptitiously reprogram enemy ...” ... by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective ... between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that

  10. SCI Hazard Report Methodology

    Science.gov (United States)

    Mitchell, Michael S.

    2010-01-01

    This slide presentation reviews the methodology in creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components (1) Pyro Separation Systems (2) Main Propulsion System (3) Reaction and Roll Control Systems (4) Thrust Vector Control System and (5) Ullage Settling Motor System components.

  11. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal oriented decomposition of the plant purpose into the means of achieving the purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follow directly...

  12. Complicating Methodological Transparency

    Science.gov (United States)

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  13. Methodological Advances in DEA

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  14. NUSAM Methodology for Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Snell, Mark K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-07-01

    This document provides a methodology for the performance-based assessment of security systems designed for the protection of nuclear and radiological materials and the processes that produce and/or involve them. It is intended for use with both relatively simple installations and with highly regulated complex sites with demanding security requirements.

  15. MIRD methodology. Part 1

    International Nuclear Information System (INIS)

    Rojo, Ana M.

    2004-01-01

    This lecture develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this first part, the basic concepts and the main equations are presented. The ICRP Dosimetric System is also explained. (author)

  16. Response Surface Methodology

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2014-01-01

    Abstract: This chapter first summarizes Response Surface Methodology (RSM), which started with Box and Wilson’s article in 1951 on RSM for real, non-simulated systems. RSM is a stepwise heuristic that uses first-order polynomials to approximate the response surface locally. An estimated polynomial

  17. MIRD methodology. Part 2

    International Nuclear Information System (INIS)

    Gomez Parada, Ines

    2004-01-01

    This paper develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this second part, different methods for the calculation of the accumulated activity are presented, together with the effective half life definition. Different forms of Retention Activity curves are also shown. (author)

  18. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D- and DL-penicillamine were demonstrated by Terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low frequency resonant modes, density functional theory (DFT) was adopted for theoretical calculation. It was found that the collective THz frequency motions were determined by the intramolecular and intermolecular hydrogen bond interactions. Moreover, the quantification of penicillamine enantiomer mixtures was demonstrated by a THz spectra fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in the pharmaceutical industry. (authors)
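
    The mixture quantification step amounts to fitting the measured spectrum as a weighted sum of pure-component spectra; the sketch below uses synthetic Gaussian "spectra" and ordinary least squares as a generic stand-in for the authors' fitting procedure.

    ```python
    import numpy as np

    freq = np.linspace(0.2, 2.5, 300)  # THz axis

    def peak(center, width=0.08, amp=1.0):
        return amp * np.exp(-((freq - center) / width) ** 2)

    # Stand-ins for measured pure L- and D-penicillamine absorption spectra.
    spec_L = peak(0.9) + peak(1.7, amp=0.6)
    spec_D = peak(1.1) + peak(2.0, amp=0.8)

    # Synthetic 70:30 mixture with measurement noise.
    rng = np.random.default_rng(2)
    mixture = 0.7 * spec_L + 0.3 * spec_D + rng.normal(0, 0.01, freq.size)

    # Least-squares fit of the mixture as w_L * spec_L + w_D * spec_D.
    A = np.column_stack([spec_L, spec_D])
    weights, *_ = np.linalg.lstsq(A, mixture, rcond=None)
    w = weights / weights.sum()        # normalize to mole fractions
    print(f"Estimated composition: L = {w[0]:.1%}, D = {w[1]:.1%}")
    ```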

  19. Benefits at risk

    DEFF Research Database (Denmark)

    Lassen, Jesper; Sandøe, Peter

    2007-01-01

    Herbicide-resistant GM plants have been promoted as a tool in the development of more environment-friendly agriculture. The environmental benefits here, however, depend not only on farmers' acceptance of GM crops as such, but also on their willingness to use herbicides in accordance with altered ... spraying plans. In this paper, we will argue that factors driving the spraying practices of Danish farmers may hamper efforts to secure the environmental benefits of the new crops....

  20. Benefits for handicapped children

    CERN Multimedia

    2003-01-01

    The introduction of long-term care benefits within the CERN Health Insurance Scheme requires the coordination of the benefits foreseen for handicapped children. Measures were adopted by the Management following the recommendation made by the Standing Concertation Committee on 26 March 2003. A document clarifying these measures is available on the Web at the following address: http://humanresources.web.cern.ch/humanresources/external/soc/Social_affairs/social_affairs.asp Social Affairs Service 74201

  1. A solar reserve methodology for renewable energy integration studies based on sub-hourly variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Eduardo; Brinkman, Gregory; Hummon, Marissa [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lew, Debra

    2012-07-01

    Increasing penetrations of wind and solar energy are raising concerns among electric system operators because of the variability and uncertainty associated with these power sources. Previous work focused on the quantification of reserves for systems with wind power. This paper presents a new methodology that allows the determination of the reserves necessary for high penetrations of photovoltaic power and compares it to the wind-based methodology. The solar reserve methodology was applied to Phase 2 of the Western Wind and Solar Integration Study. A summary of the results is included. (orig.)
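
    A rough sketch of the variability-based reserve idea; the synthetic 10-minute PV profile and the 95th-percentile sizing rule below are illustrative assumptions, not the method used in the study. Reserves are sized from the distribution of short-term changes in solar output.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic 10-minute PV output for 30 days: clear-sky shape + cloud noise.
    steps_per_day = 144
    t = np.arange(30 * steps_per_day)
    clear_sky = np.clip(np.sin(np.pi * (t % steps_per_day) / steps_per_day), 0, None)
    pv = np.clip(clear_sky * (1 - 0.3 * rng.random(t.size)), 0, None)  # per MW installed

    # Sub-hourly variability: changes in output between consecutive 10-min steps.
    deltas = np.diff(pv)
    down_ramps = -deltas[deltas < 0]

    # Size upward reserve to cover 95% of observed downward solar ramps.
    reserve = np.percentile(down_ramps, 95)
    print(f"Reserve requirement ~ {reserve:.3f} MW per MW of installed PV")
    ```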

  2. New approach for the quantification of processed animal proteins in feed using light microscopy.

    Science.gov (United States)

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  3. Guidebook in using Cost Benefit Analysis and strategic environmental assessment for environmental planning in China

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Environmental planning in China may benefit from greater use of Cost Benefit Analysis (CBA) and Strategic Environmental Assessment (SEA) methodologies. We provide guidance on using these methodologies. Parts I and II present the principles behind the methodologies as well as their theoretical structure. Part III demonstrates the methodologies in action in a range of good-practice examples. The case studies and theoretical expositions are intended to teach by way of example as well as by understanding of the principles, and to help planners use the methodologies as correctly as possible. (auth)

  4. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  5. Cost-Benefit Analysis of Smart Grids Implementation

    International Nuclear Information System (INIS)

    Tomsic, Z.; Pongrasic, M.

    2014-01-01

    This paper presents guidelines for conducting the cost-benefit analysis of Smart Grid projects connected to the implementation of advanced technologies in the electric power system. Limitations of present electric power networks are discussed, along with the solutions offered by advanced networks. From an economic point of view, the main characteristic of an advanced electric power network is a large up-front investment, with benefits realized only after some time and at risk of being smaller than expected. It is therefore important to make a comprehensive analysis of these projects, consisting of both economic and qualitative analysis. This report relies on the methodology developed by the Electric Power Research Institute (EPRI) in the United States. The methodology is comprehensive and useful, but also simple and easy to understand. The steps of this methodology are explained, along with the main characteristics of methodologies that build on it: the methodology developed at the Joint Research Centre and methodologies for analysing the implementation of smart meters in electric power networks. Costs, benefits and the categories into which they can be classified are also defined. As part of the qualitative analysis, the social aspect of Smart Grid projects is described. In defining costs, special attention has to be paid to projects integrating electricity from variable renewable energy sources into the power system because of the additional costs involved; this work summarizes the categories of such additional costs. At the end of the report, an overview is given of what has been done and what is planned in the European Union. (author).
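
    The core arithmetic of such a cost-benefit analysis is a discounted cash-flow comparison; the sketch below nets hypothetical annual benefits against investment and operating costs (all figures and the discount rate are invented for illustration, not EPRI values).

    ```python
    def npv(cashflows, rate=0.06):
        """Net present value of yearly cash flows, year 0 first."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    years = 15
    capex = 10_000_000     # year-0 smart grid investment
    opex = 250_000         # added yearly operating cost
    benefits = 1_400_000   # yearly: avoided losses, reliability, deferred capacity

    cashflows = [-capex] + [benefits - opex] * years
    result = npv(cashflows)
    print(f"NPV over {years} years: {result:,.0f}")
    print("Project is economic" if result > 0 else "Project is not economic")
    ```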

  6. Defined contribution health benefits.

    Science.gov (United States)

    Fronstin, P

    2001-03-01

    This Issue Brief discusses the emerging issue of "defined contribution" (DC) health benefits. The term "defined contribution" is used to describe a wide variety of approaches to the provision of health benefits, all of which have in common a shift in the responsibility for payment and selection of health care services from employers to employees. DC health benefits often are mentioned in the context of enabling employers to control their outlay for health benefits by avoiding increases in health care costs. DC health benefits may also shift responsibility for choosing a health plan and the associated risks of choosing a plan from employers to employees. There are three primary reasons why some employers currently are considering some sort of DC approach. First, they are once again looking for ways to keep their health care cost increases in line with overall inflation. Second, some employers are concerned that the public "backlash" against managed care will result in new legislation, regulations, and litigation that will further increase their health care costs if they do not distance themselves from health care decisions. Third, employers have modified not only most employee benefit plans, but labor market practices in general, by giving workers more choice, control, and flexibility. DC-type health benefits have existed as cafeteria plans since the 1980s. A cafeteria plan gives each employee the opportunity to determine the allocation of his or her total compensation (within employer-defined limits) among various employee benefits (primarily retirement or health). Most types of DC health benefits currently being discussed could be provided within the existing employment-based health insurance system, with or without the use of cafeteria plans. They could also allow employees to purchase health insurance directly from insurers, or they could drive new technologies and new forms of risk pooling through which health care services are provided and financed. DC health

  7. Reference Materials for Calibration of Analytical Biases in Quantification of DNA Methylation.

    Science.gov (United States)

    Yu, Hannah; Hahn, Yoonsoo; Yang, Inchul

    2015-01-01

    Most contemporary methods for the quantification of DNA methylation employ bisulfite conversion and PCR amplification. However, many reports have indicated that bisulfite-mediated PCR methodologies can result in inaccurate measurements of DNA methylation owing to amplification biases. To calibrate analytical biases in quantification of gene methylation, especially those that arise during PCR, we utilized reference materials that represent exact bisulfite-converted sequences with 0% and 100% methylation status of specific genes. After determining relative quantities using qPCR, pairs of plasmids were gravimetrically mixed to generate working standards with predefined DNA methylation levels at 10% intervals in terms of mole fractions. The working standards were used as controls to optimize the experimental conditions and also as calibration standards in melting-based and sequencing-based analyses of DNA methylation. Use of the reference materials enabled precise characterization and proper calibration of various biases during PCR and subsequent methylation measurement processes, resulting in accurate measurements.
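
    A minimal sketch of how such working standards can calibrate amplification bias: fit measured against nominal methylation on the standards, then invert the fit for unknowns. The simulated "measured" values and the quadratic fit below are illustrative assumptions, not data from the paper.

    ```python
    import numpy as np

    # Nominal methylation of the gravimetrically mixed standards (mole fraction,
    # 10% intervals as in the paper).
    nominal = np.arange(0.0, 1.01, 0.1)

    # Hypothetical measured values showing a typical amplification bias.
    measured = np.array([0.00, 0.06, 0.13, 0.21, 0.30, 0.40,
                         0.51, 0.62, 0.74, 0.87, 1.00])

    # Calibration curve: map biased measurements back to nominal levels.
    coeffs = np.polyfit(measured, nominal, deg=2)

    def corrected(m):
        """Return the calibrated methylation level for a raw measurement."""
        return float(np.polyval(coeffs, m))

    print(f"Raw 0.35 -> calibrated {corrected(0.35):.2f}")
    ```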

  8. Climate and desertification: indicators for an assessment methodology

    International Nuclear Information System (INIS)

    Sciortino, M.; Caiaffa, E.; Fattoruso, G.; Donolo, R.; Salvetti, G.

    2009-01-01

    This work aims to define a methodology that, on the basis of commonly available surface climate records, assesses indicators of the increase or decrease of the extension of territories vulnerable to desertification and land degradation. The definition and quantification of policy-relevant environmental indicators aims to improve understanding and decision-making processes in drylands. The results of this study show that since 1931 changes in climate have involved 90% of the territory of the Sicilian region, with stronger intensity in the internal areas of the Enna, Caltanissetta and Palermo provinces. (Author) 9 refs.

  9. Measuring Identification and Quantification Errors in Spectral CT Material Decomposition

    Directory of Open Access Journals (Sweden)

    Aamir Younis Raja

    2018-03-01

    Full Text Available Material decomposition methods are used to identify and quantify multiple tissue components in spectral CT, but there is no published method to quantify the misidentification of materials. This paper describes a new method for assessing misidentification and mis-quantification in spectral CT. We scanned a phantom containing gadolinium (1, 2, 4, 8 mg/mL), hydroxyapatite (54.3, 211.7, 808.5 mg/mL), water and vegetable oil using a MARS spectral scanner equipped with a poly-energetic X-ray source operated at 118 kVp and a CdTe Medipix3RX camera. Two imaging protocols were used, with and without a 0.375 mm external brass filter. A proprietary material decomposition method identified voxels as gadolinium, hydroxyapatite, lipid or water. Sensitivity and specificity information was used to evaluate material misidentification. Biological samples were also scanned. There were marked differences in identification and quantification between the two protocols, even though the spectral and linear correlation of gadolinium and hydroxyapatite in the reconstructed images was high and no qualitative segmentation differences in the material decomposed images were observed. At 8 mg/mL, gadolinium was correctly identified by both protocols, but its concentration was underestimated by over half with the unfiltered protocol. At 1 mg/mL, gadolinium was misidentified in 38% of voxels with the filtered protocol and 58% of voxels with the unfiltered protocol. Hydroxyapatite was correctly identified at the two higher concentrations by both protocols, but mis-quantified with the unfiltered protocol. Gadolinium concentration as measured in the biological specimen showed a two-fold difference between protocols. In future, this methodology could be used to compare and optimize scanning protocols, image reconstruction methods, and methods for material differentiation in spectral CT.
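
    The misidentification metrics described here reduce to per-material confusion counts over labeled voxels; the sketch below uses synthetic labels (not MARS data) to show the sensitivity/specificity bookkeeping.

    ```python
    import numpy as np

    materials = ["gadolinium", "hydroxyapatite", "lipid", "water"]
    rng = np.random.default_rng(5)

    # Ground-truth material label per voxel and a noisy "decomposed" label:
    # relabel ~8% of voxels at random (some keep their true label by chance).
    truth = rng.integers(0, 4, size=20_000)
    decoded = truth.copy()
    flip = rng.random(truth.size) < 0.08
    decoded[flip] = rng.integers(0, 4, size=flip.sum())

    for k, name in enumerate(materials):
        tp = np.sum((truth == k) & (decoded == k))
        fn = np.sum((truth == k) & (decoded != k))
        fp = np.sum((truth != k) & (decoded == k))
        tn = np.sum((truth != k) & (decoded != k))
        print(f"{name:14s} sensitivity {tp/(tp+fn):.3f}  specificity {tn/(tn+fp):.3f}")
    ```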

  10. Genomic DNA-based absolute quantification of gene expression in Vitis.

    Science.gov (United States)

    Gambetta, Gregory A; McElrone, Andrew J; Matthews, Mark A

    2013-07-01

Many studies in which gene expression is quantified by polymerase chain reaction represent the expression of a gene of interest (GOI) relative to that of a reference gene (RG). Relative expression is founded on the assumptions that RG expression is stable across samples, treatments, organs, etc., and that the reaction efficiencies of the GOI and RG are equal; assumptions that are often faulty. The true variability in RG expression and the actual reaction efficiencies are seldom determined experimentally. Here we present a rapid and robust method for absolute quantification of expression in Vitis in which varying concentrations of genomic DNA are used to construct GOI standard curves. This methodology was utilized to absolutely quantify and determine the variability of the previously validated RG ubiquitin (VvUbi) across three test studies in three different tissues (roots, leaves and berries). In addition, in each study a GOI was absolutely quantified. Data sets resulting from relative and absolute methods of quantification were compared and the differences were striking. VvUbi expression was significantly different in magnitude between test studies and variable among individual samples. Absolute quantification consistently reduced the coefficients of variation of the GOIs by more than half, often resulting in differences in statistical significance and in some cases even changing the fundamental nature of the result. Utilizing genomic DNA-based absolute quantification is fast and efficient. By eliminating the error introduced by assuming RG stability and equal reaction efficiencies between the RG and GOI, this methodology produces less variation, increased accuracy and greater statistical power. © 2012 Scandinavian Plant Physiology Society.
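    A sketch of the arithmetic behind a genomic-DNA standard curve: known gDNA masses are converted to copy numbers from the genome size, a log-linear curve is fitted to their Cq values, and sample Cq values are read off the curve, with no reference gene involved. The genome size, dilutions and Cq values below are illustrative assumptions.

```python
import numpy as np

# Copies per reaction from a gDNA mass, assuming one GOI locus per haploid genome:
# copies = mass(g) * Avogadro / (genome size in bp * 650 g/mol per bp)
GENOME_BP = 487e6                      # approximate Vitis vinifera genome size (assumption)

def copies_from_ng(ng):
    return ng * 1e-9 * 6.022e23 / (GENOME_BP * 650)

std_ng = np.array([10, 1, 0.1, 0.01])          # gDNA input per standard reaction
std_cq = np.array([22.1, 25.5, 28.9, 32.3])    # hypothetical Cq values
log_copies = np.log10(copies_from_ng(std_ng))

slope, intercept = np.polyfit(std_cq, log_copies, 1)   # linear fit in log space

def absolute_copies(cq):
    """Convert a sample Cq to absolute template copies via the standard curve."""
    return 10 ** (slope * cq + intercept)

print(absolute_copies(27.0))   # GOI template copies per reaction for a sample
```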

11. Benefits of Sharing Corpora when Analyzing Online Interactions: an Example of Methodology Related to a Databank of Learning and Teaching Corpora.

    Directory of Open Access Journals (Sweden)

    Maud Ciekanski

    2010-12-01

The study of online learning, whether aimed at understanding this form of situated human learning, at evaluating relevant pedagogical scenarios and settings or at improving technological environments, requires the availability of interaction data from all participants in the learning situations. However, usually data are either inaccessible or of limited access to those who were not involved in the original project. Moreover, data are fragmented and therefore decontextualized with respect to the original teaching/learning settings. Sometimes they are buried in a proprietary format within the technological environment. The consequence is that research lacks a scientific basis. In the literature, comparisons are often attempted between objects that are ill-defined and may in fact be different. The processes of scientific enquiry, such as re-analyzing, replicating, verifying, refuting or extending the original findings, are therefore disabled. To address this anomaly, we suggest creating and disseminating a new type of corpus, a contextualized learner corpus, entitled "LEarning and TEaching Corpus" (Letec). Such corpora include not only the data that correspond to the output of learner activity in online courses, but also their context. Sharing Letec corpora within the research community implies that: (1) corpora are formatted and structured according to a new model which is compatible with existing standards for corpora and for learning design specifications; (2) corpora are placed on a server offering cross-platform compatibility and free access; and (3) an ethics policy is formulated, as well as copyright licences. This paper presents the answers brought by our Mulce project from a theoretical and methodological standpoint. We give examples extracted from two learning and teaching corpora (Simuligne and Copéas). We show how data structured …

  12. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black-and-white densitometry (256 levels of intensity), separating subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
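    A minimal sketch of colour thresholding as described: convert RGB to HSV so hue is decoupled from intensity, then count the pixels whose hue and saturation fall inside a window. The hue window and the synthetic image are assumptions for illustration.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def quantify_stain(rgb_image, hue_lo, hue_hi, min_sat=0.2):
    """Fraction of pixels whose hue lies in [hue_lo, hue_hi] (hue scaled 0..1)."""
    hsv = rgb_to_hsv(rgb_image.astype(float) / 255.0)
    hue, sat = hsv[..., 0], hsv[..., 1]
    mask = (hue >= hue_lo) & (hue <= hue_hi) & (sat >= min_sat)
    return mask.mean(), mask

# Hypothetical image: reddish-brown reaction product on a pale background
img = np.full((100, 100, 3), 230, dtype=np.uint8)   # pale, unsaturated background
img[30:60, 30:60] = (150, 60, 40)                   # stained region (reddish hue)

fraction, mask = quantify_stain(img, hue_lo=0.0, hue_hi=0.1)
print(f"stained area fraction: {fraction:.3f}")     # 0.090 for this toy image
```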

  13. Recurrence quantification analysis in Liu's attractor

    International Nuclear Information System (INIS)

    Balibrea, Francisco; Caballero, M. Victoria; Molera, Lourdes

    2008-01-01

Recurrence Quantification Analysis is used to detect transitions from chaos to periodic states, or from chaos to chaos, in a new dynamical system proposed by Liu et al. This system contains a control parameter in the second equation and was originally introduced to investigate the forming mechanism of the compound structure of the chaotic attractor, which exists when the control parameter is zero.
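    The two core RQA measures, recurrence rate and determinism, can be sketched in a few lines for a scalar series; the threshold and the test signals below are placeholders, not Liu-system trajectories.

```python
import numpy as np

def run_lengths(line):
    """Lengths of consecutive True runs in a boolean 1-D array."""
    lengths, count = [], 0
    for v in line:
        if v:
            count += 1
        elif count:
            lengths.append(count)
            count = 0
    if count:
        lengths.append(count)
    return lengths

def rqa_measures(x, eps, lmin=2):
    """Recurrence rate (RR) and determinism (DET) of a scalar series."""
    n = len(x)
    R = np.abs(x[:, None] - x[None, :]) < eps       # recurrence matrix
    rr = R.mean()                                   # includes the trivial main diagonal
    lengths = []
    for k in range(1, n):                           # upper diagonals (matrix is symmetric)
        lengths += run_lengths(np.diag(R, k))
    total = sum(lengths)
    det = sum(l for l in lengths if l >= lmin) / total if total else 0.0
    return rr, det

t = np.arange(500)
periodic = np.sin(0.2 * t)                                   # placeholder periodic signal
noisy = np.random.default_rng(2).standard_normal(500)        # placeholder stochastic signal
print(rqa_measures(periodic, eps=0.1))   # high DET: deterministic dynamics
print(rqa_measures(noisy, eps=0.1))      # lower DET: noise
```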

  14. Quantification of coating aging using impedance measurements

    NARCIS (Netherlands)

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting

  15. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on Slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis). (author)

  16. Quantification analysis of CT for aphasic patients

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, S.; Ooyama, H.; Hojo, K.; Tasaki, H.; Hanazono, T.; Sato, T.; Metoki, H.; Totsuka, M.; Oosumi, N.

    1987-02-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis).

  17. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
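    A sketch of the band-selection step on simulated data: correlate the THC content against reflectance at each band and regress on the strongest band. The 695 nm optimum is the paper's finding; here it is simply wired into the toy data.

```python
import numpy as np

rng = np.random.default_rng(3)
wavelengths = np.arange(400, 1001, 5)              # nm, hypothetical sensor grid
n_samples = 60
reflectance = rng.random((n_samples, wavelengths.size))

# Simulated THC content (%), driven by the 695 nm band plus noise (toy assumption)
thc = (2.0 + 5.0 * reflectance[:, wavelengths == 695].ravel()
       + 0.2 * rng.standard_normal(n_samples))

# Rank bands by absolute correlation with THC content
corr = np.array([np.corrcoef(reflectance[:, i], thc)[0, 1]
                 for i in range(wavelengths.size)])
best = wavelengths[np.argmax(np.abs(corr))]
print(f"best single band: {best} nm (r = {corr[np.argmax(np.abs(corr))]:.2f})")

# Simple least-squares regression on the selected band
X = np.column_stack([np.ones(n_samples), reflectance[:, wavelengths == best].ravel()])
beta, *_ = np.linalg.lstsq(X, thc, rcond=None)
print("intercept, slope:", beta)
```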

  18. Quantification of glycyrrhizin biomarker in Glycyrrhiza glabra ...

    African Journals Online (AJOL)

    Background: A simple and sensitive thin-layer chromatographic method has been established for quantification of glycyrrhizin in Glycyrrhiza glabra rhizome and baby herbal formulations by validated Reverse Phase HPTLC method. Materials and Methods: RP-HPTLC Method was carried out using glass coated with RP-18 ...

  19. Noninvasive Quantification of Pancreatic Fat in Humans

    OpenAIRE

    Lingvay, Ildiko; Esser, Victoria; Legendre, Jaime L.; Price, Angela L.; Wertz, Kristen M.; Adams-Huet, Beverley; Zhang, Song; Unger, Roger H.; Szczepaniak, Lidia S.

    2009-01-01

    Objective: To validate magnetic resonance spectroscopy (MRS) as a tool for non-invasive quantification of pancreatic triglyceride (TG) content and to measure the pancreatic TG content in a diverse human population with a wide range of body mass index (BMI) and glucose control.

  20. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  1. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...

  2. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about the situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach, covering SSM's specific techniques, the learning-cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques and core tenets, described through a wide range of settings.

  3. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  4. Web survey methodology

    CERN Document Server

Callegaro, Mario; Vehovar, Vasja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  5. Steganography: LSB Methodology

    Science.gov (United States)

    2012-08-02

In computer science, steganography is the science of concealing information within other data so that the existence of the hidden message is not apparent. This progress report concerns the LSB (least significant bit) methodology and draws on the paper by J. Fridrich, M. Goljan and R. Du, "Reliable detection of LSB steganography in grayscale and color images" (in J. Dittmann, K. Nahrstedt and P. Wohlmacher, editors, Proceedings of the ACM, Special …).
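    For context, a minimal sketch of the LSB technique itself: message bits replace the least significant bit of each cover byte, changing any pixel intensity by at most one level.

```python
import numpy as np

def lsb_embed(pixels, message_bits):
    """Replace the least significant bit of each byte with a message bit."""
    out = pixels.copy()
    n = len(message_bits)
    out[:n] = (out[:n] & 0xFE) | message_bits
    return out

def lsb_extract(pixels, n_bits):
    """Read back the first n_bits least significant bits."""
    return pixels[:n_bits] & 1

cover = np.random.default_rng(4).integers(0, 256, size=64, dtype=np.uint8)
bits = np.unpackbits(np.frombuffer(b"hi", dtype=np.uint8))   # 16 message bits

stego = lsb_embed(cover, bits)
assert np.array_equal(lsb_extract(stego, bits.size), bits)
print(np.packbits(lsb_extract(stego, bits.size)).tobytes())  # b'hi'
```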

  6. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    Science.gov (United States)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS generates a comparatively low financial and spatial footprint compared with common fluorescence based systems. Despite the advantages of SERS, it has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNA concentrations commonly exist in relatively low concentrations, amplification methods (e.g. PCR) are therefore required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  7. Soil Radiological Characterisation Methodology

    International Nuclear Information System (INIS)

    Attiogbe, Julien; Aubonnet, Emilie; De Maquille, Laurence; De Moura, Patrick; Desnoyers, Yvon; Dubot, Didier; Feret, Bruno; Fichet, Pascal; Granier, Guy; Iooss, Bertrand; Nokhamzon, Jean-Guy; Ollivier Dehaye, Catherine; Pillette-Cousin, Lucien; Savary, Alain

    2014-12-01

This report presents the general methodology and best-practice approaches, combining proven existing techniques for sampling and characterisation, to assess the contamination of soils prior to remediation. It is based on feedback from projects conducted by the main French nuclear stakeholders involved in remediation and dismantling (EDF, CEA, AREVA and IRSN). Applying this methodology will enable project managers to obtain the elements necessary for drawing up the files associated with remediation operations, as required by the regulatory authorities. It is applicable to each of the steps necessary for piloting remediation work-sites, depending on the objectives targeted (release into the public domain, re-use, etc.). The main part describes the applied statistical methodology: exploratory and variogram analysis of the data, and identification and location of singular points. The results obtained support mapping of the contaminated surface and subsurface areas. The report traces the course of radiological site characterisation, from the initial investigations based on historical and functional analysis through to checking that the remediation objectives have been met. An example application follows, drawn from feedback on the remediation of a contaminated site at the Fontenay-aux-Roses facility. It is supplemented by a glossary of the main terms used in the field, taken from various publications and international standards. This technical report supports the ISO standard ISO/TC 85/SC 5 N 18557, 'Sampling and characterisation principles for soils, buildings and infrastructures contaminated by radionuclides for remediation purposes'. (authors)

  8. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

The objective of this work is to provide the tools necessary for clear identification of: the purpose of a probabilistic risk study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a nuclear power plant and the worth of optimizing a safety system for risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include comparing the system designs of the various vendors at the bidding stage, and performing grid reliability and human performance analyses using nation-specific data. (author)

  9. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

The objective of this work is to provide the tools necessary for clear identification of: the purpose of a probabilistic risk study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a nuclear power plant and the worth of optimizing a safety system for risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include comparing the system designs of the various vendors at the bidding stage, and performing grid reliability and human performance analyses using nation-specific data. (author)

  10. Quantification of analytes affected by relevant interfering signals under quality controlled conditions

    International Nuclear Information System (INIS)

    Bettencourt da Silva, Ricardo J.N.; Santos, Julia R.; Camoes, M. Filomena G.F.C.

    2006-01-01

The analysis of organic contaminants or residues in biological samples is frequently affected by the presence of compounds producing interfering instrumental signals. This feature is responsible for the higher complexity and cost of these analyses and/or for a significant reduction in the number of analytes studied in a multi-analyte method. This work presents a methodology to estimate the impact of interfering compounds on the quality of the analysis of complex samples, based on separative instrumental methods of analysis, aimed at supporting the inclusion of analytes affected by interfering compounds in the list of compounds analysed in the studied samples. The proposed methodology involves studying the magnitude of the signal produced by the interfering compounds in the analysed matrix, and is applicable to analytical systems affected by interfering compounds of varying concentration in the studied matrix. It is based on the comparison of the signals from a representative number of examples of the studied matrix, in order to estimate the impact of the presence of such compounds on measurement quality. The treatment of the chromatographic signals necessary to collect these data can be easily performed with the chromatographic signal subtraction algorithms available in most analytical instrumentation software. Subtracting the interfering-compound signal from the sample signal compensates for the interfering effect irrespective of the relative magnitudes of the interfering and analyte signals, supporting the applicability of the same method-performance model over a broader concentration range. The quantification of the measurement uncertainty was performed using the differential approach, which allows estimation of the contribution of the presence of the interfering compounds to the quality of the measurement. The proposed methodology was successfully applied to the analysis of …
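    A sketch of the signal-subtraction idea on synthetic chromatograms: a blank-matrix signal representing the interfering compound is subtracted from the sample signal before the analyte peak is integrated. Retention times, peak shapes and areas are invented.

```python
import numpy as np

t = np.linspace(0, 10, 2001)                       # retention time axis (min)

def peak(center, height, width=0.08):
    """Gaussian peak as a toy chromatographic signal."""
    return height * np.exp(-0.5 * ((t - center) / width) ** 2)

interferent = peak(5.02, 40)                       # matrix compound co-eluting near the analyte
analyte = peak(5.00, 25)

blank_matrix = interferent                         # signal measured on analyte-free matrix
sample = interferent + analyte                     # overlapping signals in the real sample

corrected = sample - blank_matrix                  # chromatogram subtraction
dt = t[1] - t[0]
print(f"corrected analyte area: {corrected.sum() * dt:.2f}")
print(f"true analyte area:      {analyte.sum() * dt:.2f}")
```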

  11. Consistent quantification of climate impacts due to biogenic carbon storage across a range of bio-product systems

    International Nuclear Information System (INIS)

    Guest, Geoffrey; Bright, Ryan M.; Cherubini, Francesco; Strømman, Anders H.

    2013-01-01

Temporary and permanent carbon storage from biogenic sources is seen as a way to mitigate climate change. The aim of this work is to illustrate the need to harmonize the quantification of such mitigation across all possible storage pools in the bio- and anthroposphere. We investigate nine alternative storage cases and a wide array of bio-resource pools: from annual crops, short rotation woody crops, medium rotation temperate forests, and long rotation boreal forests. For each feedstock type and biogenic carbon storage pool, we quantify the carbon cycle climate impact due to the skewed time distribution between emission and sequestration fluxes in the bio- and anthroposphere. Additional consideration of the climate impact from albedo changes in forests is also illustrated for the boreal forest case. When characterizing climate impact with global warming potentials (GWP), we find a large variance in results which is attributed to different combinations of biomass storage and feedstock systems. The storage of biogenic carbon in any storage pool does not always confer climate benefits: even when biogenic carbon is stored long-term in durable product pools, the climate outcome may still be undesirable when the carbon is sourced from slow-growing biomass feedstock. For example, when biogenic carbon from Norway Spruce from Norway is stored in furniture with a mean lifetime of 43 years, a climate change impact of 0.08 kg CO2eq per kg CO2 stored (100 year time horizon (TH)) would result. It was also found that when biogenic carbon is stored in a pool with negligible leakage to the atmosphere, the resulting GWP factor is not necessarily −1 kg CO2eq per kg CO2 stored. As an example, when biogenic CO2 from Norway Spruce biomass is stored in geological reservoirs with no leakage, we estimate a GWP of −0.56 kg CO2eq per kg CO2 stored (100 year TH) when albedo effects are also included. The large variance in GWPs across the range of resource and carbon storage …

  12. Consistent quantification of climate impacts due to biogenic carbon storage across a range of bio-product systems

    Energy Technology Data Exchange (ETDEWEB)

    Guest, Geoffrey, E-mail: geoffrey.guest@ntnu.no; Bright, Ryan M., E-mail: ryan.m.bright@ntnu.no; Cherubini, Francesco, E-mail: francesco.cherubini@ntnu.no; Strømman, Anders H., E-mail: anders.hammer.stromman@ntnu.no

    2013-11-15

Temporary and permanent carbon storage from biogenic sources is seen as a way to mitigate climate change. The aim of this work is to illustrate the need to harmonize the quantification of such mitigation across all possible storage pools in the bio- and anthroposphere. We investigate nine alternative storage cases and a wide array of bio-resource pools: from annual crops, short rotation woody crops, medium rotation temperate forests, and long rotation boreal forests. For each feedstock type and biogenic carbon storage pool, we quantify the carbon cycle climate impact due to the skewed time distribution between emission and sequestration fluxes in the bio- and anthroposphere. Additional consideration of the climate impact from albedo changes in forests is also illustrated for the boreal forest case. When characterizing climate impact with global warming potentials (GWP), we find a large variance in results which is attributed to different combinations of biomass storage and feedstock systems. The storage of biogenic carbon in any storage pool does not always confer climate benefits: even when biogenic carbon is stored long-term in durable product pools, the climate outcome may still be undesirable when the carbon is sourced from slow-growing biomass feedstock. For example, when biogenic carbon from Norway Spruce from Norway is stored in furniture with a mean lifetime of 43 years, a climate change impact of 0.08 kg CO2eq per kg CO2 stored (100 year time horizon (TH)) would result. It was also found that when biogenic carbon is stored in a pool with negligible leakage to the atmosphere, the resulting GWP factor is not necessarily −1 kg CO2eq per kg CO2 stored. As an example, when biogenic CO2 from Norway Spruce biomass is stored in geological reservoirs with no leakage, we estimate a GWP of −0.56 kg CO2eq per kg CO2 stored (100 year TH) when albedo effects are also included. The large variance in GWPs across the range of …

  13. Deserving social benefits?

    DEFF Research Database (Denmark)

    Esmark, Anders; Richardt Schoop, Sarah

    2017-01-01

The article contributes to the growing literature on framing of deservingness as an alternative to 'blame avoidance' strategies in the politics of welfare retrenchment. In particular, the article focuses on the interplay between political framing and media framing. Based on an analysis of two major welfare reforms involving reductions of social benefits in Denmark in 2005 and 2013, the article analyses the frames used by politicians supporting and opposing reform, as well as the frames used by the media. The article shows, first, that political reforms reducing social benefits are followed by increased framing of recipients as undeserving. The article finds a strong correlation between the political objective of reducing benefits and the reliance on frames that position recipients as undeserving. Second, the article shows that media framing remains significantly different from political framing.

  14. The green and blue water footprint of paper products: methodological considerations and quantification

    NARCIS (Netherlands)

    van Oel, P.R.; Hoekstra, Arjen Ysbert

    2010-01-01

    For a hardcopy of this report, printed in the Netherlands, an estimated 200 litres of water have been used. Water is required during different stages in the production process, from growing wood to processing pulp into the final consumer product. Most of the water is consumed in the forestry stage,

  15. Uncertainty quantification methodologies development for stress corrosion cracking of canister welds

    Energy Technology Data Exchange (ETDEWEB)

    Dingreville, Remi Philippe Michel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bryan, Charles R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-30

    This letter report presents a probabilistic performance assessment model to evaluate the probability of canister failure (through-wall penetration) by SCC. The model first assesses whether environmental conditions for SCC – the presence of an aqueous film – are present at canister weld locations (where tensile stresses are likely to occur) on the canister surface. Geometry-specific storage system thermal models and weather data sets representative of U.S. spent nuclear fuel (SNF) storage sites are implemented to evaluate location-specific canister surface temperature and relative humidity (RH). As the canister cools and aqueous conditions become possible, the occurrence of corrosion is evaluated. Corrosion is modeled as a two-step process: first, pitting is initiated, and the extent and depth of pitting is a function of the chloride surface load and the environmental conditions (temperature and RH). Second, as corrosion penetration increases, the pit eventually transitions to a SCC crack, with crack initiation becoming more likely with increasing pit depth. Once pits convert to cracks, a crack growth model is implemented. The SCC growth model includes rate dependencies on both temperature and crack tip stress intensity factor, and crack growth only occurs in time steps when aqueous conditions are predicted. The model suggests that SCC is likely to occur over potential SNF interim storage intervals; however, this result is based on many modeling assumptions. Sensitivity analyses provide information on the model assumptions and parameter values that have the greatest impact on predicted storage canister performance, and provide guidance for further research to reduce uncertainties.
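    A heavily simplified Monte Carlo sketch of the two-step logic described (aqueous conditions gate pitting, pits transition to cracks as they deepen, cracks grow until wall penetration); every rate, threshold and the toy cooling model below are placeholder assumptions, not the report's parameter values.

```python
import numpy as np

rng = np.random.default_rng(5)
YEARS, DT = 100, 0.1                                 # horizon and time step (yr)
WALL = 15.9                                          # wall thickness (mm), assumption
N = 10_000                                           # Monte Carlo realizations

depth = np.zeros(N)                                  # pit/crack depth (mm)
is_crack = np.zeros(N, dtype=bool)
failed = np.zeros(N, dtype=bool)

for step in range(int(YEARS / DT)):
    t = step * DT
    rh = 30 + 0.5 * t                                # RH rises as the canister cools (toy model)
    if rh <= 60:                                     # deliquescence threshold (assumption)
        continue                                     # no aqueous film: no corrosion this step
    # Step 1: pitting grows while aqueous conditions hold
    pits = ~is_crack & ~failed
    depth[pits] += rng.gamma(2.0, 0.002, np.sum(pits))
    # Pit-to-crack transition becomes more likely with pit depth
    p_transition = np.clip(depth / 2.0, 0, 1) * 0.1 * DT
    is_crack |= (rng.random(N) < p_transition) & ~failed
    # Step 2: crack growth, also gated by aqueous conditions
    cracks = is_crack & ~failed
    depth[cracks] += rng.gamma(2.0, 0.02, np.sum(cracks))
    failed |= depth >= WALL

print(f"P(through-wall SCC within {YEARS} yr) ≈ {failed.mean():.3f}")
```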

  16. Application of the RPN methodology for quantification of the operability of the quadruple-tank process

    Directory of Open Access Journals (Sweden)

    J.O. Trierweiler

    2002-04-01

    Full Text Available The RPN indicates how potentially difficult it is for a given system to achieve the desired performance robustly. It reflects both the attainable performance of a system and its degree of directionality. Two new indices, RPN ratio and RPN difference are introduced to quantify how realizable a given desired performance can be. The predictions made by RPN are verified by closed-loop simulations. These indices are applied to quantify the IO-controllability of the quadruple-tank process.

  17. Methodology for Gamma cameras calibration for I-131 uptake quantification in Hyperthyroidism diseases

    International Nuclear Information System (INIS)

    Lopez Diaz, A.; Palau San Pedro, A.; Martin Escuela, J. M.; Reynosa Montejo, R.; Castillo, J.; Torres Aroche, L.

    2015-01-01

Optimization and verification of patient-specific treatment planning with unsealed I-131 sources is a desirable goal from the medical and radiation protection points of view. To obtain a practical protocol combining the estimation of the related parameters with patient-specific treatment dosimetry in hyperthyroidism, three systems (an iodine uptake probe, a Philips Forte camera with pin-hole collimator, and a Mediso Nucline with HEGP collimator for planar and SPECT techniques) were studied and cross-calibrated. Linear behaviour over the diagnostic and therapeutic activity range was verified, with a linear correlation fitting factor R2 > 0.99. The differences between thyroid uptake determinations across all systems were less than 6% for therapeutic activities and less than 1.1% in the diagnostic range. A combined protocol was established and verified to calculate, with a single administration of I-131, all the parameters necessary for treatment dose estimation in 2D or 3D while minimizing gamma camera time. Following this protocol, the differences between apparent and calculated activities were less than 3%. (Author)

  18. BenefitClaimWebServiceBean/BenefitClaimWebService

    Data.gov (United States)

    Department of Veterans Affairs — A formal or informal request for a type of monetary or non-monetary benefit. This service provides benefit claims and benefit claim special issues data, allows the...

  19. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

Proliferation resistance is one of the requirements to be met by next-generation nuclear energy systems under GEN IV and INPRO. Internationally, work on PR evaluation methodology began as early as 1980, but systematic development only started in the 2000s. In Korea, to support the export of nuclear energy systems and to increase the international credibility and transparency of the domestic nuclear system and fuel cycle development, independent development of a PR evaluation methodology was started in 2007 as a long-term nuclear R and D project, and a model for the PR evaluation methodology is being developed. In the first year, a comparative study of GEN-IV/INPRO, development of PR indicators, quantification of the indicators, development of an evaluation model, and analysis of the technology system and international technology development trends were performed. In the second year, a feasibility study of the indicators, allowable limits for the indicators, and a review of the technical requirements of the indicators were carried out. The results of PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Through the development of this PR evaluation methodology, the methodology will be applied in the regulatory requirements for authorization and permission to be developed.

  20. CONSIDERATIONS REGARDING THE QUANTIFICATION OF THE BENEFITS OF A CLEAN AND HEALTHY ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    PAUL-BOGDAN ZAMFIR

    2012-09-01

Full Text Available Economic theory and practice reveal relationships of dependence between the degree of reduction of pollutant residues, on the one hand, and the cost, as well as the total positive effects, of pollution-control and pollution-reduction actions, on the other hand. Thus, from an ecological point of view, an action may be defined as economically efficient not only where it ensures achievement of the proposed objectives at minimum cost, but also where it ensures at least the preservation of the quality of the natural environment. For the environmental quality protection programme drawn up by enterprises, and included in their development strategy, to be operational, it must include a series of indicators such as: the permissible level of pollution of the environment with different substances, acceptable levels of contamination from the enterprise's products, the volume of expenditure involved in taking measures for the conservation and protection of the environment, the manner of including in production costs the expenses related to protecting the natural environment, etc.

  1. Enlisting Ecosystem Benefits: Quantification and Valuation of Ecosystem Services to Inform Installation Management

    Science.gov (United States)

    2015-05-27

The study area spans the Plateau to the north and the Coastal Plain to the south, which represent distinct features of topography, geology and soils, and vegetation communities. The valuation model accounts for the erosion protection provided by the presence of vegetation and by management practices, and can also value the landscape in terms of water quality.

  2. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation of existing computationally driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MATLAB® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  3. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  4. Quantification of glucosylceramide in plasma of Gaucher disease patients

    Directory of Open Access Journals (Sweden)

    Maria Viviane Gomes Muller

    2010-12-01

Full Text Available Gaucher disease is a sphingolipidosis that leads to an accumulation of glucosylceramide. The objective of this study was to develop a methodology, based on the extraction, purification and quantification of glucosylceramide from blood plasma, for use in clinical research laboratories. A comparison of the glucosylceramide content in plasma from Gaucher disease patients, whether or not submitted to enzyme replacement therapy, against that from normal individuals was also carried out. The glucosylceramide, separated from other glycosphingolipids by high-performance thin-layer chromatography (HPTLC), was chemically developed (CuSO4/H3PO4) and the respective band confirmed by immunostaining (human anti-glucosylceramide antibody / peroxidase-conjugated secondary antibody). Chromatogram quantification by densitometry demonstrated that the glucosylceramide content in Gaucher disease patients was seventeen times higher than that in normal individuals, and seven times higher than that in patients on enzyme replacement therapy. The results obtained indicate that the methodology established can be used in complementary diagnosis and for treatment monitoring of Gaucher disease patients.

  5. PENSION FUND BENEFITS SERVICE

    CERN Multimedia

    Benefits Service

    2002-01-01

Please note that from now on, our offices (5-1-030) will be open to members and beneficiaries on Tuesday, Wednesday and Thursday from 10 a.m. to 12 noon and from 3 to 5 p.m. We are otherwise available, but by appointment only. Benefits Service (tel. 79194 / 72738)

  6. PENSION FUND BENEFITS SERVICE

    CERN Multimedia

    Benefits Service

    2002-01-01

Please note that from now on, our offices will be open to members and beneficiaries on Tuesday, Wednesday and Thursday from 10 a.m. to 12 noon and from 3 to 5 p.m. We are otherwise available, but by appointment only. Benefits Service 5-1-030 tel. 79194 / 72738

  7. Bayesian benefits with JASP

    NARCIS (Netherlands)

    Marsman, M.; Wagenmakers, E.-J.

    2017-01-01

    We illustrate the Bayesian approach to data analysis using the newly developed statistical software program JASP. With JASP, researchers are able to take advantage of the benefits that the Bayesian framework has to offer in terms of parameter estimation and hypothesis testing. The Bayesian

  8. Studies Highlight Biodiesel's Benefits

    Science.gov (United States)

Golden, Colo., July 6, 1998 — Two new studies highlight the benefits of biodiesel in reducing overall air emissions. The Department of Energy's National Renewable Energy Laboratory (NREL) conducted both studies: An Overview of Biodiesel and Petroleum Diesel Life Cycles and Biodiesel Research Progress, 1992-1997. Biodiesel is a renewable diesel fuel substitute.

  9. Your Medicare Benefits

    Science.gov (United States)

… schedule a lung cancer screening counseling and shared decision-making visit with your doctor to discuss the benefits … when they're available in your MyMedicare.gov account. For more information, visit Medicare.gov for general information about Medicare …

  10. Development of a new methodology for quantifying nuclear safety culture

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of). Dept. of Nuclear Engineering

    2017-01-15

The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. SCIAM uses a safety culture impact index (SCII) to monitor the status of the safety culture of NPPs periodically, and it uses relative core damage frequency (RCDF) to present the impact of safety culture on the safety of NPPs. As a result of applying SCIAM to the reference plant (Kori 3), a standard for the healthy safety culture of the reference plant is suggested. SCIAM might contribute to improving the safety of NPPs (nuclear power plants) by monitoring the status of safety culture periodically and presenting the standard of healthy safety culture.

  11. Development of a new methodology for quantifying nuclear safety culture

    International Nuclear Information System (INIS)

    Han, Kiyoon; Jae, Moosung

    2017-01-01

The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. SCIAM uses a safety culture impact index (SCII) to monitor the status of the safety culture of NPPs periodically, and it uses relative core damage frequency (RCDF) to present the impact of safety culture on the safety of NPPs. As a result of applying SCIAM to the reference plant (Kori 3), a standard for the healthy safety culture of the reference plant is suggested. SCIAM might contribute to improving the safety of NPPs (nuclear power plants) by monitoring the status of safety culture periodically and presenting the standard of healthy safety culture.

  12. Sulfonylurea herbicides – methodological challenges in setting aquatic limit values

    DEFF Research Database (Denmark)

    Rosenkrantz, Rikke Tjørnhøj; Baun, Anders; Kusk, Kresten Ole

When limit values are derived according to the EU Water Framework Directive, the resulting Water Quality Standards (WQSs) are below the analytical quantification limit, making it difficult to verify compliance with the limit values. However, several methodological concerns may be raised in relation to the very low effect concentrations … and rimsulfuron. The following parameters were varied during testing: pH, exposure duration, temperature and light/dark cycle. Preliminary results show that a decrease in pH causes an increase in toxicity for all compounds. Exposure to a high concentration for 24 hours caused a reduction in growth rate …, raising the question of whether standard procedures are appropriate for setting limit values for SUs or if more detailed information should be gained by taking methodological considerations into account.

  13. The methodological defense of realism scrutinized.

    Science.gov (United States)

    Wray, K Brad

    2015-12-01

    I revisit an older defense of scientific realism, the methodological defense, a defense developed by both Popper and Feyerabend. The methodological defense of realism concerns the attitude of scientists, not philosophers of science. The methodological defense is as follows: a commitment to realism leads scientists to pursue the truth, which in turn is apt to put them in a better position to get at the truth. In contrast, anti-realists lack the tenacity required to develop a theory to its fullest. As a consequence, they are less likely to get at the truth. My aim is to show that the methodological defense is flawed. I argue that a commitment to realism does not always benefit science, and that there is reason to believe that a research community with both realists and anti-realists in it may be better suited to advancing science. A case study of the Copernican Revolution in astronomy supports this claim. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Methodologies of health impact assessment as part of an integrated approach to reduce effects of air pollution

    OpenAIRE

    Aunan, Kristin; Seip, Hans Martin

    1995-01-01

Quantification of average frequencies of health effects on a population level is an essential part of an integrated assessment of pollution effects. Epidemiological studies seem to provide the best basis for such estimates. This paper gives an introduction to a methodology for health impact assessment. It also gives results from selected parts of a case study in Hungary. This study is aimed at testing and improving the methodology for integrated assessment and focuses on energy production …

  15. Outline of cost-benefit analysis and a case study

    Science.gov (United States)

    Kellizy, A.

    1978-01-01

    The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
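    The core arithmetic such a simplified cost-benefit study rests on can be shown in a few lines: discount each year's costs and benefits to present value, then compare alternatives by net present value (NPV) and benefit-cost ratio. The cash flows and discount rate are illustrative only.

```python
def present_value(flows, rate):
    """Discount a list of yearly cash flows (year 0 first) to present value."""
    return sum(f / (1 + rate) ** year for year, f in enumerate(flows))

def appraise(costs, benefits, rate=0.07):
    """Summary measures for one alternative at the given discount rate."""
    pv_c = present_value(costs, rate)
    pv_b = present_value(benefits, rate)
    return {"NPV": pv_b - pv_c, "B/C ratio": pv_b / pv_c}

# Hypothetical comparison: two solar-cell options over a 5-year horizon
option_a = appraise(costs=[100, 10, 10, 10, 10], benefits=[0, 40, 40, 40, 40])
option_b = appraise(costs=[60, 15, 15, 15, 15], benefits=[0, 30, 30, 30, 30])
print("A:", option_a)
print("B:", option_b)
```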

  16. Pesticides residues in water treatment plant sludge: validation of analytical methodology using liquid chromatography coupled to Tandem mass spectrometry (LC-MS/MS)

    International Nuclear Information System (INIS)

    Moracci, Luiz Fernando Soares

    2008-01-01

The evolving scenario of Brazilian agriculture brings benefits to the population and demands technological advances in the field. New pesticides are constantly being introduced, encouraging scientific studies aimed at determining and evaluating their impacts on the population and on the environment. In this work, the sample evaluated was the sludge produced by a water treatment plant located in the Vale do Ribeira, Sao Paulo, Brazil. The technique used was reversed-phase liquid chromatography coupled to electrospray ionization tandem mass spectrometry. Compounds were first extracted from the matrix by liquid extraction. Development of the methodology required processing of the data in order to transform them into reliable information, drawing on the concepts of validation of chemical analysis. The evaluated parameters were selectivity, linearity, range, sensitivity, accuracy, precision, limit of detection, limit of quantification and robustness. The qualitative and quantitative results obtained were statistically treated and presented. The developed and validated methodology is simple. Even exploring the full sensitivity of the analytical technique, the target compounds were not detected in the sludge of the WTP. Possible explanations are that these compounds are present at very low concentrations, are degraded under the conditions of the water treatment process, or are not completely retained by the WTP. (author)

  17. Projected benefits of actinide partitioning

    International Nuclear Information System (INIS)

    Braun, C.; Goldstein, M.

    1976-05-01

    Possible benefits that could accrue from actinide separation and transmutations are presented. The time frame for implementing these processes is discussed and the expected benefits are qualitatively described. These benefits are provisionally quantified in a sample computation

  18. Social Security and Medicare Benefits

    Data.gov (United States)

    Social Security Administration — Cash benefits and rehabilitation benefits paid in each year from the Old-Age and Survivors Insurance, and Disability Insurance Trust Funds, and benefits paid from...

  19. A Bayesian approach for quantification of model uncertainty

    International Nuclear Information System (INIS)

    Park, Inseok; Amarchinta, Hemanth K.; Grandhi, Ramana V.

    2010-01-01

    In most engineering problems, more than one model can be created to represent an engineering system's behavior. Uncertainty is inevitably involved in selecting the best model from among the models that are possible. Uncertainty in model selection cannot be ignored, especially when the differences between the predictions of competing models are significant. In this research, a methodology is proposed to quantify model uncertainty using measured differences between experimental data and model outcomes under a Bayesian statistical framework. The adjustment factor approach is used to propagate model uncertainty into prediction of a system response. A nonlinear vibration system is used to demonstrate the processes for implementing the adjustment factor approach. Finally, the methodology is applied on the engineering benefits of a laser peening process, and a confidence band for residual stresses is established to indicate the reliability of model prediction.
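    A sketch of the adjustment factor approach under the Bayesian framework described: model probabilities are computed from agreement with an experimental measurement, and the probability-weighted spread of the model predictions adjusts the best model's prediction. All numbers are illustrative.

```python
import numpy as np

# Hypothetical predictions of a response (e.g., residual stress, MPa) from 3 models
y_model = np.array([310.0, 295.0, 330.0])
y_exp, sigma_exp = 305.0, 8.0                        # measurement and its noise

# Posterior model probabilities from a Gaussian likelihood of the
# measured differences, assuming equal prior weights for the models.
loglik = -0.5 * ((y_model - y_exp) / sigma_exp) ** 2
prob = np.exp(loglik - loglik.max())
prob /= prob.sum()

# Additive adjustment factor: shift the best model by the probability-weighted
# spread of the other models' predictions around it.
best = np.argmax(prob)
diff = y_model - y_model[best]
mean_adj = np.sum(prob * diff)
var_adj = np.sum(prob * diff**2) - mean_adj**2       # spread = uncertainty band

y_adjusted = y_model[best] + mean_adj
print(f"model probabilities: {prob.round(3)}")
print(f"adjusted prediction: {y_adjusted:.1f} ± {np.sqrt(var_adj):.1f}")
```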

  20. Case Study Research Methodology

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2011-01-01

    Full Text Available Commenting on the lack of case studies published in modern psychotherapy publications, the author reviews the strengths of case study methodology and responds to common criticisms, before providing a summary of types of case studies including clinical, experimental and naturalistic. Suggestions are included for developing systematic case studies and brief descriptions are given of a range of research resources relating to outcome and process measures. Examples of a pragmatic case study design and a hermeneutic single-case efficacy design are given and the paper concludes with some ethical considerations and an exhortation to the TA community to engage more widely in case study research.

  1. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
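    One standard result of the kind covered by such a text is the sample size needed to estimate a population mean to within a margin of error E at a given confidence level, under a normal approximation with known sigma. A minimal sketch:

        # n = ceil((z * sigma / E)^2) for a two-sided confidence level 1 - alpha
        import math
        from scipy.stats import norm

        def sample_size_mean(sigma, margin, alpha=0.05):
            z = norm.ppf(1 - alpha / 2)     # two-sided critical value
            return math.ceil((z * sigma / margin) ** 2)

        print(sample_size_mean(sigma=15.0, margin=2.0))   # 217 for sigma=15, E=2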

  2. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  3. Microphysics evolution and methodology

    International Nuclear Information System (INIS)

    Dionisio, J.S.

    1990-01-01

    A few general features of the evolution of microphysics and their relationship with the methodology of microphysics are briefly surveyed. Several pluri-disciplinary and interdisciplinary aspects of microphysics research are also discussed in the present scientific context. The need for an equilibrium between individual tendencies and the collective constraints required by team work, already formulated thirty years ago by Frederic Joliot, is particularly stressed in the present conjuncture of nuclear research, which favours very large team projects and discourages individual initiatives. The increasing importance of the science of science (due to its multiple social, economic and ecological aspects) and the stronger competition between national and international tendencies of scientific (and technical) cooperation are also discussed. (author)

  4. MIRD methodology; Metodologia MIRD

    Energy Technology Data Exchange (ETDEWEB)

    Rojo, Ana M [Autoridad Regulatoria Nuclear, Buenos Aires (Argentina); Gomez Parada, Ines [Sociedad Argentina de Radioproteccion, Buenos Aires (Argentina)

    2004-07-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in estimating the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (Nos. 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for calculating absorbed doses in different tissues is explained.
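    At its core, the MIRD schema computes the mean absorbed dose to a target organ as D(target) = sum over source organs of A~(source) * S(target <- source), where A~ is the cumulated activity and S is the dose per unit cumulated activity. The sketch below shows that arithmetic with entirely hypothetical values; real S factors come from the MIRD pamphlets.

        # Minimal sketch of the basic MIRD dose equation, hypothetical inputs
        cumulated_activity = {"liver": 1.2e6, "kidneys": 4.0e5}    # MBq*s, hypothetical
        s_to_target = {"liver": 3.0e-6, "kidneys": 8.0e-7}         # mGy per MBq*s, hypothetical

        dose_mGy = sum(cumulated_activity[src] * s_to_target[src]
                       for src in cumulated_activity)
        print(f"Absorbed dose to target organ: {dose_mGy:.2f} mGy")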

  5. Beam optimization: improving methodology

    International Nuclear Information System (INIS)

    Quinteiro, Guillermo F.

    2004-01-01

    Different optimization techniques commonly used in biology and food technology allow a systematic and complete analysis of response functions. In spite of the great interest in medical and nuclear physics in the problem of optimizing mixed beams, little attention has been given to sophisticated mathematical tools. Indeed, many techniques are perfectly suited to the typical problem of beam optimization. This article is intended as a guide to the use of two methods, namely Response Surface Methodology and Simplex, that are expected to speed up the optimization process and, at the same time, give more insight into the relationships among the dependent variables controlling the response
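    As a flavor of the Simplex method mentioned above, the sketch below minimizes a toy two-beam response function with the Nelder-Mead algorithm; the quadratic response surface is purely illustrative and is not a dosimetric model.

        # Hedged sketch: Simplex (Nelder-Mead) search over two beam weights
        import numpy as np
        from scipy.optimize import minimize

        def response(w):
            w1, w2 = w
            return (w1 - 0.6) ** 2 + (w2 - 0.3) ** 2 + 0.5 * w1 * w2   # toy surface

        result = minimize(response, x0=np.array([0.5, 0.5]), method="Nelder-Mead")
        print(result.x, result.fun)   # optimal weights under the toy model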

  6. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Editi

  7. Methodology of site studies

    International Nuclear Information System (INIS)

    Caries, J.C.; Hugon, J.; Grauby, A.

    1980-01-01

    This methodology consists of an essentially dynamic, predictive and follow-up analysis of the impact of discharges on all the environment compartments, whether natural or not, that play a part in the protection of man and his environment. It applies at two levels, namely: the choice of site, and the detailed study of the site selected. Two examples of its application are developed: at the site-choice level in the case of marine sites, and at the detailed-study level in the case of a riverside site [fr]

  8. Alternative pricing methodologies

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    With the increased interest in competitive market forces and growing recognition of the deficiencies in current practices, FERC and others are exploring alternatives to embedded cost pricing. A number of these alternatives are discussed in this chapter. Marketplace pricing, discussed briefly here, is the subject of the next chapter. Obviously, the pricing formula may combine several of these methodologies. One utility of which the authors are aware is seeking a price equal to the sum of embedded costs, opportunity costs, line losses, value of service, FERC's percentage adder formula and a contract service charge

  9. Comparison of economic evaluation methodology for the nuclear plant lifetime extension

    International Nuclear Information System (INIS)

    Song, T. H.; Jung, I. S.

    2003-01-01

    In connection with the economic evaluation of NPP lifetime management, there are several methodologies, such as present worth calculation, Levelized Unit Energy Cost (LUEC) calculation, and market benefit comparison. In this paper, the economic evaluation of NPP lifetime management was carried out using these three methodologies, and the results were compared across them. With these three methodologies, break-even points of the investment cost related to life extension of a nuclear power plant were calculated. The analysis showed that LUEC is more conservative than present worth calculation and that market benefit comparison is more conservative than LUEC; that is, market benefit comparison is the most conservative methodology, and future base-load demand is far more important than any other factor such as capacity factor, investment cost of life extension, and performance of the replacing power plant
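    For reference, LUEC is simply discounted lifetime costs divided by discounted lifetime generation. The sketch below shows the calculation with an invented discount rate and cash flows; it does not use the study's data.

        # Minimal LUEC sketch: lifetime costs over lifetime energy, both discounted
        import numpy as np

        rate = 0.07
        years = np.arange(1, 21)                        # 20 years of extended operation
        costs = np.full(20, 120e6); costs[0] += 500e6   # O&M plus up-front refurbishment ($)
        energy = np.full(20, 7.9e9)                     # kWh generated per year

        disc = (1 + rate) ** -years
        luec = (costs * disc).sum() / (energy * disc).sum()
        print(f"LUEC = {luec * 100:.2f} cents/kWh")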

  10. The benefits of defining "snacks".

    Science.gov (United States)

    Hess, Julie M; Slavin, Joanne L

    2018-04-18

    Whether eating a "snack" is considered a beneficial or detrimental behavior is largely based on how "snack" is defined. The term "snack food" tends to connote energy-dense, nutrient-poor foods high in nutrients to limit (sugar, sodium, and/or saturated fat) like cakes, cookies, chips and other salty snacks, and sugar-sweetened beverages. Eating a "snack food" is often conflated with eating a "snack," however, leading to an overall perception of snacks as a dietary negative. Yet the term "snack" can also refer simply to an eating occasion outside of breakfast, lunch, or dinner. With this definition, the evidence to support health benefits or detriments to eating a "snack" remains unclear, in part because relatively few well-designed studies that specifically focus on the impact of eating frequency on health have been conducted. Despite these inconsistencies and research gaps, in much of the nutrition literature, "snacking" is still referred to as detrimental to health. As discussed in this review, however, there are multiple factors that influence the health impacts of snacking, including the definition of "snack" itself, the motivation to snack, body mass index of snack eaters, and the food selected as a snack. Without a definition of "snack" and a body of research using methodologically rigorous protocols, determining the health impact of eating a "snack" will continue to elude the nutrition research community and prevent the development of evidence-based policies about snacking that support public health. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Wavelets in quantification of liver tumors in contrasted computed tomography images

    International Nuclear Information System (INIS)

    Rodrigues, Bruna T.; Alvarez, Matheus; Souza, Rafael T.F.; Miranda, Jose R.A.; Romeiro, Fernando G.; Pina, Diana R. de; Trindade, Andre Petean

    2012-01-01

    This paper presents an original methodology for liver tumor segmentation, based on the wavelet transform. A virtual phantom was constructed with the same mean and standard deviation of gray-level intensity as those presented by the measured liver tissue. The optimized algorithm had a sensitivity ranging from 0.81 to 0.83, with a specificity of 0.95, for differentiating hepatic tumors from normal tissues. We obtained a 96% agreement between the pixels segmented by an experienced radiologist and by the algorithm presented here. According to the results shown in this work, the algorithm is suitable for beginning tests of liver tumor quantification in retrospective surveys. (author)

  12. Processing and quantification of x-ray energy dispersive spectra in the Analytical Electron Microscope

    International Nuclear Information System (INIS)

    Zaluzec, N.J.

    1988-08-01

    Spectral processing in x-ray energy dispersive spectroscopy deals with the extraction of characteristic signals from experimental data. In this text, the four basic procedures for this methodology are reviewed and their limitations outlined. Quantification, on the other hand, deals with the interpretation of the information obtained from spectral processing. Here the limitations are for the most part instrumental in nature. The prospect of higher-voltage operation does not, in theory, present any new problems and may in fact prove to be more desirable, assuming that electron damage effects do not preclude analysis. 28 refs., 6 figs
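    One widely used quantification scheme for thin specimens in the analytical electron microscope is the Cliff-Lorimer ratio method, C_A/C_B = k_AB * I_A/I_B. It is named here only as an illustration of the interpretation step discussed above, and the intensities and k-factor below are hypothetical.

        # Hedged sketch of Cliff-Lorimer quantification for a binary specimen
        i_a, i_b = 15400.0, 8200.0   # background-subtracted characteristic intensities
        k_ab = 1.45                  # Cliff-Lorimer sensitivity factor, hypothetical

        ratio = k_ab * i_a / i_b     # C_A / C_B
        c_b = 1.0 / (1.0 + ratio)    # impose C_A + C_B = 1
        c_a = 1.0 - c_b
        print(f"C_A = {c_a:.3f}, C_B = {c_b:.3f}")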

  13. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  14. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  15. Benefit-cost assessment programs: Costa Rica case study

    International Nuclear Information System (INIS)

    Clark, A.L.; Trocki, L.K.

    1991-01-01

    An assessment of mineral potential, in terms of types and numbers of deposits, approximate location and associated tonnage and grades, is a valuable input to a nation's economic planning and mineral policy development. This study provides a methodology for applying benefit-cost analysis to mineral resource assessment programs, both to determine the cost effectiveness of resource assessments and to ascertain future benefits to the nation. In a case study of Costa Rica, the benefit-cost ratio of a resource assessment program was computed to be a minimum of 4:1 ($10.6 million to $2.5 million), not including the economic benefits accruing from the creation of 800 mining-sector and 1,200 support-services jobs. The benefit-cost ratio would be considerably higher if presently proposed revisions of mineral policy were implemented and benefits could be defined for Costa Rica

  16. Cost benefit analysis of power plant database integration

    International Nuclear Information System (INIS)

    Wilber, B.E.; Cimento, A.; Stuart, R.

    1988-01-01

    A cost benefit analysis of plant-wide data integration allows utility management to evaluate integration and automation benefits from an economic perspective. With this evaluation, the utility can determine both the quantitative and qualitative savings that can be expected from data integration. The cost benefit analysis is then a planning tool which helps the utility to develop a focused long-term implementation strategy that will yield significant near-term benefits. This paper presents a flexible cost benefit analysis methodology which is both simple to use and yields accurate, verifiable results. Included in this paper are a list of parameters to consider, a procedure for performing the cost savings analysis, and samples of this procedure when applied to a utility. A case study is presented involving a specific utility where this procedure was applied; the utility's uses of the cost-benefit analysis are also described
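    The screening arithmetic such a procedure produces usually reduces to discounted benefits against discounted costs. A minimal sketch, with cash flows invented for illustration rather than taken from the case study:

        # Net present value and benefit-cost ratio of a data integration project
        import numpy as np

        rate = 0.08
        years = np.arange(0, 6)
        costs = np.array([2.0e6, 0.3e6, 0.3e6, 0.3e6, 0.3e6, 0.3e6])   # hypothetical
        benefits = np.array([0.0, 1.2e6, 1.2e6, 1.2e6, 1.2e6, 1.2e6])  # hypothetical

        disc = (1 + rate) ** -years
        npv = ((benefits - costs) * disc).sum()
        bcr = (benefits * disc).sum() / (costs * disc).sum()
        print(f"NPV = ${npv / 1e6:.2f}M, benefit-cost ratio = {bcr:.2f}")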

  17. RHIC Data Correlation Methodology

    International Nuclear Information System (INIS)

    Michnoff, R.; D'Ottavio, T.; Hoff, L.; MacKay, W.; Satogata, T.

    1999-01-01

    A requirement for RHIC data plotting software and physics analysis is the correlation of data from all accelerator data gathering systems. Data correlation provides the capability for a user to request a plot of multiple data channels vs. time, and to make meaningful time-correlated data comparisons. The task of data correlation for RHIC requires careful consideration because data acquisition triggers are generated from various asynchronous sources including events from the RHIC Event Link, events from the two Beam Sync Links, and other unrelated clocks. In order to correlate data from asynchronous acquisition systems a common time reference is required. The RHIC data correlation methodology will allow all RHIC data to be converted to a common wall clock time, while still preserving native acquisition trigger information. A data correlation task force team, composed of the authors of this paper, has been formed to develop data correlation design details and provide guidelines for software developers. The overall data correlation methodology will be presented in this paper
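    The core idea, mapping each system's native trigger stamps onto one shared wall-clock axis while keeping the native information, can be sketched in a few lines. The tick rates and epoch below are hypothetical placeholders, not actual RHIC parameters.

        # Hedged sketch: map native tick counts to a common wall-clock time base
        def to_wall_clock(ticks, epoch_s, tick_hz):
            """Convert native trigger tick counts to seconds since a shared epoch."""
            return [epoch_s + t / tick_hz for t in ticks]

        event_link = to_wall_clock([120, 240, 360], epoch_s=1000.0, tick_hz=720.0)
        beam_sync = to_wall_clock([9000, 18000], epoch_s=1000.0, tick_hz=78000.0)
        print(event_link, beam_sync)   # both channels now share one time axis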

  18. Intelligent systems engineering methodology

    Science.gov (United States)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  19. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of specified analysis methodologies for the performance-related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation modes suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the control logic developed for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of the PRDBEs chosen based on each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis

  20. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)

  1. Insights into PRA methodologies

    International Nuclear Information System (INIS)

    Gallagher, D.; Lofgren, E.; Atefi, B.; Liner, R.; Blond, R.; Amico, P.

    1984-08-01

    Probabilistic Risk Assessments (PRAs) for six nuclear power plants were examined to gain insight into how the choice of analytical methods can affect the results of PRAs. The PRA scope considered was limited to internally initiated accident sequences through core melt. For twenty methodological topic areas, a baseline or minimal methodology was specified. The choice of methods for each topic in the six PRAs was characterized in terms of the incremental level of effort above the baseline. A higher level of effort generally reflects a higher level of detail or a higher degree of sophistication in the analytical approach to a particular topic area. The impact on results was measured in terms of how additional effort beyond the baseline level changed the relative importance and ordering of dominant accident sequences compared to what would have been observed had methods corresponding to the baseline level of effort been employed. This measure of impact is a more useful indicator of how methods affect perceptions of plant vulnerabilities than changes in core melt frequency would be. However, the change in core melt frequency was used as a secondary measure of impact for nine topics where availability of information permitted. Results are presented primarily in the form of effort-impact matrices for each of the twenty topic areas. A suggested effort-impact profile for future PRAs is presented

  2. Ecological impact study methodology for hydrotechnical projects

    International Nuclear Information System (INIS)

    Manoliu, Mihai; Toculescu, Razvan

    1993-01-01

    Besides the expected benefits, hydrotechnical projects may entail unfavorable effects on the hydrological regime, environment, health and living conditions of the population. Rational water resource management should take into consideration both the favorable and unfavorable effects. This implies the assessment of the socio-economic and environmental impacts of the changes in the hydrological regime. The paper proposes a methodology for carrying out impact studies of hydrotechnical projects. The results of the work are presented graphically on the basis of composite programming. A summary of the mathematical methods involved in impact study design is also presented. (authors)

  3. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    Bachelor thesis "Scrum methodology in banking environment" is focused on one of the agile methodologies, Scrum, and describes its use in a banking environment. Its main goal is to introduce the Scrum methodology, outline a real software development project placed in a bank through a case study, address the problems of the project, propose solutions to those problems, and identify anomalies of Scrum in software development constrained by the banking environmen...

  4. Experimental Economics: Some Methodological Notes

    OpenAIRE

    Fiore, Annamaria

    2009-01-01

    The aim of this work is to present, in a self-contained paper, some methodological aspects as they are received in the current experimental literature. The purpose has been to make a critical review of some very influential papers dealing with methodological issues. In other words, the idea is to have a single paper where people first approaching experimental economics can find summarised (some of) the most important methodological issues. In particular, the focus is on some methodological prac...

  5. Short-term Consumer Benefits of Dynamic Pricing

    OpenAIRE

    Dupont, Benjamin; De Jonghe, Cedric; Kessels, Kris; Belmans, Ronnie

    2011-01-01

    Consumer benefits of dynamic pricing depend on a variety of factors. Consumer characteristics and climatic circumstances differ widely, which forces a regional comparison. This paper presents a general overview of demand response programs and focuses on the short-term benefits of dynamic pricing for an average Flemish residential consumer. It presents a methodology for developing a cost-reflective dynamic pricing program and for estimating short-term bill savings. Participating in a dynamic pricing p...

  6. Uncertainty Quantification in Alchemical Free Energy Methods.

    Science.gov (United States)

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-05-02

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation-an ensemble of independent MD simulations-which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of time of the molecular dynamics simulations performed.
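    The ensemble recipe itself is simple: run N independent replicas and report the ensemble mean with its standard error. The sketch below uses invented binding free energy values to show the bookkeeping.

        # Minimal sketch of ensemble-based uncertainty for a free energy estimate
        import numpy as np

        replicas = np.array([-7.9, -8.3, -8.1, -7.6, -8.4])   # kcal/mol, hypothetical
        mean = replicas.mean()
        sem = replicas.std(ddof=1) / np.sqrt(len(replicas))   # standard error of the mean
        print(f"dG = {mean:.2f} +/- {1.96 * sem:.2f} kcal/mol (95% CI)")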

  7. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of the quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel would be in a coolable configuration in the lower drywell is presented, and the assessment of the type of core/concrete attack that would occur is also analysed. The evaluation of ex-vessel debris coolability, represented by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event are considered. The DET headings selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris would be coolable ex-vessel are also discussed

  8. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1986-01-01

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  9. Natural gas benefits

    International Nuclear Information System (INIS)

    1999-01-01

    The General Auditor in the Netherlands studied the natural gas policy in the Netherlands, as has been executed in the past decades, in the period 1997-1999. The purpose of the study is to inform the Dutch parliament on the planning and the backgrounds of the natural gas policy and on the policy risks with respect to the benefits for the Dutch State, taking into account the developments in the policy environment. The final conclusion is that the proposed liberalization of the national natural gas market will result in a considerable deprivation of income for the State in case the benefit policy is not adjusted. This report includes a reaction of the Dutch Minister of Economic Affairs and an afterword of the General Auditor. In the appendix an outline is given of the natural gas policy

  10. Harnessing natural ventilation benefits.

    Science.gov (United States)

    O'Leary, John

    2013-04-01

    Making sure that a healthcare establishment has a good supply of clean fresh air is an important factor in keeping patients, staff, and visitors, free from the negative effects of CO2 and other contaminants. John O'Leary of Trend Controls, a major international supplier of building energy management solutions (BEMS), examines the growing use of natural ventilation, and the health, energy-saving, and financial benefits, that it offers.

  11. Uncertainty quantification for hyperbolic and kinetic equations

    CERN Document Server

    Pareschi, Lorenzo

    2017-01-01

    This book explores recent advances in uncertainty quantification for hyperbolic, kinetic, and related problems. The contributions address a range of different aspects, including: polynomial chaos expansions, perturbation methods, multi-level Monte Carlo methods, importance sampling, and moment methods. The interest in these topics is rapidly growing, as their applications have now expanded to many areas in engineering, physics, biology and the social sciences. Accordingly, the book provides the scientific community with a topical overview of the latest research efforts.

  12. Quantification of heterogeneity observed in medical images

    OpenAIRE

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    Background: There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging mod...
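    As one example of the many candidate measures (and only an illustration, since the paper's point is that no standardized measure yet exists), the Shannon entropy of the grayscale histogram is often used as a simple heterogeneity index:

        # Hedged sketch: histogram entropy of a synthetic region of interest
        import numpy as np

        rng = np.random.default_rng(0)
        roi = rng.integers(0, 256, size=(64, 64))      # stand-in for a tumor ROI

        hist, _ = np.histogram(roi, bins=32, range=(0, 256))
        p = hist / hist.sum()
        p = p[p > 0]
        entropy = -(p * np.log2(p)).sum()              # higher = more heterogeneous
        print(f"histogram entropy = {entropy:.2f} bits")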

  13. Benefits of transmission interconnections

    International Nuclear Information System (INIS)

    Lyons, D.

    2006-01-01

    The benefits of new power transmission interconnections from Alberta were discussed with reference to the challenges and measures needed to move forward. Alberta's electricity system has had a long period of sustained growth in generation and demand, and this trend is expected to continue. However, no new interconnections have been built since 1985 because the transmission network has not expanded in step with the growth in demand. As such, Alberta remains weakly interconnected with the rest of the western region. The benefits of stronger transmission interconnections include improved reliability, long-term generation capability, hydrothermal synergies, a more competitive market, system efficiencies and fuel diversity. It was noted that the more difficult challenges are not technical; rather, they lie in finding an appropriate business model that recognizes different market structures. It was emphasized that additional interconnections are worthwhile and will require significant collaboration among market participants and governments. It was concluded that interties enable resource optimization between systems and that their benefits far exceed their costs. tabs., figs

  14. Use of the SSHAC methodology within regulated environments: Cost-effective application for seismic characterization at multiple sites

    International Nuclear Information System (INIS)

    Coppersmith, Kevin J.; Bommer, Julian J.

    2012-01-01

    Highlights: ► SSHAC processes provide high levels of regulatory assurance in hazard assessments for purposes of licensing and safety review. ► SSHAC projects provide structure to the evaluation of available data, models, and methods for building hazard input models. ► Experience on several nuclear projects in the past 15 years leads to the identification of key essential procedural steps. ► Conducting a regional SSHAC Level 3 study, followed by Level 2 site-specific studies can be time and cost effective. - Abstract: Essential elements of license applications and safety reviews for nuclear facilities are quantifications of earthquake and other natural hazards. A Senior Seismic Hazard Analysis Committee (SSHAC) Level 3 or 4 process provides regulatory assurance that the hazard assessment considers all data and models proposed by members of the technical community and the associated uncertainties have been properly quantified. The SSHAC process has been endorsed as an acceptable hazard assessment methodology in US NRC regulatory guidance. Where hazard studies are required for multiple sites, regional SSHAC Level 3 or 4 studies followed by site-specific Level 2 refinements can provide major benefits in cost and duration.

  15. Need for a marginal methodology in assessing natural gas system methane emissions in response to incremental consumption.

    Science.gov (United States)

    Mac Kinnon, Michael; Heydarzadeh, Zahra; Doan, Quy; Ngo, Cuong; Reed, Jeff; Brouwer, Jacob

    2018-05-17

    Accurate quantification of methane emissions from the natural gas system is important for establishing greenhouse gas inventories and understanding cause and effect for reducing emissions. Current carbon intensity methods generally assume methane emissions are proportional to gas throughput so that increases in gas consumption yield linear increases in emitted methane. However, emissions sources are diverse and many are not proportional to throughput. Insights into the causal drivers of system methane emissions, and how system-wide changes affect such drivers are required. The development of a novel cause-based methodology to assess marginal methane emissions per unit of fuel consumed is introduced. The carbon intensities of technologies consuming natural gas are critical metrics currently used in policy decisions for reaching environmental goals. For example, the low-carbon fuel standard in California uses carbon intensity to determine incentives provided. Current methods generally assume methane emissions from the natural gas system are completely proportional to throughput. The proposed cause-based marginal emissions method will provide a better understanding of the actual drivers of emissions to support development of more effective mitigation measures. Additionally, increasing the accuracy of carbon intensity calculations supports the development of policies that can maximize the environmental benefits of alternative fuels, including reducing greenhouse gas emissions.
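    The distinction at stake can be shown with a toy decomposition E(Q) = E_fixed + m*Q, in which part of the emissions does not scale with throughput. Throughput-proportional accounting reports the average intensity E(Q)/Q, while the marginal intensity of one extra unit of gas is just m. The numbers below are illustrative only.

        # Minimal sketch: average vs marginal methane intensity
        fixed_emissions = 500.0    # tCH4/yr from throughput-independent sources, hypothetical
        marginal_rate = 0.002      # tCH4 per additional unit of gas delivered, hypothetical

        def total_emissions(q):
            return fixed_emissions + marginal_rate * q

        q = 1_000_000.0
        print(total_emissions(q) / q, marginal_rate)   # 0.0025 (average) vs 0.002 (marginal)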

  16. Methodological challenges for the evaluation of clinical effectiveness in the context of accelerated regulatory approval: an overview.

    Science.gov (United States)

    Woolacott, Nerys; Corbett, Mark; Jones-Diette, Julie; Hodgson, Robert

    2017-10-01

    Regulatory authorities are approving innovative therapies with limited evidence. Although this level of data is sufficient for the regulator to establish an acceptable risk-benefit balance, it is problematic for downstream health technology assessment, where assessment of cost-effectiveness requires reliable estimates of effectiveness relative to existing clinical practice. Some key issues associated with a limited evidence base include using data from nonrandomized studies, from small single-arm trials, or from single-center trials, and using surrogate end points. We examined these methodological challenges through a pragmatic review of the available literature. Methods to adjust nonrandomized studies for confounding are imperfect. The relative treatment effect generated from single-arm trials is uncertain and may be optimistic. Single-center trial results may not be generalizable. Surrogate end points, on average, overestimate treatment effects. Current methods for analyzing such data are limited, and effectiveness claims based on these suboptimal forms of evidence are likely to be subject to significant uncertainty. Assessments of cost-effectiveness, based on the modeling of such data, are likely to be subject to considerable uncertainty. This uncertainty must not be underestimated by decision makers: methods for its quantification are required and schemes to protect payers from the cost of uncertainty should be implemented. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  17. Artifacts Quantification of Metal Implants in MRI

    Science.gov (United States)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image gradient based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. The artifact is then quantified in terms of its extent, as an image area percentage, by an automated cross entropy thresholding method. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman's rho = 0.62 and 0.802 in the case of titanium and stainless steel implants). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
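    A minimal sketch of the gradient-based idea follows: compute the image gradient magnitude, threshold it, and report the flagged area as a percentage. A fixed percentile threshold stands in here for the automated cross entropy thresholding used in the study, and the image is synthetic.

        # Hedged sketch: artifact extent as an image area percentage
        import numpy as np

        rng = np.random.default_rng(1)
        img = rng.normal(100.0, 5.0, size=(128, 128))
        img[40:60, 40:60] = 0.0                     # synthetic signal void near an implant

        gy, gx = np.gradient(img)
        gmag = np.hypot(gx, gy)                     # image gradient magnitude
        mask = gmag > np.percentile(gmag, 99)       # stand-in for cross entropy threshold
        print(f"artifact extent = {100.0 * mask.mean():.2f}% of image area")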

  18. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

    The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics and probability theory, psychology, and elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which has been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain, primarily to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are and always will be a forecast for the future

  19. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)

  20. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Full Text Available Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is placed on the development of rapid data-collection techniques, the difference between consumers and trained panels is becoming blurred, and the role of sensory methodologists is to prepare evaluation procedures by which a lay panel (consumers) can achieve results identical to those of a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanisms are easily understandable and whose results are easily interpretable. This paper deals with the mapping of marginal methods used in sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  1. Adaptive polynomial chaos techniques for uncertainty quantification of a gas cooled fast reactor transient

    International Nuclear Information System (INIS)

    Perko, Z.; Gilli, L.; Lathouwers, D.; Kloosterman, J. L.

    2013-01-01

    Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years however polynomial chaos expansion has become a popular alternative providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a Gas Cooled Fast Reactor transient. Comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is proved to be advantageous both from an accuracy and a computational point of view. As a demonstration the uncertainty quantification of a 50% loss of flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large scale problems. (authors)
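    To make the polynomial chaos idea concrete, the sketch below fits a non-intrusive PCE for a scalar model with one standard-normal input: Hermite coefficients are estimated by least squares on model samples, and the mean and variance are read off the coefficients. The toy model stands in for an expensive code such as CATHARE; the paper's adaptive sparse-grid machinery is far more elaborate.

        # Minimal non-intrusive PCE sketch with probabilists' Hermite polynomials
        import math
        import numpy as np
        from numpy.polynomial.hermite_e import hermevander

        def model(x):                       # toy stand-in for the real simulator
            return np.exp(0.3 * x) + 0.1 * x ** 2

        rng = np.random.default_rng(2)
        xi = rng.standard_normal(200)       # samples of the uncertain input
        order = 4
        V = hermevander(xi, order)          # basis He_0..He_4 evaluated at the samples
        coef, *_ = np.linalg.lstsq(V, model(xi), rcond=None)

        norms = np.array([math.factorial(k) for k in range(order + 1)])  # E[He_k^2] = k!
        mean = coef[0]
        var = (coef[1:] ** 2 * norms[1:]).sum()
        print(f"PCE mean = {mean:.4f}, variance = {var:.4f}")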

  2. Quantification of organic acids in beer by nuclear magnetic resonance (NMR)-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, J.E.A. [CICECO-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Erny, G.L. [CESAM - Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Barros, A.S. [QOPNAA-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Esteves, V.I. [CESAM - Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Brandao, T.; Ferreira, A.A. [UNICER, Bebidas de Portugal, Leca do Balio, 4466-955 S. Mamede de Infesta (Portugal); Cabrita, E. [Department of Chemistry, New University of Lisbon, 2825-114 Caparica (Portugal); Gil, A.M., E-mail: agil@ua.pt [CICECO-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal)

    2010-08-03

    The organic acids present in beer provide important information on the product's quality and history, determining organoleptic properties and being useful indicators of fermentation performance. NMR spectroscopy may be used for rapid quantification of organic acids in beer and different NMR-based methodologies are hereby compared for the six main acids found in beer (acetic, citric, lactic, malic, pyruvic and succinic). The use of partial least squares (PLS) regression enables faster quantification, compared to traditional integration methods, and the performance of PLS models built using different reference methods (capillary electrophoresis (CE), both with direct and indirect UV detection, and enzymatic assays) was investigated. The best multivariate models were obtained using CE/indirect detection and enzymatic assays as reference and their response was compared with NMR integration, either using an internal reference or an electrical reference signal (Electronic REference To access In vivo Concentrations, ERETIC). NMR integration results generally agree with those obtained by PLS, with some overestimation for malic and pyruvic acids, probably due to peak overlap and subsequent integral errors, and an apparent relative underestimation for citric acid. Overall, these results make the PLS-NMR method an interesting choice for organic acid quantification in beer.
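    The chemometric step can be sketched as a PLS regression mapping spectra to reference concentrations. The 'spectra' below are synthetic three-point signals; in practice X would hold NMR intensities and y the CE or enzymatic reference values.

        # Hedged sketch of PLS calibration, synthetic data only
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(3)
        conc = rng.uniform(50, 300, size=40)                # mg/L, hypothetical
        signal = np.outer(conc, np.array([1.0, 0.6, 0.2]))  # concentration-driven peaks
        X = signal + rng.normal(0, 5, size=signal.shape)    # noisy 'spectra'

        pls = PLSRegression(n_components=2)
        pls.fit(X, conc)
        print(pls.predict(X[:3]).ravel(), conc[:3])         # predicted vs reference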

  3. Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy

    Science.gov (United States)

    Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.

    2018-02-01

    In this study, a collaborative contest focused on LIBS data processing has been conducted in an original way since the participants did not share the same samples to be analyzed on their own LIBS experiments but a set of LIBS spectra obtained from one single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. Then, a parametric study was conducted to investigate the influence of each step during the data processing. This study was based on several analytical figures of merit such as the determination coefficient, uncertainty, limit of quantification and prediction ability (i.e., trueness). Then, it was possible to interpret the results provided by the participants, emphasizing the fact that the type of data extraction, baseline modeling as well as the calibration model play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure with the aim of optimizing the methodological steps toward the standardization of LIBS.

  4. Quantification of organic acids in beer by nuclear magnetic resonance (NMR)-based methods

    International Nuclear Information System (INIS)

    Rodrigues, J.E.A.; Erny, G.L.; Barros, A.S.; Esteves, V.I.; Brandao, T.; Ferreira, A.A.; Cabrita, E.; Gil, A.M.

    2010-01-01

    The organic acids present in beer provide important information on the product's quality and history, determining organoleptic properties and being useful indicators of fermentation performance. NMR spectroscopy may be used for rapid quantification of organic acids in beer and different NMR-based methodologies are hereby compared for the six main acids found in beer (acetic, citric, lactic, malic, pyruvic and succinic). The use of partial least squares (PLS) regression enables faster quantification, compared to traditional integration methods, and the performance of PLS models built using different reference methods (capillary electrophoresis (CE), both with direct and indirect UV detection, and enzymatic assays) was investigated. The best multivariate models were obtained using CE/indirect detection and enzymatic assays as reference and their response was compared with NMR integration, either using an internal reference or an electrical reference signal (Electronic REference To access In vivo Concentrations, ERETIC). NMR integration results generally agree with those obtained by PLS, with some overestimation for malic and pyruvic acids, probably due to peak overlap and subsequent integral errors, and an apparent relative underestimation for citric acid. Overall, these results make the PLS-NMR method an interesting choice for organic acid quantification in beer.

  5. MAXIMIZING THE BENEFITS OF ERP SYSTEMS

    Directory of Open Access Journals (Sweden)

    Paulo André da Conceição Menezes

    2010-04-01

    Full Text Available The ERP (Enterprise Resource Planning) systems have been consolidated in companies of different sizes and sectors, allowing their real benefits to be definitively evaluated. In this study, several interactions have been studied across different phases, such as the strategic priorities and strategic planning defined as ERP Strategy; the business process review and ERP selection in the pre-implementation phase; project management and ERP adaptation in the implementation phase; and ERP revision and integration efforts in the post-implementation phase. Through rigorous use of case study methodology, this research led to the development and testing of a framework for maximizing the benefits of ERP systems, and seeks to contribute to the generation of ERP initiatives that optimize their performance.

  6. Safety class methodology

    International Nuclear Information System (INIS)

    Donner, E.B.; Low, J.M.; Lux, C.R.

    1992-01-01

    DOE Order 6430.1A, General Design Criteria (GDC), requires that DOE facilities be evaluated with respect to "safety class items." Although the GDC defines safety class items, it does not provide a methodology for selecting them. The methodology described in this paper was developed to assure that safety class items (SCIs) at the Savannah River Site (SRS) are selected in a consistent and technically defensible manner. Safety class items are those in the highest of four categories determined to be of special importance to nuclear safety and merit appropriately higher-quality design, fabrication, and industrial test standards and codes. The identification of safety class items is approached using a cascading strategy that begins at the "safety function" level (i.e., a cooling function, ventilation function, etc.) and proceeds down to the system, component, or structure level; thus, the items that are required to support a safety function are SCIs. The basic steps in this procedure apply to the determination of SCIs both for new project activities and for operating facilities. The GDC lists six characteristics of SCIs to be considered as a starting point for safety item classification: 1. Those items whose failure would produce exposure consequences that would exceed the guidelines in Section 1300-1.4, "Guidance on Limiting Exposure of the Public," at the site boundary or nearest point of public access. 2. Those items required to maintain operating parameters within the safety limits specified in the Operational Safety Requirements during normal operations and anticipated operational occurrences. 3. Those items required for nuclear criticality safety. 4. Those items required to monitor the release of radioactive material to the environment during and after a Design Basis Accident. 5. Those items required to achieve and maintain the facility in a safe shutdown condition. 6. Those items that control the Safety Class Items listed above

  7. Quantification and Negation in Event Semantics

    Directory of Open Access Journals (Sweden)

    Lucas Champollion

    2010-12-01

    Full Text Available Recently, it has been claimed that event semantics does not go well together with quantification, especially if one rejects syntactic, LF-based approaches to quantifier scope. This paper shows that such fears are unfounded, by presenting a simple, variable-free framework which combines a Neo-Davidsonian event semantics with a type-shifting based account of quantifier scope. The main innovation is that the event variable is bound inside the verbal denotation, rather than at sentence level by existential closure. Quantifiers can then be interpreted in situ. The resulting framework combines the strengths of event semantics and type-shifting accounts of quantifiers and thus does not force the semanticist to posit either a default underlying word order or a syntactic LF-style level. It is therefore well suited for applications to languages where word order is free and quantifier scope is determined by surface order. As an additional benefit, the system leads to a straightforward account of negation, which has also been claimed to be problematic for event-based frameworks.

  8. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  9. Millennial Expectations and Constructivist Methodologies: Their Corresponding Characteristics and Alignment

    Science.gov (United States)

    Carter, Timothy L.

    2008-01-01

    In recent years, much emphasis has been placed on constructivist methodologies and their potential benefit for learners of various ages (Brandt & Perkins, 2000; Brooks, 1990). Although certain aspects of the constructivist paradigm have replaced several aspects of the behaviorist paradigm for a large contingency of stakeholders (particularly,…

  10. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  11. Evolution of aging assessment methodologies

    International Nuclear Information System (INIS)

    McCrea, L.; Dam, R.; Gold, R.

    2011-01-01

    Under the influence of organizations like the IAEA and INPO, the expectations of the regulator and plant operators alike are driving the evolution of aging assessment methodologies. The key result is that these assessments need to be executed more efficiently while supporting risk-informed thinking within a living process. Some recent trends impacting aging assessments include new standards from the regulator requiring more frequent updates of aging assessments (RD-334), and broader component coverage driven by equipment reliability program demands (INPO AP-913). These trends point to the need to be able to do aging assessment more efficiently, and to manage the configuration. Some of the challenges include increasing efficiency while maintaining completeness and minimizing error; employing a systematic, well-defined approach while maintaining the flexibility to apply the right level of effort to achieve desired results; and, in particular, assuring that Aging Related Degradation Mechanisms (ARDMs) are sufficiently addressed. Meeting these needs creates a natural synergy with the Preventive Maintenance living program and therefore lends itself to a more integrated approach. To support this program, the SYSTMS™ software has been enhanced to accommodate the various facets of an integrated program while meeting the needs described above. The systematic processes in SYSTMS are built with the vision of supporting risk-informed decision making as part of a larger risk-based functional tools suite. This paper intends to show how utilities can benefit from the cost savings associated with increased assessment efficiency, and from utilizing Candu Energy Inc.'s CANDU-specific knowledge base and experience in aging assessment to get it right the first time. (author)

  12. Evolution of aging assessment methodologies

    Energy Technology Data Exchange (ETDEWEB)

    McCrea, L.; Dam, R.; Gold, R. [Candu Energy Inc., Mississauga, Ontario (Canada)

    2011-07-01

    Under the influence of organizations like the IAEA and INPO, the expectations of the regulator and plant operators alike are driving the evolution of aging assessment methodologies. The key result is that these assessments need to be executed more efficiently while supporting risk-informed thinking within a living process. Some recent trends impacting aging assessments include new standards from the regulator requiring more frequent updates of aging assessments (RD-334), and broader component coverage driven by equipment reliability program demands (INPO AP-913). These trends point to the need to be able to do aging assessment more efficiently, and to manage the configuration. Some of the challenges include increasing efficiency while maintaining completeness and minimizing error; employing a systematic, well-defined approach while maintaining the flexibility to apply the right level of effort to achieve desired results; and, in particular, assuring that Aging Related Degradation Mechanisms (ARDMs) are sufficiently addressed. Meeting these needs creates a natural synergy with the Preventive Maintenance living program and therefore lends itself to a more integrated approach. To support this program, the SYSTMS™ software has been enhanced to accommodate the various facets of an integrated program while meeting the needs described above. The systematic processes in SYSTMS are built with the vision of supporting risk-informed decision making as part of a larger risk-based functional tools suite. This paper intends to show how utilities can benefit from the cost savings associated with increased assessment efficiency, and from utilizing Candu Energy Inc.'s CANDU-specific knowledge base and experience in aging assessment to get it right the first time. (author)

  13. Regional issue identification and assessment: study methodology. First annual report

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    The overall assessment methodologies and models utilized for the first project under the Regional Issue Identification and Assessment (RIIA) program are described. Detailed descriptions are given of the methodologies used by lead laboratories for the quantification of the impacts of an energy scenario on one or more media (e.g., air, water, land, human and ecology), and by all laboratories to assess the regional impacts on all media. The research and assessments reflected in this document were performed by the following national laboratories: Argonne National Laboratory; Brookhaven National Laboratory; Lawrence Berkeley Laboratory; Los Alamos Scientific Laboratory; Oak Ridge National Laboratory; and Pacific Northwest Laboratory. This report contains five chapters. Chapter 1 briefly describes the overall study methodology and introduces the technical participants. Chapter 2 is a summary of the energy policy scenario selected for the RIIA I study and Chapter 3 describes how this scenario was translated into a county-level siting pattern of energy development. The fourth chapter is a detailed description of the individual methodologies used to quantify the environmental and socioeconomic impacts of the scenario while Chapter 5 describes how these impacts were translated into comprehensive regional assessments for each Federal Region.

  14. Benefiting through partnering

    International Nuclear Information System (INIS)

    Carr, T.J.

    2000-01-01

    As a consequence of dramatic changes in the world market in nuclear services over the last decade, BNFL has embarked on a comprehensive strategic review of its business. Central to this review has been the need for the company to achieve cost reduction and improved efficiency in all aspects of its business. An area where substantial benefits can be gained is in improved efficiency in the discharge of the capital expenditure programme. This paper focuses on the opportunity of profiting through partnering in capital project delivery. (author)

  15. Uncertainty of a detected spatial cluster in 1D: quantification and visualization

    KAUST Repository

    Lee, Junho; Gangnon, Ronald E.; Zhu, Jun; Liang, Jingjing

    2017-01-01

    Spatial cluster detection is an important problem in a variety of scientific disciplines such as environmental sciences, epidemiology and sociology. However, there appears to be very limited statistical methodology for quantifying the uncertainty of a detected cluster. In this paper, we develop a new method for the quantification and visualization of uncertainty associated with a detected cluster. Our approach is defining a confidence set for the true cluster and visualizing the confidence set, based on the maximum likelihood, in time or in one-dimensional space. We evaluate the pivotal property of the statistic used to construct the confidence set and the coverage rate for the true cluster via empirical distributions. For illustration, our methodology is applied to both simulated data and an Alaska boreal forest dataset. Copyright © 2017 John Wiley & Sons, Ltd.
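
    The construction lends itself to a compact numerical illustration (a schematic numpy sketch of the idea, not the authors' procedure; the Poisson scan model and the fixed cutoff are assumptions, whereas the paper calibrates the cutoff from empirical distributions):

        import numpy as np

        def poisson_loglik(counts, inside):
            """Profile log-likelihood of a two-rate Poisson model:
            one rate inside the candidate cluster, one outside."""
            ll = 0.0
            for mask in (inside, ~inside):
                c, n = counts[mask].sum(), mask.sum()
                if n > 0 and c > 0:
                    ll += c * np.log(c / n) - c
            return ll

        def cluster_confidence_set(counts, cutoff=3.0):
            """ML cluster plus every 1D interval within `cutoff`
            log-likelihood units of it (a schematic confidence set)."""
            m = len(counts)
            scores = {}
            for i in range(m):
                for j in range(i + 1, m + 1):
                    inside = np.zeros(m, dtype=bool)
                    inside[i:j] = True
                    scores[(i, j)] = poisson_loglik(counts, inside)
            best = max(scores, key=scores.get)
            conf = [iv for iv, s in scores.items() if scores[best] - s <= cutoff]
            return best, conf

        counts = np.array([2, 1, 3, 9, 11, 8, 2, 1, 2, 1])  # toy 1D counts
        best, conf = cluster_confidence_set(counts)
        print("ML cluster:", best, "| intervals in confidence set:", len(conf))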

  16. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    Science.gov (United States)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics - a qualitatively based modelling approach - as a possible analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  17. Uncertainty of a detected spatial cluster in 1D: quantification and visualization

    KAUST Repository

    Lee, Junho

    2017-10-19

    Spatial cluster detection is an important problem in a variety of scientific disciplines such as environmental sciences, epidemiology and sociology. However, there appears to be very limited statistical methodology for quantifying the uncertainty of a detected cluster. In this paper, we develop a new method for the quantification and visualization of uncertainty associated with a detected cluster. Our approach is defining a confidence set for the true cluster and visualizing the confidence set, based on the maximum likelihood, in time or in one-dimensional space. We evaluate the pivotal property of the statistic used to construct the confidence set and the coverage rate for the true cluster via empirical distributions. For illustration, our methodology is applied to both simulated data and an Alaska boreal forest dataset. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Can the CFO Trust the FX Exposure Quantification from a Stock Market Approach?

    DEFF Research Database (Denmark)

    Aabo, Tom; Brodin, Danielle

    This study examines the sensitivity of detected exchange rate exposures at the firm specific level to changes in methodological choices using a traditional two factor stock market approach for exposure quantification. We primarily focus on two methodological choices: the choice of market index...... and the choice of observation frequency. We investigate to which extent the detected exchange rate exposures for a given firm can be confirmed when the choice of market index and/or the choice of observation frequency are changed. Applying our sensitivity analysis to Scandinavian non-financial firms, we...... thirds of the number of detected exposures using weekly data and 2) there is no economic rationale that the detected exposures at the firm-specific level should change when going from the use of weekly data to the use of monthly data. In relation to a change in the choice of market index, we find...
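
    The two-factor stock market approach the study builds on can be sketched as follows (a minimal numpy implementation of the standard Jorion-style regression; the authors' exact estimation choices are not stated in the abstract). The firm's stock return is regressed on a market index return and an exchange-rate change, and the FX coefficient is tested for significance at each observation frequency:

        import numpy as np

        def fx_exposure(stock_ret, market_ret, fx_ret):
            """Two-factor market model r_t = a + b*r_m,t + g*fx_t + e_t.
            Returns the exposure estimate g and its t-statistic."""
            X = np.column_stack([np.ones_like(market_ret), market_ret, fx_ret])
            beta, *_ = np.linalg.lstsq(X, stock_ret, rcond=None)
            resid = stock_ret - X @ beta
            sigma2 = resid @ resid / (len(stock_ret) - X.shape[1])
            cov = sigma2 * np.linalg.inv(X.T @ X)
            return beta[2], beta[2] / np.sqrt(cov[2, 2])

        rng = np.random.default_rng(0)
        n = 260                                  # roughly five years, weekly
        market = rng.normal(0, 0.02, n)
        fx = rng.normal(0, 0.01, n)
        stock = 0.8 * market + 0.5 * fx + rng.normal(0, 0.02, n)
        print("exposure %.2f, t = %.1f" % fx_exposure(stock, market, fx))

    Rerunning the same regression on monthly aggregates of the returns is then exactly the kind of frequency change whose effect on detected exposures the study measures.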

  19. Behavioural Insights into Benefits Claimants' Training

    Science.gov (United States)

    Gloster, Rosie; Buzzeo, Jonathan; Cox, Annette; Bertram, Christine; Tassinari, Arianna; Schmidtke, Kelly Ann; Vlaev, Ivo

    2018-01-01

    Purpose: The purpose of this paper is to explore the behavioural determinants of work-related benefits claimants' training behaviours and to suggest ways to improve claimants' compliance with training referrals. Design/methodology/approach: Qualitative interviews were conducted with 20 Jobcentre Plus staff and training providers, and 60 claimants.…

  20. Requirements and benefits of flow forecasting for improving hydropower generation

    NARCIS (Netherlands)

    Dong, Xiaohua; Vrijling, J.K.; Dohmen-Janssen, Catarine M.; Ruigh, E.; Booij, Martijn J.; Stalenberg, B.; Hulscher, Suzanne J.M.H.; van Gelder, P.H.A.J.M.; Verlaan, M.; Zijderveld, A.; Waarts, P.

    2005-01-01

    This paper presents a methodology to identify the required lead time and accuracy of flow forecasting for improving hydropower generation of a reservoir, by simulating the benefits (in terms of electricity generated) obtained from the forecasting with varying lead times and accuracies. The

  1. Determination of benefit of early identification of severe forms of ...

    African Journals Online (AJOL)

    Background/Aims: A pilot study to determine benefits of early identification of severe forms of malaria in peripheral centres was carried out in 3 rural communities of South Eastern Nigeria. Methodology: The study area is located in the rain forest belt of South Eastern Nigeria with high temperature and humidity. It is a typical ...

  2. Liquid fuel concept benefits

    International Nuclear Information System (INIS)

    Hron, M.

    1996-01-01

    The principal drawbacks of any kind of solid nuclear fuel are listed and analyzed in the first part of the paper. One of the primary results of the analyses performed shows that the solid fuel concept, which was to a certain degree advantageous in the first periods of nuclear reactor development and operation, has guided this branch of atomic energy utilization to a dead end. Against this background, the liquid fuel concept and its benefits are introduced and briefly described in the first part of the paper, too. As one of the first realistic attempts to utilize the advantages of liquid fuels, the reactor/blanket system with molten fluoride salts in the role of fuel and coolant simultaneously, as incorporated in the accelerator-driven transmutation technology (ADTT) proposed and currently under development at the Los Alamos National Laboratory, will be studied both theoretically and experimentally. A preliminary design concept of an experimental assembly LA-O, which is under preparation in the Czech Republic for such a project, is briefly introduced in the paper. Finally, another very promising concept of a small low-power ADTT system is introduced, which is characterized by a high level of safety and economic efficiency. In the conclusion, an overall survey of the principal benefits which may be expected from introducing liquid nuclear fuel in nuclear power and research reactor systems is given and critically analyzed. 7 refs, 4 figs

  3. Radiation: cost or benefit?

    International Nuclear Information System (INIS)

    Crouch, D.

    1988-01-01

    In a previous issue of SCRAM it was argued that the apparent increased incidence of child leukaemia around nuclear power stations could have been caused by radioactive discharges into the environment. The National Radiological Protection Board (NRPB) claim that the known levels of contamination could not be responsible for the observed cancer rates. NRPB estimates of radiation risk are, however, considered to be underestimates. The NRPB is criticised for its study of the Sellafield workforce which excluded ex-employees and which revealed, when a statistical mistake was put right, a significant excess of myeloma amongst the Windscale workforce. The radiation protection philosophy of the NRPB is based on a cost benefit analysis which balances the cost of protection against the benefits of power generation. Criticism is made of NRPB, not only for ignoring long-term risks and costs but also for suggesting that some levels of radiation exposure are acceptable. The Board is also accused of not being independent of the nuclear industry. (UK)

  4. The Methodological Imperatives of Feminist Ethnography

    Directory of Open Access Journals (Sweden)

    Richelle D. Schrock

    2013-12-01

    Full Text Available Feminist ethnography does not have a single, coherent definition and is caught between struggles over the definition and goals of feminism and the multiple practices known collectively as ethnography. Towards the end of the 1980s, debates emerged that problematized feminist ethnography as a productive methodology and these debates still haunt feminist ethnographers today. In this article, I provide a concise historiography of feminist ethnography that summarizes both its promises and its vulnerabilities. I address the three major challenges I argue feminist ethnographers currently face, which include responding productively to feminist critiques of representing "others," accounting for feminisms' commitment to social change while grappling with poststructuralist critiques of knowledge production, and confronting the historical and ongoing lack of recognition for significant contributions by feminist ethnographers. Despite these challenges, I argue that feminist ethnography is a productive methodology and I conclude by delineating its methodological imperatives. These imperatives include producing knowledge about women's lives in specific cultural contexts, recognizing the potential detriments and benefits of representation, exploring women's experiences of oppression along with the agency they exercise in their own lives, and feeling an ethical responsibility towards the communities in which the researchers work. I argue that this set of imperatives enables feminist ethnographers to successfully navigate the challenges they face.

  5. Common methodological flaws in economic evaluations.

    Science.gov (United States)

    Drummond, Michael; Sculpher, Mark

    2005-07-01

    Economic evaluations are increasingly being used by those bodies such as government agencies and managed care groups that make decisions about the reimbursement of health technologies. However, several reviews of economic evaluations point to numerous deficiencies in the methodology of studies or the failure to follow published methodological guidelines. This article, written for healthcare decision-makers and other users of economic evaluations, outlines the common methodological flaws in studies, focussing on those issues that are likely to be most important when deciding on the reimbursement, or guidance for use, of health technologies. The main flaws discussed are: (i) omission of important costs or benefits; (ii) inappropriate selection of alternatives for comparison; (iii) problems in making indirect comparisons; (iv) inadequate representation of the effectiveness data; (v) inappropriate extrapolation beyond the period observed in clinical studies; (vi) excessive use of assumptions rather than data; (vii) inadequate characterization of uncertainty; (viii) problems in aggregation of results; (ix) reporting of average cost-effectiveness ratios; (x) lack of consideration of generalizability issues; and (xi) selective reporting of findings. In each case examples are given from the literature and guidance is offered on how to detect flaws in economic evaluations.

  6. Assessment of methodologies for radioactive waste management

    International Nuclear Information System (INIS)

    Hoos, I.R.

    1978-01-01

    No quantitative methodology is adequate to encompass and assess all the risks, no risk/benefit calculation is fine-tuned enough to supply decision-makers with the full range and all of the dimensions. Quality assurance cannot be conceived in terms of systems design alone, but must be maintained vigilantly and with integrity throughout the process. The responsibility of the NRC is fairly well established with respect to overall reactor safety. With respect to the management of radioactive wastes, its mission is not yet so clearly delineated. Herein lies a challenge and an opportunity. Where the known quantitative methodologies are restrictive and likely to have negative feedback effect on authority and public support, the broader lens and the bolder thrust are called for. The cozy cocoon of figures ultimately protects no one. The Commission, having acknowledged that the management of radioactive wastes is not merely a technological matter can now take the socially responsible position of exploring as fully and confronting as candidly as possible the total range of dimensions involved. Paradoxically, it is Charles J. Hitch, intellectual progenitor of the methodology, who observes that we may be missing the meaning of his message by relying too heavily on quantitative analysis and thus defining our task too narrowly. We live in a closed system, in which science and technology, politics and economics, and, above all, social and human elements interact, sometimes to create the problems, sometimes to articulate the questions, and sometimes to find viable solutions

  7. Methodological Problems of Nanotechnoscience

    Science.gov (United States)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods for solving scientific and engineering problems. This makes it possible to move from one explanation and scientific world view to another without difficulty. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering understood as the analysis and design of large-scale, complex man/machine systems, but applied to micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. This design orientation influences the shift of priorities in complex research and the relation to knowledge: not only "knowledge about something", but also knowledge as a means of activity, since from the beginning the control and restructuring of matter at the nano-scale is a necessary element of nanoscience.

  8. Methodological themes and variations

    International Nuclear Information System (INIS)

    Tetlock, P.E.

    1989-01-01

    This paper reports on the tangible progress that has been made in clarifying the underlying processes that affect both the likelihood of war in general and of nuclear war in particular. It also illustrates how difficult it is to make progress in this area. Nonetheless, what has been achieved should not be minimized. We have learned a good deal on both the theoretical and the methodological fronts and, perhaps, most important, we have learned a good deal about the limits of our knowledge. Knowledge of our ignorance---especially in a policy domain where confident, even glib, causal assertions are so common---can be a major contribution in itself. The most important service the behavioral and social sciences can currently provide to the policy making community may well be to make thoughtful skepticism respectable: to sensitize those who make key decisions to the uncertainty surrounding our understanding of international conflict and to the numerous qualifications that now need to be attached to simple causal theories concerning the origins of war

  9. Engineering radioecology: Methodological considerations

    International Nuclear Information System (INIS)

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-01-01

    The term ''radioecology'' has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise/generally acknowledged structure, unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to convert the problem of environmental restoration from the scientific sphere in particularly practical terms. Already the first steps clearly showed an imperfection of existing technologies, managerial and regulatory schemes; lack of qualified specialists, relevant methods and techniques; uncertainties in methodology of decision-making, etc. Thus, building up (or maybe, structuring) of special scientific and technological basis, which the authors call ''engineering radioecology'', seems to be an important task. In this paper they endeavored to substantiate the last thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology

  10. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
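
    Steps (2) and (3) can be illustrated compactly (a pure-numpy sketch of one common screening/quantification pair: elementary-effects screening followed by a pick-freeze first-order Sobol index; PSUADE's actual algorithms are not reproduced here):

        import numpy as np

        def model(x):                    # stand-in simulation response
            return x[:, 0] ** 2 + 0.1 * x[:, 1] + 0.01 * x[:, 1] * x[:, 2]

        def screen(model, dim, n=200, h=1e-2, seed=0):
            """Step 2: crude elementary-effects screening on [0, 1]^dim."""
            rng = np.random.default_rng(seed)
            base = rng.random((n, dim))
            mu_star = np.zeros(dim)
            for k in range(dim):
                pert = base.copy()
                pert[:, k] = np.clip(pert[:, k] + h, 0.0, 1.0)
                mu_star[k] = np.mean(np.abs(model(pert) - model(base)) / h)
            return mu_star

        def sobol_first_order(model, dim, k, n=20000, seed=1):
            """Step 3: pick-freeze estimate of parameter k's first-order index."""
            rng = np.random.default_rng(seed)
            a, b = rng.random((n, dim)), rng.random((n, dim))
            ab = b.copy()
            ab[:, k] = a[:, k]                   # freeze column k
            ya, yb, yab = model(a), model(b), model(ab)
            return np.mean(ya * (yab - yb)) / np.var(np.concatenate([ya, yb]))

        mu = screen(model, dim=3)
        keep = np.argsort(mu)[::-1][:2]          # reduced parameter set
        print("screening mu*:", mu.round(3))
        print({int(k): round(sobol_first_order(model, 3, k), 3) for k in keep})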

  11. Improving perfusion quantification in arterial spin labeling for delayed arrival times by using optimized acquisition schemes

    International Nuclear Information System (INIS)

    Kramme, Johanna; Diehl, Volker; Madai, Vince I.; Sobesky, Jan; Guenther, Matthias

    2015-01-01

    The improvement in Arterial Spin Labeling (ASL) perfusion quantification, especially for delayed bolus arrival times (BAT), with an acquisition redistribution scheme mitigating the T1 decay of the label in multi-TI ASL measurements is investigated. A multi inflow time (TI) 3D-GRASE sequence is presented which adapts the distribution of acquisitions accordingly, by keeping the scan time constant. The MR sequence increases the number of averages at long TIs and decreases their number at short TIs and thus compensating the T1 decay of the label. The improvement of perfusion quantification is evaluated in simulations as well as in-vivo in healthy volunteers and patients with prolonged BATs due to age or steno-occlusive disease. The improvement in perfusion quantification depends on BAT. At healthy BATs the differences are small, but become larger for longer BATs typically found in certain diseases. The relative error of perfusion is improved up to 30% at BATs > 1500 ms in comparison to the standard acquisition scheme. This adapted acquisition scheme improves the perfusion measurement in comparison to standard multi-TI ASL implementations. It provides relevant benefit in clinical conditions that cause prolonged BATs and is therefore of high clinical relevance for neuroimaging of steno-occlusive diseases.
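
    One plausible form of the redistribution rule reads as follows (a hedged numpy sketch; the sequence's actual allocation, and the T1 value used, are assumptions based on the abstract's description). Since averaging improves SNR by the square root of the number of acquisitions, compensating the exp(-TI/T1) label decay suggests weighting acquisitions by exp(2*TI/T1):

        import numpy as np

        def redistribute_averages(tis_ms, n_total, t1_blood_ms=1650.0):
            """Allocate a fixed total number of acquisitions across inflow
            times in proportion to exp(2*TI/T1), so that late-TI signal
            loss from label decay is offset by extra averaging."""
            w = np.exp(2.0 * np.asarray(tis_ms, dtype=float) / t1_blood_ms)
            return np.round(n_total * w / w.sum()).astype(int)

        tis = [300, 600, 900, 1200, 1500, 1800, 2100, 2400]
        print(dict(zip(tis, redistribute_averages(tis, n_total=64))))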

  12. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Lines, Amanda M. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Adami, Susan R. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Sinkov, Sergey I. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Lumetta, Gregg J. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Bryan, Samuel A. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States

    2017-08-09

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium (IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single variate analysis (i.e. Beer’s Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of nitric acid concentration.
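
    The style of calibration described can be sketched with partial least squares, a common chemometric workhorse (an illustrative sketch; the paper's actual model, preprocessing, and band positions are not given in the abstract). Training spectra span both Pu(IV) and nitric acid concentrations, so prediction needs no prior knowledge of acid strength:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        wavelengths = np.linspace(400, 900, 200)

        def synth_spectrum(c_pu, c_hno3):
            """Toy Pu(IV) band whose position shifts with acid strength."""
            band = np.exp(-((wavelengths - 470.0 - 5.0 * c_hno3) / 15.0) ** 2)
            return c_pu * band + 0.01 * rng.normal(size=wavelengths.size)

        # calibration set deliberately spanning both concentrations
        c_pu = rng.uniform(0.5, 5.0, 60)      # arbitrary units
        c_acid = rng.uniform(0.5, 8.0, 60)    # mol/L
        X = np.array([synth_spectrum(p, a) for p, a in zip(c_pu, c_acid)])

        pls = PLSRegression(n_components=4).fit(X, c_pu)
        x_new = synth_spectrum(2.5, 3.0)      # unknown acid strength
        print("predicted Pu(IV):", pls.predict(x_new[None, :]).ravel()[0])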

  13. Forensic Uncertainty Quantification of Explosive Dispersal of Particles

    Science.gov (United States)

    Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho

    2017-06-01

    In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued with poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments to discover possible sources of uncertainty that may have been missed in initial design of experiments or under-reported. The current experience of the authors has found that by making an analogy to crime scene investigation when looking at validation experiments, valuable insights may be gained. One examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator to help remove possible implicit biases and increases the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.

  14. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    The human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the factors that combine to produce an error in human tasks performed under normal operating conditions and in those performed after an abnormal event. Additionally, analyses of various historical accidents have found the human component to be a contributing cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed to include additional performance shaping factors and their interactions in the models, so that by the mid-1990s what are considered second-generation methodologies had emerged. Among these is A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required independent evaluation of the two related human failure events. Thus, obtaining the new human error probabilities involves quantifying the nominal scenario and the cases of significant deviations, selected for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the sequence analysis was used to extract the more specific factors with the highest contribution to the human error probabilities. (Author)
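
    The dependency evaluation mentioned at the end is commonly carried out with the THERP dependence equations (a swapped-in illustration from NUREG/CR-1278, not something specific to ATHEANA): the conditional probability of the second human failure event is raised according to the assessed level of dependence on the first.

        # THERP (NUREG/CR-1278) conditional human error probability,
        # given failure of a preceding, dependent action. Illustrative only.
        THERP_DEPENDENCE = {
            "zero":     lambda p: p,
            "low":      lambda p: (1 + 19 * p) / 20,
            "moderate": lambda p: (1 + 6 * p) / 7,
            "high":     lambda p: (1 + p) / 2,
            "complete": lambda p: 1.0,
        }

        p2 = 1e-3   # nominal HEP of the second related human failure event
        for level, rule in THERP_DEPENDENCE.items():
            print(f"{level:>9}: {rule(p2):.3e}")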

  15. Development of a methodology for the application of the analysis of human reliability to individualized temporary storage facility; Desarrollo de una metodologia de aplicacion del Analisis de Fiabilidad Humana a una instalacion de Almacen Temporal Individualizado

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, P.; Dies, J.; Tapia, C.; Blas, A. de

    2014-07-01

    The paper presents the methodology that has been developed for applying human reliability analysis (HRA) to an individualized temporary storage (ATI) facility without the need for experts to be available during the modelling and quantification stages of the HRA. The developed methodology is based on ATHEANA and relies on the use of other methods of human action analysis and on in-depth analysis. (Author)

  16. Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility

    International Nuclear Information System (INIS)

    Elicson, Tom; Harwood, Bentley; Yorg, Richard; Lucek, Heather; Bouchard, Jim; Jukkola, Ray; Phan, Duan

    2011-01-01

    The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided with particular focus on the following: Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project, using screening values and simplified modeling if necessary. Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in two phases, with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less on scoping fire modeling; this was the approach taken for the fire PRA. Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario, so dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges, which will be discussed in the full paper.

  17. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology
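
    The Bayesian step at the core of such a formalism can be written out in a generic form (a sketch; the paper's exact parameterization may differ). The impact-vector moments supply an effective event count n and observation time T, which update a gamma prior on the CCF rate:

        % Gamma prior on the CCF rate \lambda, conjugate to Poisson data:
        \lambda \sim \mathrm{Gamma}(\alpha, \beta)
        \quad\Longrightarrow\quad
        \lambda \mid \text{data} \sim \mathrm{Gamma}(\alpha + n,\ \beta + T),
        \qquad
        \hat{\lambda} = \frac{\alpha + n}{\beta + T}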

  18. Methods for modeling and quantification in functional imaging by positron emissions tomography and magnetic resonance imaging

    International Nuclear Information System (INIS)

    Costes, Nicolas

    2017-01-01

    This report presents experience and research in the field of in vivo medical imaging by positron emission tomography (PET) and magnetic resonance imaging (MRI). In particular, advances in terms of reconstruction, quantification and modeling in PET are described. The validation of processing and analysis methods is supported by the creation of data by simulation of the imaging process in PET. The recent advances of combined PET/MRI clinical cameras, allowing simultaneous acquisition of molecular/metabolic PET information, and functional/structural MRI information opens the door to unique methodological innovations, exploiting spatial alignment and simultaneity of the PET and MRI signals. It will lead to an increase in accuracy and sensitivity in the measurement of biological phenomena. In this context, the developed projects address new methodological issues related to quantification, and to the respective contributions of MRI or PET information for a reciprocal improvement of the signals of the two modalities. They open perspectives for combined analysis of the two imaging techniques, allowing optimal use of synchronous, anatomical, molecular and functional information for brain imaging. These innovative concepts, as well as data correction and analysis methods, will be easily translated into other areas of investigation using combined PET/MRI. (author) [fr

  19. A multi-model assessment of the co-benefits of climate mitigation for global air quality

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Shilpa; Klimont, Zbigniew; Leitao, Joana; Riahi, Keywan; van Dingenen, Rita; Reis, Lara Aleluia; Calvin, Katherine; Dentener, Frank; Drouet, Laurent; Fujimori, Shinichiro; Harmsen, Mathijs; Luderer, Gunnar; Heyes, Chris; Strefler, Jessica; Tavoni, Massimo; van Vuuren, Detlef P.

    2016-12-01

    sector and region level. A second methodological advancement is a quantification of the co-benefits in terms of the associated atmospheric concentrations of fine particulate matter (PM2.5) and consequent mortality related outcomes across different models. This is made possible by the use of state-of the art simplified atmospheric model that allows for the first time a computationally feasible multi-model evaluation of such outcomes.

  20. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images. It is also to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: Quantification in emission tomography - definition and challenges; quantification biasing phenomena; 2 - Quantification in SPECT, problems and correction methods: attenuation, scattering, non-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement
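
    One of the biasing phenomena listed, the partial volume effect, is easy to demonstrate numerically (a toy sketch added here for illustration, not part of the lecture): blurring a small hot object with the scanner's point spread function lowers the apparent activity concentration by a recovery coefficient that shrinks with object size.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def recovery_coefficient(diameter_px, psf_sigma_px=2.0, size=101):
            """Peak of a unit-activity disk after PSF blurring (toy 2D model);
            the true concentration everywhere inside the disk is 1.0."""
            y, x = np.mgrid[:size, :size] - size // 2
            disk = ((x ** 2 + y ** 2) <= (diameter_px / 2.0) ** 2).astype(float)
            return gaussian_filter(disk, psf_sigma_px).max()

        for d in (2, 4, 8, 16, 32):
            print(f"diameter {d:2d} px -> recovery {recovery_coefficient(d):.2f}")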

  1. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against...... human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  2. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures which are available now and of the kinds of experimental situations and analytical problems they address. The last point is extended by the description of an in-house development of the fundamental parameter method, which makes it possible to include nonparallel beam geometries. Finally, open problems for the quantification procedures are discussed

  3. Risks and benefits of energy systems in Czechoslovakia

    International Nuclear Information System (INIS)

    Bohal, L.; Erban, P.; Kadlec, J.; Kraus, V.; Trcka, V.

    1984-01-01

    The paper describes the fundamental philosophy of an approach to risk and benefit assessment in the fuel and energy complex in Czechoslovakia. The first part analyses the need to solve the risk and benefit problems stemming from structural changes occurring in the Czechoslovakian fuel and energy complex. The second part describes main features of risk and benefit research with special respect to the fuel and energy complex defined within the framework of the national economy with interfaces to the relevant environment. Furthermore, a glimpse is given of how to assess, using the general philosophy, the risks and benefits of various developing variants of the fuel and energy complex. The third part deals with methodological aspects of such risk and benefit evaluation research with special consideration of the methods of long-term prediction in structural analysis and multi-measure assessment. Finally, further progress in solving these problems in VUPEK and some other Czechoslovakian scientific institutions is briefly noted. (author)

  4. A Practical Risk Assessment Methodology for Safety-Critical Train Control Systems

    Science.gov (United States)

    2009-07-01

    This project proposes a Practical Risk Assessment Methodology (PRAM) for analyzing railroad accident data and assessing the risk and benefit of safety-critical train control systems. This report documents in simple steps the algorithms and data input...

  5. Quantification of competitive value of documents

    Directory of Open Access Journals (Sweden)

    Pavel Šimek

    2009-01-01

    Full Text Available The majority of Internet users use the global network to search for different information using full-text search engines such as Google, Yahoo!, or Seznam. Web presentation operators try, with the help of different optimization techniques, to reach the top places in the results of full-text search engines. This is where Search Engine Optimization and Search Engine Marketing become very important, because typical users usually follow links only on the first few pages of full-text search results for given keywords, and in catalogs they primarily use the hierarchically higher-placed links in each category. Key to success is the application of optimization methods which deal with keywords, the structure and quality of content, domain names, individual pages, and the quantity and reliability of backlinks. The process is demanding, long-lasting and without a guaranteed outcome. A website operator without advanced analytical tools cannot identify the contribution of the individual documents of which the entire website consists. If web presentation operators want an overall overview of their documents and website, it is appropriate to quantify these positions in a specific way, depending on specific keywords. This is the purpose of the quantification of the competitive value of documents, which in turn yields the global competitive value of a website. Quantification of competitive values is performed on a specific full-text search engine; each full-text search engine can, and often does, give different results. According to published reports by the ClickZ agency and Market Share, Google is the most widely used search engine by number of searches among English-speaking users, with a market share of more than 80%. The whole procedure for quantifying competitive values is generic; however, the initial step, the analysis of keywords, depends on the choice of full-text search engine.

  6. Advances in forensic DNA quantification: a review.

    Science.gov (United States)

    Lee, Steven B; McCord, Bruce; Buel, Eric

    2014-11-01

    This review focuses upon a critical step in forensic biology: detection and quantification of human DNA from biological samples. Determination of the quantity and quality of human DNA extracted from biological evidence is important for several reasons. Firstly, depending on the source and extraction method, the quality (purity and length), and quantity of the resultant DNA extract can vary greatly. This affects the downstream method as the quantity of input DNA and its relative length can determine which genotyping procedure to use-standard short-tandem repeat (STR) typing, mini-STR typing or mitochondrial DNA sequencing. Secondly, because it is important in forensic analysis to preserve as much of the evidence as possible for retesting, it is important to determine the total DNA amount available prior to utilizing any destructive analytical method. Lastly, results from initial quantitative and qualitative evaluations permit a more informed interpretation of downstream analytical results. Newer quantitative techniques involving real-time PCR can reveal the presence of degraded DNA and PCR inhibitors, that provide potential reasons for poor genotyping results and may indicate methods to use for downstream typing success. In general, the more information available, the easier it is to interpret and process the sample resulting in a higher likelihood of successful DNA typing. The history of the development of quantitative methods has involved two main goals-improving precision of the analysis and increasing the information content of the result. This review covers advances in forensic DNA quantification methods and recent developments in RNA quantification. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
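
    The real-time PCR quantification the review discusses rests on a standard curve, which is simple to reproduce (a generic sketch with made-up calibration numbers): the quantification cycle Cq is linear in log10 of the input DNA amount, so an unknown is read off the fitted line.

        import numpy as np

        # calibration standards: input DNA (ng) and measured Cq (toy data)
        dna_ng = np.array([10.0, 1.0, 0.1, 0.01])
        cq = np.array([18.1, 21.5, 24.8, 28.2])

        slope, intercept = np.polyfit(np.log10(dna_ng), cq, 1)
        efficiency = 10.0 ** (-1.0 / slope) - 1.0   # ~1.0 for perfect doubling

        def quantify(cq_unknown):
            """Invert the standard curve Cq = slope*log10(N0) + intercept."""
            return 10.0 ** ((cq_unknown - intercept) / slope)

        print(f"PCR efficiency: {efficiency:.0%}")
        print(f"unknown at Cq = 23.0 -> {quantify(23.0):.3f} ng input DNA")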

  7. Making benefit transfers work

    DEFF Research Database (Denmark)

    Bateman, I.J.; Brouwer, R.; Ferrini, S.

    We develop and test guidance principles for benefits transfers. These argue that when transferring across relatively similar sites, simple mean value transfers are to be preferred but that when sites are relatively dissimilar then value function transfers will yield lower errors. The paper also...... provides guidance on the appropriate specification of transferable value functions arguing that these should be developed from theoretical rather than ad-hoc statistical principles. These principles are tested via a common format valuation study of water quality improvements across five countries. Results...... support our various hypotheses providing a set of principles for future transfer studies. The application also considers new ways of incorporating distance decay, substitution and framing effects within transfers and presents a novel water quality ladder....
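
    The yardstick against which such guidance is usually judged is the absolute percentage transfer error (the conventional definition in this literature; notation is generic):

        \mathrm{TE} = \frac{\bigl|\widehat{WTP}_{\mathrm{transferred}} - \widehat{WTP}_{\mathrm{observed}}\bigr|}{\widehat{WTP}_{\mathrm{observed}}} \times 100\%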

  8. PESTICIDES: BENEFITS AND HAZARDS

    Directory of Open Access Journals (Sweden)

    Ivan Maksymiv

    2015-05-01

    Full Text Available Pesticides are an integral part of modern life, used to prevent the growth of unwanted living organisms. Despite the fact that scientific statements from many toxicological studies indicate low risk from pesticides and their residues, the community, especially in recent years, has been deeply concerned about the massive application of pesticides in diverse fields. Therefore, the evaluation of hazard risks, particularly in the long-term perspective, is very important. In fact, there are at least two clearly different approaches to evaluating pesticide use: the first is objective or probabilistic risk assessment, while the second weighs the potential economic and agricultural benefits. Therefore, in this review the author considers scientifically based assessments of the positive and negative effects of pesticide application and discusses possible approaches to finding a balance between them.

  9. An approach for quantification of platinum distribution in tissues by LA-ICP-MS imaging using isotope dilution analysis.

    Science.gov (United States)

    Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D

    2018-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been revealed as a convenient technique for trace elemental imaging in tissue sections, providing elemental 2D distribution at a quantitative level. For quantification purposes, several approaches have been proposed in the literature in recent years, such as the use of CRMs or matrix-matched standards. The use of Isotope Dilution (ID) for quantification by LA-ICP-MS has also been described, being mainly useful for bulk analysis but not feasible for spatial measurements so far. In this work, a quantification method based on ID analysis was developed by printing isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs using a commercial ink-jet device, in order to perform an elemental quantification in different areas from bio-images. For the ID experiments, 194Pt-enriched platinum was used. The methodology was validated by deposition of natural Pt standard droplets with a known amount of Pt onto the surface of a control tissue, where even 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in the whole kidney slices was quantified for cisplatin, carboplatin and oxaliplatin-treated rats. The results obtained were in accordance with those previously reported. The amount of Pt distributed between the medullar and cortical areas was also quantified, observing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
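
    The quantification step rests on the standard isotope dilution relation (stated here in a generic two-isotope form; the symbols are illustrative rather than copied from the paper). From the measured blend ratio of a reference isotope to the spike isotope (e.g. 195Pt/194Pt), the analyte amount follows from the known spike amount:

        % Blend ratio and its inversion for the analyte amount n_x:
        R_m = \frac{n_x A_x + n_{sp} A_{sp}}{n_x B_x + n_{sp} B_{sp}}
        \;\;\Longrightarrow\;\;
        n_x = n_{sp}\,\frac{A_{sp} - R_m B_{sp}}{R_m B_x - A_x}
        % A, B: abundances of the reference and spike isotopes in the
        % sample (x) and in the isotopically enriched spike (sp).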

  10. University Benefits Survey. Part 1 (All Benefits Excluding Pensions).

    Science.gov (United States)

    University of Western Ontario, London.

    Results of a 1983 survey of benefits, excluding pensions, for 17 Ontario, Canada, universities are presented. Information is provided on the following areas: whether the university self-administers insurance plans, communication of benefits, proposed changes in benefits, provision of life and dismemberment insurance, maternity leave policy,…

  11. Quantification of thermal damage in skin tissue

    Institute of Scientific and Technical Information of China (English)

    Xu Feng; Wen Ting; Lu Tianjian; Seffen Keith

    2008-01-01

    Skin thermal damage, or skin burns, is the most commonly encountered type of trauma in civilian and military communities. Besides, advances in laser, microwave and similar technologies have led to recent developments of thermal treatments for disease and damage involving skin tissue, where the objective is to induce thermal damage precisely within targeted tissue structures but without affecting the surrounding, healthy tissue. Further, the extended pain sensation induced by thermal damage also creates serious problems for burn patients. Thus, it is of great importance to quantify the thermal damage in skin tissue. In this paper, the available models and experimental methods for quantification of thermal damage in skin tissue are discussed.
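
    The model family surveyed in such work is anchored by the classical Henriques-Moritz damage integral (standard in this literature; the parameter values are tissue-specific and omitted here): damage accumulates at an Arrhenius rate along the tissue temperature history, with Omega = 1 conventionally marking the threshold of irreversible injury.

        \Omega(t) = \int_0^{t} A \exp\!\left(-\frac{E_a}{R\,T(\tau)}\right) \mathrm{d}\tau
        % A: frequency factor (s^{-1}); E_a: activation energy (J/mol);
        % R: universal gas constant; T(\tau): absolute tissue temperature.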

  12. Automated Quantification of Pneumothorax in CT

    Science.gov (United States)

    Do, Synho; Salvaggio, Kristen; Gupta, Supriya; Kalra, Mannudeep; Ali, Nabeel U.; Pien, Homer

    2012-01-01

    An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothorax size were obtained and the overall performance of the automated method shows an average error of just below 1%. PMID:23082091

  13. Uncertainty quantification for PZT bimorph actuators

    Science.gov (United States)

    Bravo, Nikolas; Smith, Ralph C.; Crews, John

    2018-03-01

    In this paper, we discuss the development of a high-fidelity model for a PZT bimorph actuator used for micro-air vehicles, including the Robobee. We developed the model using the homogenized energy model (HEM) framework, which quantifies the nonlinear, hysteretic, and rate-dependent behavior inherent to PZT in dynamic operating regimes. We then discuss an inverse problem on the model, and include local and global sensitivity analyses of the parameters in the high-fidelity model. Finally, we discuss the results of Bayesian inference and uncertainty quantification on the HEM.

  14. Linking probe thermodynamics to microarray quantification

    International Nuclear Information System (INIS)

    Li, Shuzhao; Pozhitkov, Alexander; Brouwer, Marius

    2010-01-01

    Understanding the difference in probe properties holds the key to absolute quantification of DNA microarrays. So far, Langmuir-like models have failed to link sequence-specific properties to hybridization signals in the presence of a complex hybridization background. Data from washing experiments indicate that the post-hybridization washing has no major effect on the specifically bound targets, which give the final signals. Thus, the amount of specific targets bound to probes is likely determined before washing, by the competition against nonspecific binding. Our competitive hybridization model is a viable alternative to Langmuir-like models. (comment)
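
    The contrast with Langmuir-like models can be made explicit (an illustrative competitive-adsorption form, not necessarily the authors' exact equations): in a competitive model, the specific target's occupancy is suppressed by the entire nonspecific background, not just by its own concentration.

        % Langmuir isotherm for a single target at concentration c,
        % dissociation constant K:
        \theta = \frac{c/K}{1 + c/K}
        % Competitive binding of target i against background species j:
        \theta_i = \frac{c_i/K_i}{1 + \sum_j c_j/K_j}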

  15. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Quantifying the sensitivities and uncertainties yields physical understanding of the system and some confidence in the simulation.
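
    A common way to combine sensitivities with nuclear-data covariances is the first-order "sandwich rule", var(R) = S^T C S. A minimal sketch with illustrative numbers (not the LIFE blanket data), which happens to land in the same sub-2% range quoted above:

      import numpy as np

      # First-order "sandwich rule": var(R) = S^T C S, with S the sensitivity
      # vector and C the covariance matrix of the input parameters.
      S = np.array([0.8, -0.3, 0.1])                     # relative sensitivities (illustrative)
      C = np.array([[4.0e-4, 1.0e-4, 0.0],
                    [1.0e-4, 9.0e-4, 0.0],
                    [0.0,    0.0,    1.0e-4]])           # relative covariances (illustrative)

      rel_var = S @ C @ S
      print(f"relative standard deviation of the response: {np.sqrt(rel_var):.3%}")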

  16. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does not…
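
    As a pointer to what polynomial chaos involves, the sketch below builds a one-dimensional Hermite expansion of a function of a standard normal input, with coefficients computed by Gauss-Hermite quadrature (Python rather than Matlab, and purely illustrative):

      import numpy as np
      from math import factorial
      from numpy.polynomial import hermite_e as He

      def f(x):
          # model with a single standard-normal input X (illustrative)
          return np.exp(0.3 * x)

      # Probabilists' Gauss-Hermite rule: integrates against exp(-x^2/2); weights sum to sqrt(2*pi).
      nodes, weights = He.hermegauss(20)
      norm = np.sqrt(2.0 * np.pi)

      P = 6  # expansion order
      coeffs = np.zeros(P + 1)
      for k in range(P + 1):
          basis = He.hermeval(nodes, [0.0] * k + [1.0])    # He_k evaluated at the nodes
          coeffs[k] = np.sum(weights * f(nodes) * basis) / (norm * factorial(k))

      mean = coeffs[0]
      var = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, P + 1))
      print(mean, var)   # exact: exp(0.045) ~ 1.046 and exp(0.09)*(exp(0.09)-1) ~ 0.103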

  17. Dosimetric methodology of the ICRP

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    1994-01-01

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples

  18. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of their main characteristics, such as the settings in which each may be used to best effect. Due to their popularity, power-flow-based MW-mile and short-run marginal cost pricing methodologies are covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies to transmission services in Brazil. (author) 25 refs., 2 tabs.
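
    A minimal sketch of the MW-mile idea: each transaction is charged in proportion to the megawatt-miles its power flows impose on the network. All line lengths, flows and costs below are hypothetical:

      # MW-mile allocation over a toy three-line network.
      line_length_miles = {"L1": 120.0, "L2": 80.0, "L3": 200.0}

      # MW flow caused by each transaction on each line (e.g., from a DC power flow).
      flows = {
          "T1": {"L1": 50.0, "L2": 10.0, "L3": 0.0},
          "T2": {"L1": 20.0, "L2": 40.0, "L3": 30.0},
      }

      total_cost = 1_000_000.0  # $/yr, total transmission revenue requirement

      mw_miles = {t: sum(abs(mw) * line_length_miles[l] for l, mw in per_line.items())
                  for t, per_line in flows.items()}
      total_mw_miles = sum(mw_miles.values())

      for t, mm in mw_miles.items():
          print(f"{t}: {mm:,.0f} MW-miles -> ${total_cost * mm / total_mw_miles:,.0f}/yr")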

  19. Analytical methodology for nuclear safeguards

    International Nuclear Information System (INIS)

    Ramakumar, K.L.

    2011-01-01

    This paper briefly describes the available analytical methodologies and highlights some of the challenges and expectations from a nuclear material accounting and control (NUMAC) point of view

  20. Country report: a methodology

    International Nuclear Information System (INIS)

    Colin, A.

    2013-01-01

    This paper describes a methodology for establishing a country report. In the framework of nuclear non-proliferation appraisal and IAEA safeguards implementation, it is important to be able to assess the potential existence of undeclared nuclear materials, activities and facilities in the country under review. In our view, a country report should provide detailed information on nuclear-related activities for each country examined taken 'as a whole', such as nuclear development and scientific and technical capabilities. In order to study a specific country, we need to know whether it already operates a civil nuclear programme. If it does, we have to check carefully whether nuclear material could be diverted, whether declared facilities are misused, or whether undeclared facilities are operated and undeclared activities conducted with the aim of manufacturing a nuclear weapon. If it does not, we should pay attention to the development of any civil nuclear project. A country report is based on a wide span of information, most of it from open sources but some from confidential or private ones. It is therefore important to check the nature and credibility of these sources carefully through cross-examination, and eventually to merge information from different sources and apply an expert filter. We have at our disposal many powerful tools (cartography, imagery, bibliometry, etc.) to help us assess, understand and evaluate the situation and draw the best conclusions possible. The paper is followed by the slides of the presentation. (author)

  1. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be returned by future missions to extraterrestrial bodies is closely related to studies of the properties and signatures of living microbial cells and microfossils on Earth. Antarctic glaciers and terrestrial permafrost habitats, where live microbial cells have preserved viability for millennia by entering an anabiotic state, are often regarded as model analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is also of great importance to ensure the authenticity of any microorganisms found in the studied samples and to standardize the protocols used so as to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind the scenario in which living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompany the fossilization of cyanobacteria have been reconstructed, and artificially produced cyanobacterial stromatolites resemble, in their morphological properties, those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures, and between living and fossil microorganisms, in analyzed samples, it is worthwhile to use previously developed approaches based on electron microscopy examination and analysis of the elemental composition of biomorphs in situ, with comparison against analogous data obtained for laboratory microbial cultures and…

  2. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa, and those wishing to use it as their research methodology can find support in the writing of a number of Maori academics. What is not so well articulated is the experiential voice of those who have used Kaupapa Maori as a research methodology. My identity as a Maori woman…

  3. An Innovative Fuzzy-Logic-Based Methodology for Trend Identification

    International Nuclear Information System (INIS)

    Wang Xin; Tsoukalas, Lefteri H.; Wei, Thomas Y.C.; Reifman, Jaques

    2001-01-01

    A new fuzzy-logic-based methodology for on-line signal trend identification is introduced. The methodology may be used for detecting the onset of nuclear power plant (NPP) transients at the earliest possible time and could be of great benefit to diagnostic, maintenance, and performance-monitoring programs. Although signal trend identification is complicated by the presence of noise, fuzzy methods can help capture important features of on-line signals, integrate the information contained in these features, and classify incoming NPP signals into increasing, decreasing, and steady-state trend categories. A computer program named PROTREN was developed and tested to verify this methodology using NPP and simulation data. The results indicate that the new fuzzy-logic-based methodology detects transients accurately, identifies trends reliably, and does not misinterpret a steady-state signal as a transient one
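
    PROTREN itself is not publicly documented here, so the sketch below only illustrates the underlying idea: estimate a recent slope and classify it with triangular fuzzy membership functions into decreasing, steady and increasing categories (thresholds hypothetical, in units per sample):

      import numpy as np

      def tri(x, a, b, c):
          # triangular membership function peaking at b, zero outside [a, c]
          return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      def classify_trend(signal, dt=1.0):
          t = np.arange(signal.size) * dt
          slope = np.polyfit(t, signal, 1)[0]        # least-squares slope, units per sample
          memberships = {
              "decreasing": tri(slope, -1.0, -0.5, 0.0),
              "steady":     tri(slope, -0.2, 0.0, 0.2),
              "increasing": tri(slope, 0.0, 0.5, 1.0),
          }
          return max(memberships, key=memberships.get), memberships

      rng = np.random.default_rng(2)
      sig = 100 + 0.5 * np.arange(50) + rng.normal(0, 0.2, 50)   # noisy rising signal
      label, m = classify_trend(sig)
      print(label, {k: round(v, 2) for k, v in m.items()})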

  4. The economic costs and benefits of potassium iodide prophylaxis for a reference LWR facility in the United States

    International Nuclear Information System (INIS)

    Behling, U.H.; Behling, K.

    1995-01-01

    Policy decisions relating to radiation protection are commonly based on an evaluation in which the benefits of exposure reduction are compared to the economic costs of the protective measure. A generic difficulty encountered in cost-benefit analyses, however, is the quantification of the major elements that define the costs and the benefits in commensurate units. In this study, the costs of making KI (potassium iodide) available for public use and the avoidance of thyroidal health effects (i.e., the benefit) in the event of a nuclear emergency are both expressed in the commensurate unit of dollars. (Authors). 11 refs., 15 tabs

  5. Benefits of public roadside safety rest areas in Texas : technical report.

    Science.gov (United States)

    2011-05-01

    The objective of this investigation was to develop a benefit-cost analysis methodology for safety rest areas in Texas and to demonstrate its application in select corridors throughout the state. In addition, this project considered novel safety r...

  6. Multiplex electrochemical DNA platform for femtomolar-level quantification of genetically modified soybean.

    Science.gov (United States)

    Manzanares-Palenzuela, C Lorena; de-los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2015-06-15

    Current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) with a minimum content of 0.9% would benefit from the availability of reliable and rapid methods to detect and quantify DNA sequences specific for GMOs. Different genosensors have been developed to this aim, mainly intended for GMO screening. A remaining challenge, however, is the development of genosensing platforms for GMO quantification, which should be expressed as the number of event-specific DNA sequences per taxon-specific sequences. Here we report a simple and sensitive multiplexed electrochemical approach for the quantification of Roundup-Ready Soybean (RRS). Two DNA sequences, taxon (lectin) and event-specific (RR), are targeted via hybridization onto magnetic beads. Both sequences are simultaneously detected by performing the immobilization, hybridization and labeling steps in a single tube and parallel electrochemical readout. Hybridization is performed in a sandwich format using signaling probes labeled with fluorescein isothiocyanate (FITC) or digoxigenin (Dig), followed by dual enzymatic labeling using Fab fragments of anti-Dig and anti-FITC conjugated to peroxidase or alkaline phosphatase, respectively. Electrochemical measurement of the enzyme activity is finally performed on screen-printed carbon electrodes. The assay gave a linear range of 2-250 pM for both targets, with LOD values of 650 fM (160 amol) and 190 fM (50 amol) for the event-specific and the taxon-specific targets, respectively. Results indicate that the method could be applied for GMO quantification below the European labeling threshold level (0.9%), offering a general approach for the rapid quantification of specific GMO events in foods. Copyright © 2015 Elsevier B.V. All rights reserved.
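
    Once each target concentration is read off its own calibration curve, the GMO content follows as the event-to-taxon ratio. A minimal sketch with hypothetical readouts:

      def gmo_percent(event_pM, taxon_pM):
          # GMO content as the ratio of event-specific to taxon-specific
          # target concentrations, each read off its own calibration curve
          return 100.0 * event_pM / taxon_pM

      # hypothetical readouts within the reported 2-250 pM linear range
      print(f"{gmo_percent(1.8, 240.0):.2f}% RRS")   # 0.75%, below the 0.9% EU threshold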

  7. Strategy study of quantification harmonization of SUV in PET/CT images; Estudo da estrategia de harmonizacao da quantificacao do SUV em imagens de PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-07-01

    …and quantitative assessments in different scopes. We concluded that the harmonization strategy for SUV quantification presented in this paper was effective in reducing the variability of small-structure quantification. However, for the comparison of SUV quantification between different scanners and institutions, it is essential that, in addition to harmonizing the quantification, the methodology of patient preparation be standardized, in order to minimize SUV variability due to biological factors. (author)
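
    For reference, the body-weight SUV that such harmonization targets is tissue activity concentration divided by injected dose per unit body mass. A minimal sketch, assuming decay-corrected inputs and a tissue density of about 1 g/mL:

      def suv_bw(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
          # body-weight SUV; assumes tissue density ~1 g/mL so kBq/mL ~ kBq/g,
          # and both activities decay-corrected to the same time point
          return tissue_kbq_per_ml * body_weight_kg / injected_dose_mbq

      # e.g., 5 kBq/mL in the ROI, 350 MBq injected, 70 kg patient -> SUV 1.0
      print(f"SUV = {suv_bw(5.0, 350.0, 70.0):.2f}")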

  8. Landfill Gas Energy Benefits Calculator

    Science.gov (United States)

    This page contains the LFG Energy Benefits Calculator to estimate direct, avoided, and total greenhouse gas reductions, as well as environmental and energy benefits, for a landfill gas energy project.

  9. The employee motivation and benefits

    OpenAIRE

    Fuhrmannová, Petra

    2013-01-01

    The aim of this bachelor's thesis is to describe and analyze employee motivation and benefits in the payroll system and human resources field. The theoretical part addresses general terms such as employee motivation, theories of motivation, types of employee benefits, and the influence of benefits on employees' working performance. The practical part focuses on the Elanor company, including an introduction to the company, its history and present, and its offer of employee benefits. Ne...

  10. Methane fugitive emissions quantification using the novel 'plume camera' (spatial correlation) method

    Science.gov (United States)

    Crosson, E.; Rella, C.

    2012-12-01

    Fugitive emissions of methane into the atmosphere are a major concern facing the natural gas production industry. Given that the global warming potential of methane is many times greater than that of carbon dioxide, the importance of quantifying methane emissions is clear. The rapidly increasing reliance on shale gas (and other unconventional sources) is only intensifying the interest in fugitive methane releases. Natural gas (which is predominantly methane) is an attractive energy source, as it emits 40% less carbon dioxide per Joule of energy generated than coal. However, if just a small percentage of the natural gas consumed is lost to fugitive emissions during production, processing, or transport, this global warming benefit is lost (Howarth et al. 2012). It is therefore imperative, as production of natural gas increases, that fugitive emissions of methane be quantified accurately. Traditional direct measurement techniques often involve physical access to the leak itself and generally require painstaking effort, first to find the leak and then to quantify the emissions rate. With over half a million natural-gas-producing wells in the U.S. (U.S. Energy Information Administration), not counting the associated processing, storage, and transport facilities, and with each facility having hundreds or even thousands of fittings that can potentially leak, the need is clear for methodologies that can provide a rapid and accurate assessment of the total emissions rate on a per-wellhead basis. In this paper we present a novel method for emissions quantification which uses a 'plume camera' with three 'pixels' to quantify emissions from direct measurements of methane concentration in the downwind plume. By analyzing the spatial correlation between the pixels, the spatial extent of the instantaneous plume can be inferred. This information, when combined with the wind speed through the measurement plane, provides a direct…
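
    The paper's spatial-correlation algorithm is not reproduced here; the sketch below only shows the flux bookkeeping such a measurement feeds: integrate the concentration enhancement over a crosswind plane and multiply by wind speed (all values hypothetical):

      import numpy as np

      def plume_flux_g_per_s(conc_ppm, bg_ppm, u_wind_m_s, cell_area_m2,
                             mw_g_mol=16.04, molar_vol_m3=0.0224):
          # mass flux through a crosswind measurement plane:
          # conc_ppm is a 2D array of methane mole fractions on the plane,
          # bg_ppm the background to subtract
          enhancement = np.maximum(conc_ppm - bg_ppm, 0.0) * 1e-6   # mole fraction
          mol_per_m3 = enhancement / molar_vol_m3                   # mol CH4 per m3 of air
          return float(np.sum(mol_per_m3 * u_wind_m_s * cell_area_m2 * mw_g_mol))

      plane = np.full((20, 20), 1.9)      # ppm background everywhere...
      plane[8:12, 8:12] = 6.9             # ...plus a 5 ppm enhancement in the plume core
      print(f"{plume_flux_g_per_s(plane, 1.9, u_wind_m_s=3.0, cell_area_m2=0.25):.3f} g/s")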

  11. Corporate benefits of CSR activities

    OpenAIRE

    Maja Żychlewicz

    2014-01-01

    The main aim of the paper is to present the benefits that a company may derive from socially responsible activities. The paper lists various definitions of CSR that indicate the expected benefits stemming from its use. Both in theory and in practice, a need is observed for a strategic connection between the CSR concept and its real-life benefits.

  12. Corporate benefits of CSR activities

    Directory of Open Access Journals (Sweden)

    Maja Żychlewicz

    2014-11-01

    The main aim of the paper is to present the benefits that a company may derive from socially responsible activities. The paper lists various definitions of CSR that indicate the expected benefits stemming from its use. Both in theory and in practice, a need is observed for a strategic connection between the CSR concept and its real-life benefits.

  13. Cost benefit analysis vs. referenda

    OpenAIRE

    Martin J. Osborne; Matthew A. Turner

    2007-01-01

    We consider a planner who chooses between two possible public policies and ask whether a referendum or a cost-benefit analysis leads to higher welfare. We find that a referendum leads to higher welfare than a cost-benefit analysis in "common value" environments. Cost-benefit analysis is better in "private value" environments.

  14. Benefit-based tree valuation

    Science.gov (United States)

    E.G. McPherson

    2007-01-01

    Benefit-based tree valuation provides alternative estimates of the fair and reasonable value of trees while illustrating the relative contribution of different benefit types. This study compared estimates of tree value obtained using cost- and benefit-based approaches. The cost-based approach used the Council of Landscape and Tree Appraisers trunk formula method, and...

  15. Quantification of complex modular architecture in plants.

    Science.gov (United States)

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
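
    MorphoSnake extracts the skeleton and contours automatically; the sketch below only illustrates the underlying representation, a thallus as a rooted graph of nodes and edges from which segment lengths and branching angles are measured (coordinates hypothetical):

      import math

      # Thallus as a rooted graph: node -> (x, y); edges parent -> children.
      nodes = {"root": (0, 0), "a": (0, 5), "b": (-3, 9), "c": (3, 10)}
      edges = {"root": ["a"], "a": ["b", "c"]}

      def length(u, v):
          (x1, y1), (x2, y2) = nodes[u], nodes[v]
          return math.hypot(x2 - x1, y2 - y1)

      def branching_angle(parent, left, right):
          # angle between the two daughter segments at a bifurcation, in degrees
          px, py = nodes[parent]
          v1 = (nodes[left][0] - px, nodes[left][1] - py)
          v2 = (nodes[right][0] - px, nodes[right][1] - py)
          dot = v1[0] * v2[0] + v1[1] * v2[1]
          return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

      for parent, children in edges.items():
          for child in children:
              print(f"{parent}->{child}: length {length(parent, child):.2f}")
          if len(children) == 2:
              print(f"angle at {parent}: {branching_angle(parent, *children):.1f} deg")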

  16. Seed shape quantification in the order Cucurbitales

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2018-02-01

    Seed shape quantification for diverse species of the families belonging to the order Cucurbitales is carried out by comparing seed images with geometric figures. Quantification of seed shape is a useful tool in plant description, for phenotypic characterization and taxonomic analysis. The J index gives the percent similarity between the image of a seed and a geometric figure, and is useful in taxonomy for the study of relationships between plant groups. The geometric figures used as models in the Cucurbitales are the ovoid, two ellipses with different x/y ratios, and the outline of the Fibonacci spiral. The images of seeds have been compared with these figures and values of the J index obtained. The results obtained for 29 species in the family Cucurbitaceae support a relationship between seed shape and species ecology. Simple seed shapes, with images resembling simple geometric figures such as the ovoid, ellipse or Fibonacci spiral, may be a feature of the basal clades of taxonomic groups.
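
    The exact definition of the J index is not given in the record; the sketch below assumes a Jaccard-style percent overlap between the aligned seed silhouette and the geometric model:

      import numpy as np

      def j_index(seed_mask, model_mask):
          # percent similarity between a seed silhouette and a geometric model,
          # assuming both are pre-aligned boolean images: shared area over
          # combined area, times 100 (a Jaccard-style overlap)
          inter = np.logical_and(seed_mask, model_mask).sum()
          union = np.logical_or(seed_mask, model_mask).sum()
          return 100.0 * inter / union

      # toy example: a circle as "seed" vs. a slightly larger circle as "model"
      yy, xx = np.mgrid[:200, :200]
      seed = (xx - 100) ** 2 + (yy - 100) ** 2 < 60 ** 2
      model = (xx - 100) ** 2 + (yy - 100) ** 2 < 65 ** 2
      print(f"J = {j_index(seed, model):.1f}")   # ~85, i.e. (60/65)^2 * 100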

  17. Quantification of abdominal aortic deformation after EVAR

    Science.gov (United States)

    Demirci, Stefanie; Manstad-Hulaas, Frode; Navab, Nassir

    2009-02-01

    Quantification of abdominal aortic deformation is an important requirement for the evaluation of endovascular stenting procedures and the further refinement of stent graft design. During endovascular aortic repair (EVAR) treatment, the aortic shape is subject to severe deformation imposed by medical instruments such as guide wires, catheters, and the stent graft. This deformation can affect the flow characteristics and morphology of the aorta, which have been shown to be contributors to stent graft failure and a reason for the reappearance of aneurysms. We present a method for quantifying the deformation of an aneurysmatic aorta imposed by an inserted stent graft device. The outline of the procedure includes initial rigid alignment of the two abdominal scans, segmentation of abdominal vessel trees, and automatic reduction of their centerline structures to one specified region of interest around the aorta. This is accomplished by preprocessing and remodeling of the pre- and postoperative aortic shapes before performing a non-rigid registration. We further narrow the resulting displacement fields to include only local non-rigid deformation, thereby eliminating all remaining global rigid transformations. Finally, deformations for specified locations can be calculated from the resulting displacement fields. In order to evaluate our method, experiments for the extraction of aortic deformation fields were conducted on 15 patient datasets from endovascular aortic repair (EVAR) treatment. A visual assessment of the registration results and an evaluation of the usage of deformation quantification were performed by two vascular surgeons and one interventional radiologist who are all experts in EVAR procedures.
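
    The registration pipeline itself is not reproduced here; the sketch below only shows the final step, sampling displacement magnitudes from a dense non-rigid displacement field at specified anatomical locations (field and locations hypothetical):

      import numpy as np

      rng = np.random.default_rng(3)
      # dense displacement field from a non-rigid registration: (z, y, x, 3), in mm
      disp = rng.normal(0.0, 1.5, size=(40, 40, 40, 3))

      def deformation_at(disp_field, voxel_index):
          # displacement magnitude (mm) at one anatomical location of interest
          return float(np.linalg.norm(disp_field[voxel_index]))

      locations = {"proximal neck": (5, 20, 20), "aneurysm sac": (20, 18, 22)}
      for name, idx in locations.items():
          print(f"{name}: {deformation_at(disp, idx):.2f} mm")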

  18. Virus detection and quantification using electrical parameters

    Science.gov (United States)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for fast (within minutes), label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique, and the results obtained by both approaches agreed well. We further demonstrate that the electrical technique can be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.
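
    A literal transcription of the stated empirical relation, count ~ |change in dopant concentration / change in Debye volume| relative to mock; the numbers below are placeholders in arbitrary consistent units, not calibrated values:

      def virus_count(n_virus, n_mock, debye_vol_virus, debye_vol_mock):
          # count ~ |change in dopant concentration relative to mock| over
          # |change in Debye volume relative to mock|, per the abstract;
          # inputs must be in mutually consistent units
          return abs((n_virus - n_mock) / (debye_vol_virus - debye_vol_mock))

      # placeholder readings (arbitrary consistent units, not calibrated values)
      print(f"estimated count: {virus_count(1.2e6, 1.0e6, 2.0, 1.0):.2e}")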

  19. CT quantification of central airway in tracheobronchomalacia

    Energy Technology Data Exchange (ETDEWEB)

    Im, Won Hyeong; Jin, Gong Yong; Han, Young Min; Kim, Eun Young [Dept. of Radiology, Chonbuk National University Hospital, Jeonju (Korea, Republic of)

    2016-05-15

    To determine which factors help diagnose tracheobronchomalacia (TBM) using CT quantification of the central airway. From April 2013 to July 2014, 19 patients (68.0 ± 15.0 years; 6 male, 13 female) were diagnosed with TBM on CT. As matched controls, 38 normal subjects (65.5 ± 21.5 years; 6 male, 13 female) were selected. All 57 subjects underwent CT at end-inspiration and end-expiration. Airway parameters of the trachea and both main bronchi were assessed using software (VIDA diagnostic). Airway parameters of TBM patients and normal subjects were compared using the Student t-test. On expiration, both wall perimeter and wall thickness in TBM patients were significantly smaller than in normal subjects (wall perimeter: trachea, 43.97 mm vs. 49.04 mm, p = 0.020; right main bronchus, 33.52 mm vs. 42.69 mm, p < 0.001; left main bronchus, 26.76 mm vs. 31.88 mm, p = 0.012; wall thickness: trachea, 1.89 mm vs. 2.22 mm, p = 0.017; right main bronchus, 1.64 mm vs. 1.83 mm, p = 0.021; left main bronchus, 1.61 mm vs. 1.75 mm, p = 0.016). Wall thinning and a decreased perimeter of the central airway on expiration, as measured by CT quantification, could serve as new diagnostic indicators of TBM.
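
    A minimal sketch of the group comparison reported above, using Student's t-test on synthetic wall-perimeter samples (means taken from the record, spreads and sample draws assumed):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      # synthetic expiratory tracheal wall perimeters (mm); means from the record,
      # standard deviations assumed for illustration
      tbm = rng.normal(43.97, 4.0, size=19)
      normal = rng.normal(49.04, 4.0, size=38)

      t, p = stats.ttest_ind(tbm, normal, equal_var=True)   # Student's t-test
      print(f"t = {t:.2f}, p = {p:.4f}")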

  20. A methodology for social experimentation

    DEFF Research Database (Denmark)

    Ravn, Ib

    A methodology is outlined whereby one may improve the performance of a social system to the satisfaction of its stakeholders, that is, facilitate desirable social and organizational transformations.