WorldWideScience

Sample records for benefits quantification methodology

  1. Renewable Electricity Benefits Quantification Methodology: A Request for Technical Assistance from the California Public Utilities Commission

    Energy Technology Data Exchange (ETDEWEB)

    Mosey, G.; Vimmerstedt, L.

    2009-07-01

    The California Public Utilities Commission (CPUC) requested assistance in identifying methodological alternatives for quantifying the benefits of renewable electricity. The context is the CPUC's analysis of a 33% renewable portfolio standard (RPS) in California--one element of California's Climate Change Scoping Plan. The information would be used to support development of an analytic plan to augment the cost analysis of this RPS (which recently was completed). NREL has responded to this request by developing a high-level survey of renewable electricity effects, quantification alternatives, and considerations for selection of analytic methods. This report addresses economic effects and health and environmental effects, and provides an overview of related analytic tools. Economic effects include jobs, earnings, gross state product, and electricity rate and fuel price hedging. Health and environmental effects include air quality and related public-health effects, solid and hazardous wastes, and effects on water resources.

  2. Benefit quantification of interoperability in coordinate metrology

    DEFF Research Database (Denmark)

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

    ...these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software...

  3. A Micropillar Compression Methodology for Ductile Damage Quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies...

  4. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision-making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post-combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon...

  5. A Project-based Quantification of BIM Benefits

    Directory of Open Access Journals (Sweden)

    Jian Li

    2014-08-01

    In the construction industry, research is being carried out to look for feasible methods and technologies to cut down project costs and waste. Building Information Modelling (BIM) is currently a promising technology/method that can achieve this. The output of the construction industry has a considerable scale; however, the concentration of the industry and the level of informatization are still not high. There is still a large gap in terms of productivity between the construction industry and other industries. Due to the lack of first-hand data regarding how much of an effect can genuinely be had by BIM in real cases, it is unrealistic for construction stakeholders to take the risk of widely adopting BIM. This paper focuses on the methodological quantification (through a case study approach) of BIM’s benefits in building construction resource management and real-time cost control, in contrast to traditional non-BIM technologies. Through the use of BIM technology for the dynamic querying and statistical analysis of construction schedules, engineering, resources and costs, the three implementations considered demonstrate how BIM can facilitate the comprehensive grasp of a project’s implementation and progress, identify and solve the contradictions and conflicts between construction resources and cost controls, reduce project over-spends and protect the supply of resources.

  6. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    Science.gov (United States)

    Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.

    2011-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.

  7. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    Science.gov (United States)

    Lumpkins, Sarah B.; Garcia, Kathleen M.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Berggren, Michael D.; Ebert, Douglas

    2012-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in the eyes of several astronauts. This phenomenon is known to affect some terrestrial patient populations and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT) or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semiquantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology to monitor small changes in posterior globe flattening.

  8. A defined methodology for reliable quantification of Western blot data.

    Science.gov (United States)

    Taylor, Sean C; Berkelman, Thomas; Yadav, Geetha; Hammond, Matt

    2013-11-01

    Chemiluminescent western blotting has been in common practice for over three decades, but its use as a quantitative method for measuring the relative expression of the target proteins is still debatable. This is mainly due to the various steps, techniques, reagents, and detection methods that are used to obtain the associated data. In order to have confidence in densitometric data from western blots, researchers should be able to demonstrate statistically significant fold differences in protein expression. This entails a necessary evolution of the procedures, controls, and the analysis methods. We describe a methodology to obtain reliable quantitative data from chemiluminescent western blots using standardization procedures coupled with the updated reagents and detection methods.
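
    As a concrete illustration of the normalization step such a methodology relies on, here is a minimal sketch (not the authors' software; all band intensities and names are hypothetical) of computing a loading-normalized fold change from densitometry values:

        def normalized_fold_change(target_sample, loading_sample,
                                   target_control, loading_control):
            """Densitometric fold change: each target band is first divided by
            its own lane's loading-control (or total-protein) signal."""
            return (target_sample / loading_sample) / (target_control / loading_control)

        # Hypothetical band intensities exported from image-analysis software.
        print(round(normalized_fold_change(1.8e6, 2.1e6, 9.5e5, 2.0e6), 2))  # ~1.8-fold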

  9. Methodological strategies for transgene copy number quantification in goats (Capra hircus) using real-time PCR.

    Science.gov (United States)

    Batista, Ribrio I T P; Luciano, Maria C S; Teixeira, Dárcio I A; Freitas, Vicente J F; Melo, Luciana M; Andreeva, Lyudmila E; Serova, Irina A; Serov, Oleg L

    2014-01-01

    Taking into account the importance of goats as transgenic models, as well as the rarity of copy number (CN) studies in farm animals, the present work aimed to evaluate methodological strategies for accurate and precise transgene CN quantification in goats using quantitative polymerase chain reaction (qPCR). Mouse and goat lines transgenic for human granulocyte-colony stimulating factor were used. After selecting the best genomic DNA extraction method to be applied in mouse and goat samples, intra-assay variations, accuracy and precision of CN quantifications were assessed. The optimized conditions were submitted to mathematical strategies and used to quantify CN in goat lines. The findings were as follows: validation of qPCR conditions is required, and amplification efficiency is the most important. Absolute and relative quantifications are able to produce similar results. For normalized absolute quantification, the same plasmid fragment used to generate goat lines must be mixed with wild-type goat genomic DNA, allowing the choice of an endogenous reference gene for data normalization. For relative quantifications, a resin-based genomic DNA extraction method is strongly recommended when using mouse tail tips as calibrators to avoid tissue-specific inhibitors. Efficient qPCR amplifications (≥95%) allow reliable CN measurements with SYBR technology. TaqMan must be used with caution in goats if the nucleotide sequence of the endogenous reference gene is not yet well understood. Adhering to these general guidelines can result in more exact CN determination in goats. Even when working under nonoptimal circumstances, if assays are performed that respect the minimum qPCR requirements, good estimations of transgene CN can be achieved.
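
    The efficiency correction the authors stress as most important can be made concrete with a standard Pfaffl-style ratio; a minimal sketch under assumed, hypothetical Cq values and a single-copy calibrator (not the authors' code):

        def relative_copy_number(eff_target, cq_target_sample, cq_target_calib,
                                 eff_ref, cq_ref_sample, cq_ref_calib):
            """Efficiency-corrected relative quantification (Pfaffl method):
            transgene signal vs. a calibrator of known copy number, normalized
            to an endogenous reference gene. Efficiencies are fractions (0-1)."""
            target = (1.0 + eff_target) ** (cq_target_calib - cq_target_sample)
            ref = (1.0 + eff_ref) ** (cq_ref_calib - cq_ref_sample)
            return target / ref

        # Hypothetical values; with a 1-copy calibrator the ratio estimates CN.
        print(round(relative_copy_number(0.97, 21.4, 24.6, 0.98, 18.2, 18.3), 1))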

  10. Recommendations for benefit-risk assessment methodologies and visual representations

    DEFF Research Database (Denmark)

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul

    2016-01-01

    ...was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical ... or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data...

  11. Benefit Transfer: A Review of Methodologies and Challenges

    Directory of Open Access Journals (Sweden)

    John V. Westra

    2013-10-01

    For policy makers, regulators and natural resource managers, the resources necessary for original empirical resource valuations are often unavailable. A common alternative to original valuation studies is the practice of benefit transfer—the use of an empirical value estimate or estimates from a previous study or studies for application in a similar context. In order to reduce the error inherent in applying values from one parcel of land to another, researchers commonly use meta-analysis, or the “study of studies”, to provide a more thorough and statistically valid value estimate for use in a benefit transfer. In the practice of benefit transfer, much emphasis has been placed on improving the validity of values for transfer, but fewer studies have focused on the appropriate application of the established estimates. In this article, several often disregarded concerns that should be addressed when practicing benefit transfer are identified. A special focus is placed on spatial considerations and the recent progress that has been made to incorporate spatial trends. Geographic information systems (GIS) are advocated as a useful tool for incorporating the spatial aspects of benefit transfer. Consensuses and trends in the literature are acknowledged, and areas of potential improvement are highlighted.
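
    The core arithmetic of a unit value transfer is compact; a minimal sketch, assuming the standard income-elasticity adjustment (all figures hypothetical):

        def transfer_value(wtp_study, income_policy, income_study, elasticity=1.0):
            """Unit value transfer: WTP_policy = WTP_study * (Y_policy / Y_study)**e,
            where e is the income elasticity of willingness to pay."""
            return wtp_study * (income_policy / income_study) ** elasticity

        # E.g., $40/household at the study site, median incomes $55k vs. $48k, e = 0.7.
        print(round(transfer_value(40.0, 55_000, 48_000, elasticity=0.7), 2))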

  12. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Yao, Juan; He, Hua; Glantz, Clifford S.; Booth, Alexander E.

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  13. Methodology for nonlinear quantification of a flexible beam with a local, strong nonlinearity

    Science.gov (United States)

    Herrera, Christopher A.; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.

    2017-02-01

    This study presents a methodology for nonlinear quantification, i.e., the identification of the linear and nonlinear regimes and estimation of the degree of nonlinearity, for a cantilever beam with a local, strongly nonlinear stiffness element. The interesting feature of this system is that it behaves linearly in the limits of extreme values of the nonlinear stiffness. An Euler-Bernoulli cantilever beam with two nonlinear configurations is used to develop and demonstrate the methodology. One configuration considers a cubic spring attached at a distance from the beam root to achieve a smooth nonlinear effect. The other configuration considers a vibro-impact element that generates non-smooth effects. Both systems have the property that, in the limit of small and large values of a configuration parameter, the system is almost linear and can be modeled as such with negligible error. For the beam with a cubic spring attachment, the forcing amplitude is the varied parameter, while for the vibro-impact beam, this parameter is the clearance between the very stiff stops and the beam at static equilibrium. Proper orthogonal decomposition is employed to obtain an optimal orthogonal basis used to describe the nonlinear system dynamics for varying parameter values. The frequencies of the modes that compose the basis are then estimated using the Rayleigh quotient. The variations of these frequencies are studied to identify parameter values for which the system behaves approximately linearly and those for which the dynamical response is highly nonlinear. Moreover, a criterion based on the Betti-Maxwell reciprocity theorem is used to verify the existence of nonlinear behavior for the set of parameter values suggested by the described methodology. The developed methodology is general and applicable to discrete or continuous systems with smooth or nonsmooth nonlinearities.
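
    A minimal numerical sketch of the two ingredients named here, proper orthogonal decomposition and Rayleigh-quotient frequency estimation; the snapshot data and the beam operators below are placeholders, not the authors' model:

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.standard_normal((200, 50))          # hypothetical snapshots: 200 DOFs x 50 instants

        # POD modes via SVD of the mean-centred snapshot matrix.
        modes, sv, _ = np.linalg.svd(X - X.mean(axis=1, keepdims=True), full_matrices=False)

        K = np.diag(np.linspace(1.0, 4.0, 200))     # placeholder stiffness matrix
        M = np.eye(200)                             # placeholder mass matrix

        for i in range(3):
            phi = modes[:, i]
            omega2 = (phi @ K @ phi) / (phi @ M @ phi)   # Rayleigh quotient
            print(f"mode {i}: f = {np.sqrt(omega2) / (2 * np.pi):.3f} Hz")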

  14. Quantification of aortic annulus in computed tomography angiography: Validation of a fully automatic methodology.

    Science.gov (United States)

    Gao, Xinpei; Boccalini, Sara; Kitslaar, Pieter H; Budde, Ricardo P J; Attrach, Mohamed; Tu, Shengxian; de Graaf, Michiel A; Ondrus, Tomas; Penicka, Martin; Scholte, Arthur J H A; Lelieveldt, Boudewijn P F; Dijkstra, Jouke; Reiber, Johan H C

    2017-08-01

    Automatic accurate measuring of the aortic annulus and determination of the optimal angulation of X-ray projection are important for the trans-catheter aortic valve replacement (TAVR) procedure. The objective of this study was to present a novel fully automatic methodology for the quantification of the aortic annulus in computed tomography angiography (CTA) images. CTA datasets of 26 patients were analyzed retrospectively with the proposed methodology, which consists of a knowledge-based segmentation of the aortic root and detection of the orientation and size of the aortic annulus. The accuracy of the methodology was determined by comparing the automatically derived results with the reference standard obtained by semi-automatic delineation of the aortic root and manual definition of the annulus plane. The difference between the automatic annulus diameter and the reference standard by observer 1 was 0.2 ± 1.0 mm, with an inter-observer variability of 1.2 ± 0.6 mm. The Pearson correlation coefficient for the diameter was good (0.92 for observer 1). For the first time, a fully automatic tool to assess the optimal projection curves was presented and validated. The mean difference between the optimal projection curves calculated based on the automatically defined annulus plane and the reference standard was 6.4° in the cranial/caudal (CRA/CAU) direction. The mean computation time was short, at around 60 s per dataset. The new fully automatic and fast methodology described in this manuscript not only provided precise measurements about the aortic annulus size with results comparable to experienced observers, but also predicted optimal X-ray projection curves from CTA images. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Costs without benefits? Methodological issues in assessing costs, benefits and effectiveness of water protection policies. Paper

    Energy Technology Data Exchange (ETDEWEB)

    Walz, R.; Schleich, J.

    2000-07-01

    In the last few years, the conditions for extending environmental policy in general and policy dealing with the prevention of water pollution in particular have undergone extensive changes. On the one hand, there has been indisputable considerable success in preventing water pollution which has led to less direct pressure for policy action. On the other hand, the rising sewage levies and the lower political priority assigned in general to environmental policy documented in, e. g. public opinion surveys, has led to water pollution control policy facing very different pressures of justification: more efficient use of funds, improved planning processes, proof of the achievable benefit, but also stopping the increase in levies or not hindering economic development, these or similar slogans are the objections brought against water pollution control. Regardless of how unambiguous these terms appear when used as slogans in this way, they become diffuse and unclear if regarded more closely. This paper therefore attempts to reveal the reasons for possible misunderstandings and misinterpretations on the one hand and, on the other, to reveal the basic problems and uncertainties which are necessarily linked with an assessment of costs and benefits. In order to do this, three areas are examined: level of actors and analysis, evaluation methods and assessment of costs and benefits. (orig.)

  16. Cost-Benefit Analysis Methodology: Install Commercially Compliant Engines on National Security Exempted Vessels?

    Science.gov (United States)

    2015-11-05

    ...technologies follow: (1) selective catalytic reduction (SCR); (2) diesel particulate filter (DPF) – electrically regenerated active (ERADPF) ... insurmountable obstacles such as vessel range, engine room space, SLM, additional electric power, etc. Recommendations are developed on the basis of both ...

  17. Quantification of the least limiting water range in an oxisol using two methodological strategies

    Directory of Open Access Journals (Sweden)

    Wagner Henrique Moreira

    2014-12-01

    The least limiting water range (LLWR) has been used as an indicator of soil physical quality as it represents, in a single parameter, the soil physical properties directly linked to plant growth, with the exception of temperature. The usual procedure for obtaining the LLWR involves determination of the water retention curve (WRC) and the soil resistance to penetration curve (SRC) in soil samples with undisturbed structure in the laboratory. Determination of the WRC and SRC using field measurements (in situ) is preferable, but requires appropriate instrumentation. The objective of this study was to determine the LLWR from the data collected for determination of WRC and SRC in situ using portable electronic instruments, and to compare those determinations with the ones made in the laboratory. Samples were taken from the 0.0-0.1 m layer of a Latossolo Vermelho distrófico (Oxisol). Two methods were used for quantification of the LLWR: the traditional, with measurements made in soil samples with undisturbed structure; and in situ, with measurements of water content (θ), soil water potential (Ψ), and soil resistance to penetration (SR) through the use of sensors. The in situ measurements of θ, Ψ and SR were taken over a period of four days of soil drying. At the same time, samples with undisturbed structure were collected for determination of bulk density (BD). Due to the limitations of measurement of Ψ by tensiometer, additional determinations of θ were made with a psychrometer (in the laboratory) at the Ψ of -1500 kPa. The results show that it is possible to determine the LLWR by the θ, Ψ and SR measurements using the suggested approach and instrumentation. The quality of fit of the SRC was similar in both strategies. In contrast, the θ and Ψ in situ measurements, associated with those measured with a psychrometer, produced a better WRC description. The estimates of the LLWR were similar in both methodological strategies. The quantification of...
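
    The LLWR definition underlying both strategies can be stated compactly; a minimal sketch using the usual four critical water contents (the numbers are hypothetical, not the paper's data):

        def llwr(theta_fc, theta_wp, theta_afp, theta_pr):
            """Least limiting water range (m3/m3): the upper bound is the drier of
            field capacity and the 10% air-filled-porosity limit; the lower bound
            is the wetter of the wilting point and the penetration-resistance limit."""
            upper = min(theta_fc, theta_afp)
            lower = max(theta_wp, theta_pr)
            return max(upper - lower, 0.0)

        # Hypothetical critical water contents for an Oxisol sample.
        print(llwr(theta_fc=0.34, theta_wp=0.22, theta_afp=0.36, theta_pr=0.26))  # 0.08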

  18. A methodology for estimating health benefits of electricity generation using renewable technologies.

    Science.gov (United States)

    Partridge, Ian; Gamkhar, Shama

    2012-02-01

    At Copenhagen, the developed countries agreed to provide up to $100 bn per year to finance climate change mitigation and adaptation by developing countries. Projects aimed at cutting greenhouse gas (GHG) emissions will need to be evaluated against dual criteria: from the viewpoint of the developed countries they must cut emissions of GHGs at reasonable cost, while host countries will assess their contribution to development, or simply their overall economic benefits. Co-benefits of some types of project will also be of interest to host countries: for example some projects will contribute to reducing air pollution, thus improving the health of the local population. This paper uses a simple damage function methodology to quantify some of the health co-benefits of replacing coal-fired generation with wind or small hydro in China. We estimate the monetary value of these co-benefits and find that it is probably small compared to the added costs. We have not made a full cost-benefit analysis of renewable energy in China as some likely co-benefits are omitted from our calculations. Our results are subject to considerable uncertainty; however, after careful consideration of their likely accuracy and comparisons with other studies, we believe that they provide a good first-cut estimate of co-benefits and are sufficiently robust to stand as a guide for policy makers. In addition to these empirical results, a key contribution made by the paper is to demonstrate a simple and reasonably accurate methodology for health benefits estimation that applies the most recent academic research in the field to the solution of an increasingly important problem. Copyright © 2011 Elsevier Ltd. All rights reserved.
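
    The damage-function chain the authors apply can be sketched in a few lines; this is a simplified, single-pollutant illustration with hypothetical parameter values throughout, not the paper's calibrated model:

        import math

        def health_co_benefit(delta_conc_ugm3, beta, baseline_deaths_per_1000,
                              population, value_per_death):
            """Log-linear concentration-response: attributable fraction
            AF = 1 - exp(-beta * dC); benefit = avoided deaths * unit value."""
            af = 1.0 - math.exp(-beta * delta_conc_ugm3)
            avoided = af * baseline_deaths_per_1000 / 1000.0 * population
            return avoided, avoided * value_per_death

        # Hypothetical: 1.5 ug/m3 PM reduction over 5M people, beta = 0.0008.
        deaths, value = health_co_benefit(1.5, 0.0008, 7.0, 5_000_000, 150_000)
        print(f"{deaths:.0f} avoided premature deaths, ~${value / 1e6:.1f}M per year")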

  19. Enlightenment of ISO Methodology to Assess and Communicate the Economic Benefits of Consensus-Based Standards

    Institute of Scientific and Technical Information of China (English)

    Wang Zhongmin

    2011-01-01

    In March 2010, ISO released the Methodology to Assess and Communicate the Economic Benefits of Consensus-Based Standards, which is of great theoretical and practical significance. Standards and standardization are not simple administrative work, but a highly professional and technical business. The reality, however, is that standardization work worldwide has always lacked theoretical support, and the situation is even more pronounced in China.

  1. Methodology for Benefit Analysis of CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing) in USN Shipyards.

    Science.gov (United States)

    1984-03-01

    ...benefits of CAD/CAM and of the next generation technology, CIDER. The CADOS study (Ref. 13) offers a method to measure the intangibles of CAD/CAM ... methodology that measures both tangible and intangible benefits of present CAD technology. This method would be hard to extend to CIDER technology because of ...

  2. Socially responsible ethnobotanical surveys in the Cape Floristic Region: ethical principles, methodology and quantification of data

    Directory of Open Access Journals (Sweden)

    Ben-Erik Van Wyk

    2012-03-01

    A broad overview of published and unpublished ethnobotanical surveys in the Cape Floristic Region (the traditional home of the San and Khoi communities) shows that the data is incomplete. There is an urgent need to record the rich indigenous knowledge about plants in a systematic and socially responsible manner in order to preserve this cultural and scientific heritage for future generations. Improved methods for quantifying data are introduced, with special reference to the simplicity and benefits of the new Matrix Method. This methodology prevents or reduces the number of false negatives, and also ensures the participation of elderly people who might be immobile. It also makes it possible to compare plant uses in different local communities. This method enables the researcher to quantify the knowledge on plant use that was preserved in a community, and to determine the relative importance of a specific plant in a more objective way. Ethical considerations for such ethnobotanical surveys are discussed, through the lens of current ethical codes and international conventions. This is an accessible approach, which can also be used in the life sciences classroom.

  3. A methodology for assessing the market benefits of alternative motor fuels: The Alternative Fuels Trade Model

    Energy Technology Data Exchange (ETDEWEB)

    Leiby, P.N.

    1993-09-01

    This report describes a modeling methodology for examining the prospective economic benefits of displacing motor gasoline use by alternative fuels. The approach is based on the Alternative Fuels Trade Model (AFTM). AFTM development was undertaken by the US Department of Energy (DOE) as part of a longer term study of alternative fuels issues. The AFTM is intended to assist with evaluating how alternative fuels may be promoted effectively, and what the consequences of substantial alternative fuels use might be. Such an evaluation of policies and consequences of an alternative fuels program is being undertaken by DOE as required by Section 502(b) of the Energy Policy Act of 1992. Interest in alternative fuels is based on the prospective economic, environmental and energy security benefits from the substitution of these fuels for conventional transportation fuels. The transportation sector is heavily dependent on oil. Increased oil use implies increased petroleum imports, with much of the increase coming from OPEC countries. Conversely, displacement of gasoline has the potential to reduce US petroleum imports, thereby reducing reliance on OPEC oil and possibly weakening OPEC's ability to extract monopoly profits. The magnitude of US petroleum import reduction, the attendant fuel price changes, and the resulting US benefits, depend upon the nature of oil-gas substitution and the supply and demand behavior of other world regions. The methodology applies an integrated model of fuel market interactions to characterize these effects.

  4. Vehicle technologies heavy vehicle program : FY 2008 benefits analysis, methodology and results --- final report.

    Energy Technology Data Exchange (ETDEWEB)

    Singh, M.; Energy Systems; TA Engineering

    2008-02-29

    This report describes the approach to estimating the benefits and analysis results for the Heavy Vehicle Technologies activities of the Vehicle Technologies (VT) Program of EERE. The scope of the effort includes: (1) characterizing baseline and advanced technology vehicles for Class 3-6 and Class 7 and 8 trucks; (2) identifying technology goals associated with the DOE EERE programs; (3) estimating the market potential of technologies that improve fuel efficiency and/or use alternative fuels; and (4) determining the petroleum and greenhouse gas emissions reductions associated with the advanced technologies. In FY08 the Heavy Vehicles program continued to address a broad range of vehicle energy losses, rather than focusing narrowly on engine efficiency and alternative fuels. These changes are the result of a planning effort that first occurred during FY04 and was updated in the past year (Ref. 1). This narrative describes characteristics of the heavy truck market as they relate to the analysis, a description of the analysis methodology (including a discussion of the models used to estimate market potential and benefits), and a presentation of the benefits estimated as a result of the adoption of the advanced technologies. The market penetrations are used as part of the EERE-wide integrated analysis to provide final benefit estimates reported in the FY08 Budget Request. The energy savings models are utilized by the VT program for internal project management purposes.

  5. Quantification of phytochelatins and their metal(loid) complexes: critical assessment of current analytical methodology.

    Science.gov (United States)

    Wood, B Alan; Feldmann, Jörg

    2012-04-01

    Whilst there are a variety of methods available for the quantification of biothiols in sample extracts, each has its own inherent advantages and limitations. The ease with which thiols readily oxidise not only hinders their quantification but also alters the speciation profile. The challenge faced by the analyst is not only to preserve the speciation of the sample, but also to select a method which allows the retrieval of the desired information. Given that sulfur is not a chromophore and that it cannot easily be monitored by ICP-MS, a number of direct and indirect methods have been developed for this purpose. In order to assess these methods, they are compared in the context of the measurement of arsenic-phytochelatin complexes in plant extracts. The inherent instability of such complexes, along with the instabilities of reduced glutathione and phytochelatin species, necessitates a rapid and sensitive analytical protocol. Whilst being a specific example, the points raised and discussed in this review will also be applicable to the quantification of biothiols and thiol-metal(loid) species in a wide range of systems other than just the analysis of arsenic-phytochelatin species in plant extracts.

  6. A simple dilute and shoot methodology for the identification and quantification of illegal insulin

    Institute of Scientific and Technical Information of China (English)

    Celine Vanhee; Steven Janvier; Goedele Moens; Eric Deconinck; Patricia Courselle

    2016-01-01

    The occurrence of illegal medicines is a well-established global problem and concerns mostly small molecules. However, due to the advances in genomics and recombinant expression technologies there is an increased development of polypeptide therapeutics. Insulin is one of the best-known polypeptide drugs, and illegal versions of this medicine led to lethal incidents in the past. Therefore, it is crucial for the public health sector to develop reliable, efficient, cheap, unbiased and easily applicable active pharmaceutical ingredient (API) identification and quantification strategies for routine analysis of suspected illegal insulins. Here we demonstrate that our combined label-free full scan approach is not only able to distinguish between all those different versions of insulin and the insulins originating from different species, but also able to chromatographically separate human insulin and insulin lispro in conditions that are compatible with mass spectrometry (MS). Additionally, we were also able to selectively quantify the different insulins, including human insulin and insulin lispro, according to the validation criteria put forward by the United Nations (UN) for the analysis of seized illicit drugs. The proposed identification and quantification method is currently being used in our official medicines control laboratory to analyze insulins retrieved from the illegal market.

  7. Quantification of Benefits and Cost from Applying a Product Configuration System

    DEFF Research Database (Denmark)

    Kristjansdottir, Katrin; Shafiee, Sara; Hvam, Lars

    This article aims at analyzing the long-term benefits and the cost of developing, implementing and maintaining product configuration systems (PCSs). The results presented indicate that over a 5-year period a case company has achieved significant savings as a result of the reduced workload of generating the products’ specifications. In addition, the lead-time for generating products’ specifications has been reduced, and indications of improved quality of the products’ specifications and additional sales are identified. The research verifies the benefits described in the current literature and contributes by linking the benefits to the direct cost savings companies can expect from utilizing PCSs.

  8. Quantification of the effects on greenhouse gas emissions of policies and measures. Methodologies report

    Energy Technology Data Exchange (ETDEWEB)

    Forster, D.; Falconer, A. [AEA Technology, Didcot (United Kingdom); Buttazoni, M.; Greenleaf, J. [Ecofys, Utrecht (Netherlands); Eichhammer, W. [Fraunhofer Institut fuer System- und Innovationsforschung ISI, Karlsruhe (DE)] (and others)

    2009-12-15

    The primary aim of the report is to describe the methodologies that have been developed during the project to evaluate, ex-post, the impact of selected EU Climate Change Policies and Measures (PAMS) on greenhouse gas (GHG) emissions. The secondary aim of the document is to provide guidance to Member State (MS) representatives on ex-post evaluation, and to provide references and tools that facilitate the implementation of a consistent approach across the EU27 countries. The focus of the guidance is on approaches to evaluate the effectiveness of the policies and measures. Evaluating the efficiency of policies is another important component of policy evaluation, but is only considered to a limited extent within these guidelines. Section 2 discusses the broad methodological issues associated with ex-post evaluation, illustrating the main approaches available and their strengths and weaknesses. Section 3 describes the methodological framework proposed for the evaluation of EU Climate Change Policies, providing explanation of key decisions informing the approach and the actual guidelines for the policy evaluation of individual directives. Section 4 includes policy evaluation guidelines for a large number of different EU climate change policies. Section 5 includes concluding remarks on the role the guidelines could play in EU and MS climate change policy and on their possible future evolution. The Appendices comprise (1) a working paper on methodological issues related to the calculation of emission factors; (2) case study applications of a Tier 3 methodology.

  9. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    Science.gov (United States)

    Akram, Muhammad Farooq Bin

    The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is very critical. The uncertainty in defining the impact of an input on performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously on a system, often their cumulative impact is non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that in case of lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, when experts are forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced by using deterministic or traditional probabilistic approaches for...
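
    Dempster-Shafer evidence combination, one of the two techniques proposed, is easy to state concretely; a minimal sketch with two hypothetical expert opinions (not the dissertation's code):

        def dempster_combine(m1, m2):
            """Dempster's rule: combine two basic mass assignments given as
            {frozenset(hypotheses): mass}, renormalizing away the conflict."""
            combined, conflict = {}, 0.0
            for a, p in m1.items():
                for b, q in m2.items():
                    inter = a & b
                    if inter:
                        combined[inter] = combined.get(inter, 0.0) + p * q
                    else:
                        conflict += p * q
            return {k: v / (1.0 - conflict) for k, v in combined.items()}

        # Two hypothetical experts rating a technology's impact.
        LOW, HIGH = frozenset({"low impact"}), frozenset({"high impact"})
        expert1 = {LOW: 0.6, LOW | HIGH: 0.4}                 # partly uncommitted
        expert2 = {LOW: 0.3, HIGH: 0.3, LOW | HIGH: 0.4}
        print(dempster_combine(expert1, expert2))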

  10. Comparison of DNA quantification methodology used in the DNA extraction protocol for the UK Biobank cohort.

    Science.gov (United States)

    Welsh, Samantha; Peakman, Tim; Sheard, Simon; Almond, Rachael

    2017-01-05

    UK Biobank is a large prospective cohort study in the UK established by the Medical Research Council (MRC) and the Wellcome Trust to enable approved researchers to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. A wide range of phenotypic data has been collected at recruitment and has recently been enhanced by the UK Biobank Genotyping Project. All UK Biobank participants (500,000) have been genotyped on either the UK Biobank Axiom® Array or the Affymetrix UK BiLEVE Axiom® Array, and the workflow for preparing samples for genotyping is described. The genetic data is hoped to provide further insight into the genetics of disease. All data, including the genetic data, is available for access to approved researchers. Data for two methods of DNA quantification (ultraviolet-visible spectroscopy [UV/Vis] measured on the Trinean DropSense™ 96, and PicoGreen®) were compared by two laboratories (UK Biobank and Affymetrix). The sample processing workflow established at UK Biobank, for genotyping on the custom Affymetrix Axiom® array, resulted in high quality DNA (average DNA concentration 38.13 ng/μL, average 260/280 absorbance 1.91). The DNA generated high quality genotype data (average call rate 99.48% and pass rate 99.45%). The DNA concentration measured on the Trinean DropSense™ 96 at UK Biobank correlated well with DNA concentration measured by PicoGreen® at Affymetrix (r = 0.85). The UK Biobank Genotyping Project demonstrated that the high throughput DNA extraction protocol described generates high quality DNA suitable for genotyping on the Affymetrix Axiom array. The correlation between DNA concentrations derived from the UV/Vis and PicoGreen® quantification methods suggests that, in large-scale genetic studies involving two laboratories, it may be possible to remove the DNA quantification step in one laboratory without affecting downstream analyses. This would result in...

  11. On a PLIF quantification methodology in a nonlinear dye response regime

    Science.gov (United States)

    Baj, P.; Bruce, P. J. K.; Buxton, O. R. H.

    2016-06-01

    A new technique of planar laser-induced fluorescence calibration is presented in this work. It accounts for a nonlinear dye response at high concentrations, attenuation of the illumination light and, in particular, the influence of secondary fluorescence. An analytical approximation of a generic solution of the Beer-Lambert law is provided and utilized for effective concentration evaluation. These features make the technique particularly well suited for high concentration measurements, or those with a large range of concentration values, c, present (i.e. a high dynamic range of c). The method is applied to data gathered in a water flume experiment where a stream of a fluorescent dye (rhodamine 6G) was released into a grid-generated turbulent flow. Based on these results, it is shown that the illumination attenuation and the secondary fluorescence introduce a significant error into the data quantification (up to 15 and 80%, respectively, for the case considered in this work) unless properly accounted for.
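
    The starting point of such a calibration is the Beer-Lambert law along the illumination path; in generic form (notation ours, not the paper's), with the nonlinear dye response entering as a concentration-dependent yield:

        I(r) = I_0 \exp\!\left(-\int_0^r \varepsilon \, c(s) \, \mathrm{d}s\right),
        \qquad F(r) \propto I(r) \, \phi\!\left(c(r)\right) c(r),

    where I is the local illumination intensity, \varepsilon the absorption coefficient, c the dye concentration and \phi(c) the (nonlinear) fluorescence yield; the paper's contribution includes an analytical approximation for inverting this relation to recover c.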

  12. Efficient uncertainty quantification methodologies for high-dimensional climate land models

    Energy Technology Data Exchange (ETDEWEB)

    Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Berry, Robert Dan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ray, Jaideep [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Debusschere, Bert J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2011-11-01

    In this report, we proposed, examined and implemented approaches for performing efficient uncertainty quantification (UQ) in climate land models. Specifically, we applied a Bayesian compressive sensing framework to polynomial chaos spectral expansions, enhanced it with an iterative algorithm of basis reduction, and investigated the results on test models as well as on the Community Land Model (CLM). Furthermore, we discussed construction of efficient quadrature rules for forward propagation of uncertainties from a high-dimensional, constrained input space to output quantities of interest. The work lays grounds for efficient forward UQ for high-dimensional, strongly non-linear and computationally costly climate models. Moreover, to investigate parameter inference approaches, we have applied two variants of the Markov chain Monte Carlo (MCMC) method to a soil moisture dynamics submodel of the CLM. The evaluation of these algorithms gave us a good foundation for further building out the Bayesian calibration framework towards the goal of robust component-wise calibration.

  13. Absolute quantification of olive oil DNA by droplet digital-PCR (ddPCR): Comparison of isolation and amplification methodologies.

    Science.gov (United States)

    Scollo, Francesco; Egea, Leticia A; Gentile, Alessandra; La Malfa, Stefano; Dorado, Gabriel; Hernandez, Pilar

    2016-12-15

    Olive oil is considered a premium product for its nutritional value and health benefits, and the ability to define its origin and varietal composition is a key step towards ensuring the traceability of the product. However, isolating the DNA from such a matrix is a difficult task. In this study, the quality and quantity of olive oil DNA, isolated using four different DNA isolation protocols, was evaluated using the qRT-PCR and ddPCR techniques. The results indicate that CTAB-based extraction methods were the best for unfiltered oil, while Nucleo Spin-based extraction protocols showed greater overall reproducibility. The use of both qRT-PCR and ddPCR led to the absolute quantification of the DNA copy number. The results clearly demonstrate the importance of the choice of DNA-isolation protocol, which should take into consideration the qualitative aspects of DNA and the evaluation of the amplified DNA copy number.
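
    The "absolute" in absolute ddPCR quantification comes from Poisson statistics over droplet occupancy; a minimal sketch (the droplet volume is a typical figure, not taken from this study, and the counts are hypothetical):

        import math

        def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85, dilution=1.0):
            """Digital PCR Poisson correction: lambda = -ln(1 - p) copies per
            droplet, converted to copies per microliter of reaction."""
            lam = -math.log(1.0 - positive / total)
            return lam / (droplet_volume_nl * 1e-3) * dilution

        # Hypothetical run: 4,200 positive droplets out of 15,000 accepted.
        print(f"{ddpcr_copies_per_ul(4_200, 15_000):.0f} copies/uL")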

  14. A methodology for the quantification of doctrine and materiel approaches in a capability-based assessment

    Science.gov (United States)

    Tangen, Steven Anthony

    Due to the complexities of modern military operations and the technologies employed on today's military systems, acquisition costs and development times are becoming increasingly large. Meanwhile, the transformation of the global security environment is driving the U.S. military's own transformation. In order to meet the required capabilities of the next generation without buying prohibitively costly new systems, it is necessary for the military to evolve across the spectrum of doctrine, organization, training, materiel, leadership and education, personnel, and facilities (DOTMLPF). However, the methods for analyzing DOTMLPF approaches within the early acquisition phase of a capability-based assessment (CBA) are not as well established as the traditional technology design techniques. This makes it difficult for decision makers to decide if investments should be made in materiel or non-materiel solutions. This research develops an agent-based constructive simulation to quantitatively assess doctrine alongside materiel approaches. Additionally, life-cycle cost techniques are provided to enable a cost-effectiveness trade. These techniques are wrapped together in a decision-making environment that brings crucial information forward so informed and appropriate acquisition choices can be made. The methodology is tested on a future unmanned aerial vehicle design problem. Through the implementation of this quantitative methodology on the proof-of-concept study, it is shown that doctrinal changes including fleet composition, asset allocation, and patrol pattern were capable of dramatic improvements in system effectiveness at a much lower cost than the incorporation of candidate technologies. Additionally, this methodology was able to quantify the precise nature of strong doctrine-doctrine and doctrine-technology interactions which have been observed only qualitatively throughout military history. This dissertation outlines the methodology and demonstrates how potential...

  15. Advancing health-related cluster analysis methodology: quantification of pairwise activity cluster similarities.

    Science.gov (United States)

    Ferrar, Katia; Maher, Carol; Petkov, John; Olds, Tim

    2015-03-01

    To date, most health-related time-use research has investigated behaviors in isolation; more recently, however, researchers have begun to conceptualize behaviors in the form of multidimensional patterns or clusters. The study employed 2 techniques: radar graphs and centroid vector length, angles and distance to quantify pairwise time-use cluster similarities among adolescents living in Australia (N = 1853) and in New Zealand (N = 679). Based on radar graph shape, 2 pairs of clusters were similar for both boys and girls. Using vector angles (VA), vector length (VL) and centroid distances (CD), 1 pair for each sex was considered most similar (boys: VA = 63°, VL = 44 and 50 units, and CD = 48 units; girls: VA = 23°, VL = 65 and 85 units, and CD = 36 units). Both methods employed to determine similarity had strengths and weaknesses. The description and quantification of cluster similarity is an important step in the research process. An ability to track and compare clusters may provide greater understanding of complex multidimensional relationships, and in relation to health behavior clusters, present opportunities to monitor and to intervene.
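
    The three pairwise metrics used are simple vector operations on cluster centroids; a minimal sketch with hypothetical time-use centroids (not the study's data):

        import numpy as np

        def cluster_similarity(c1, c2):
            """Return the angle between centroid vectors (degrees), their
            lengths and the Euclidean distance between them."""
            v1, v2 = np.asarray(c1, float), np.asarray(c2, float)
            cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
            angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
            return angle, np.linalg.norm(v1), np.linalg.norm(v2), np.linalg.norm(v1 - v2)

        # Hypothetical centroids, minutes/day of (screen, sport, sleep, study).
        print(cluster_similarity([120, 30, 540, 60], [90, 75, 520, 90]))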

  16. Methodology for quantification of waste generated in Spanish railway construction works.

    Science.gov (United States)

    de Guzmán Báez, Ana; Villoria Sáez, Paola; del Río Merino, Mercedes; García Navarro, Justo

    2012-05-01

    In the last years, the European Union (EU) has been focused on the reduction of construction and demolition (C&D) waste. Specifically, in 2006, Spain generated roughly 47 million tons of C&D waste, of which only 13.6% was recycled. This situation has led to the drawing up of many regulations on C&D waste during the past years, forcing EU countries to include new measures for waste prevention and recycling. Among these measures is the mandatory obligation to quantify the C&D waste expected to be generated during a construction project. However, limited data is available on civil engineering projects. Therefore, the aim of this research study is to improve C&D waste management in railway projects, by developing a model for C&D waste quantification. For this purpose, we develop two equations which estimate in advance the amount, both in weight and volume, of the C&D waste likely to be generated in railway construction projects, including the category of C&D waste generated for the entire project.
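
    The published model reduces to per-kilometre estimators; a minimal sketch of the idea only, since the fitted Spanish coefficients are not reproduced here (both coefficients below are placeholders):

        def cd_waste_estimate(track_km, weight_coeff_t_per_km, volume_coeff_m3_per_km):
            """Estimate expected C&D waste for a railway project as a linear
            function of track length, in weight (t) and volume (m3)."""
            return track_km * weight_coeff_t_per_km, track_km * volume_coeff_m3_per_km

        tons, m3 = cd_waste_estimate(12.5, 420.0, 310.0)  # hypothetical coefficients
        print(f"expected C&D waste: {tons:.0f} t, {m3:.0f} m3")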

  17. Quantification of rainfall prediction uncertainties using a cross-validation based technique. Methodology description and experimental validation.

    Science.gov (United States)

    Fraga, Ignacio; Cea, Luis; Puertas, Jerónimo; Salsón, Santiago; Petazzi, Alberto

    2016-04-01

    In this paper we present a new methodology to compute rainfall fields, including the quantification of prediction uncertainties, using raingauge network data. The proposed methodology comprises two steps. Firstly, the ordinary kriging technique is used to determine the estimated rainfall depth in every point of the study area. Then multiple equi-probable error fields, which comprise both interpolation and measuring uncertainties, are added to the kriged field, resulting in multiple rainfall predictions. To compute these error fields, first the standard deviation of the kriging estimation is determined following the cross-validation based procedure described in Delrieu et al. (2014). Then, the standard deviation field is sampled using non-conditioned Gaussian random fields. The proposed methodology was applied to study 7 rain events in a 60x60 km area of the west coast of Galicia, in the Northwest of Spain. Due to its location at the junction between tropical and polar regions, the study area suffers from frequent intense rainfalls, characterized by a great variability in terms of both space and time. Rainfall data from the tipping bucket raingauge network operated by MeteoGalicia were used to estimate the rainfall fields using the proposed methodology. The obtained predictions were then validated using rainfall data from 3 additional rain gauges installed within the CAPRI project (Probabilistic flood prediction with high resolution hydrologic models from radar rainfall estimates, funded by the Spanish Ministry of Economy and Competitiveness, Reference CGL2013-46245-R). Results show that both the mean hyetographs and the peak intensities are correctly predicted. The computed hyetographs present a good fit to the experimental data and most of the measured values fall within the 95% confidence intervals. Also, most of the experimental values outside the confidence bounds correspond to time periods of low rainfall depths, where the inaccuracy of the measuring devices...
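
    The ensemble step, adding non-conditioned Gaussian error fields scaled by the cross-validation standard deviation, can be sketched as follows (a 1-D grid with placeholder fields and an assumed exponential correlation model, not the paper's setup):

        import numpy as np

        rng = np.random.default_rng(2)
        n = 50
        x = np.linspace(0.0, 60.0, n)                 # 1-D transect, km
        kriged = 10.0 + 4.0 * np.sin(x / 10.0)        # placeholder kriged rainfall (mm)
        sigma = 1.5 + 0.5 * np.cos(x / 15.0)          # placeholder CV-based std (mm)

        # Non-conditioned Gaussian random fields with exponential correlation.
        corr = np.exp(-np.abs(x[:, None] - x[None, :]) / 12.0)   # 12 km length (assumed)
        L = np.linalg.cholesky(corr + 1e-10 * np.eye(n))
        ensemble = kriged + sigma * (L @ rng.standard_normal((n, 200))).T  # 200 fields

        print(ensemble.mean(axis=0)[:3], ensemble.std(axis=0)[:3])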

  18. Methodological issues in the quantification of subgingival microorganisms using the checkerboard technique.

    Science.gov (United States)

    Dahlen, G; Preus, H R; Baelum, V

    2015-03-01

    The reproducibility and reliability of quantitative microbiological assessments using the DNA-DNA hybridization "checkerboard method" (CKB) were assessed. The data originated from 180 chronic periodontitis patients, who were enrolled in a clinical trial and sampled at baseline and at 3 and 12 months post-therapy. The samples were divided into two portions, allowing evaluation of reproducibility. In total, 531 samples were analyzed in a first run, using standard bacterial preparations of cells, and 513 samples were accessible for analysis in the second, using standards based on purified DNA from the species. The microbial probe panel consisted of periodontitis marker bacteria as well as non-oral microorganisms. Three different ways of quantifying and presenting data were compared: the visual scoring method (VSM), the standard curve method (SCM) and the percent method (PM). The second set of analyses, based on the use of standard preparations of pure DNA, was shown to be more consistent than the first set using standards based on cells, while the effect of storage time per se up to 2.5 years seemed to be marginal. The best reproducibility was found for Tannerella forsythia, irrespective of quantification technique (Spearman's rho = 0.587, Pearson's r ≥ 0.540). The percent method (PM), based on percent of the High Standard (10^6 cells), was more reliable than the SCM, based on a linear calibration of the High Standard and a Low Standard (10^5 cells). It was concluded that the reproducibility of the CKB method varied between different bacteria. High quality and pure specific DNA whole genomic probes and standards may have a stronger impact on the precision of the data than storage time and conditions. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. A generic model-based methodology for quantification of mass transfer limitations in microreactors

    DEFF Research Database (Denmark)

    Van Daele, Timothy; Fernandes del Pozo, David; Van Hauwermeiren, Daan

    2016-01-01

    Microreactors are becoming more popular in the biocatalytic field to speed up reactions and thus achieve process intensification. However, even these small-scale reactors can suffer from mass transfer limitations. Traditionally, dimensionless numbers such as the second Damköhler number are used to determine whether the reaction is either kinetically or mass transfer limited. However, these dimensionless numbers only give a qualitative measure of the extent of the mass transfer limitation, and are only applicable to simple reactor configurations. In practice, this makes it difficult to rapidly quantify the importance of such mass transfer limitations and compare different reactor configurations. This paper presents a novel generic methodology to quantify mass transfer limitations. It was applied to two microreactor configurations: a microreactor with immobilised enzyme at the wall and a Y...
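    As a minimal illustration of the qualitative check mentioned above, the sketch below evaluates a second Damköhler number as the ratio of a maximum volumetric reaction rate to a maximum film mass transfer rate. The functional form and every parameter value are assumptions chosen for illustration; they are not the paper's generic model-based methodology.

        def second_damkohler(r_max, k_l, a, c_bulk):
            """Da_II = maximum reaction rate / maximum mass transfer rate.

            r_max  : volumetric reaction rate at bulk conditions [mol m-3 s-1]
            k_l    : liquid-side mass transfer coefficient [m s-1]
            a      : specific interfacial area [m2 m-3]
            c_bulk : bulk substrate concentration [mol m-3]
            """
            return r_max / (k_l * a * c_bulk)

        da = second_damkohler(r_max=0.5, k_l=1e-4, a=4000.0, c_bulk=2.0)
        # Da_II >> 1 suggests mass transfer limitation, Da_II << 1 a kinetic
        # limitation; values near 1 are where this test is least informative.
        print(f"Da_II = {da:.2f}")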

  20. Campylobacter inoculation and quantification from broiler cecal samples to compare two plate counting methodologies

    Directory of Open Access Journals (Sweden)

    Anderlise Borsoi

    2015-02-01

    Full Text Available Campylobacteriosis is a zoonosis, a disease transmitted to humans from animals or animal products. The primary source of Campylobacter infection in humans is believed to be the handling and/or consumption of contaminated meat, especially poultry meat. Although in humans such infections are generally self-limiting, complications can arise and may include bacteraemia, Guillain-Barré syndrome, reactive arthritis and abortion. In this study, 32 birds were divided into 2 groups: a control (C) group and an inoculated (I) group, with 16 birds each. The I group was inoculated orally with 10^8 CFU/mL of Campylobacter jejuni ATCC 33291, whereas the C group was inoculated with a saline solution. Four chicks per group were euthanized by cervical dislocation at 0, 7, 14 and 21 days post-inoculation (pi). Cecum samples were collected for microbiological analyses. The samples were processed by two plate count methodologies: one developed by the USDA (United States Department of Agriculture) in 2011 (B method) and a serial dilution direct count method (A method). All birds from the C group remained negative until day 21. For the I group, the B method was found to be statistically superior to the A method for counting the recovered cells from the cecal contents at 7, 14 and 21 days pi. The microbiological direct plating count method is a cost-effective and rapid method to determine the level of contamination in broilers, supporting risk assessment programs at the industry level.

  1. A High-Performance Embedded Hybrid Methodology for Uncertainty Quantification With Applications

    Energy Technology Data Exchange (ETDEWEB)

    Iaccarino, Gianluca

    2014-04-01

    Multiphysics processes modeled by a system of unsteady differential equations are naturally suited for partitioned (modular) solution strategies. We consider such a model where probabilistic uncertainties are present in each module of the system and represented as a set of random input parameters. A straightforward approach to quantifying uncertainties in the predicted solution would be to sample all the input parameters into a single set, and treat the full system as a black box. Although this method is easily parallelizable and requires minimal modifications to the deterministic solver, it is blind to the modular structure of the underlying multiphysical model. On the other hand, spectral representations such as polynomial chaos expansions (PCE) can provide richer structural information regarding the dynamics of these uncertainties as they propagate from the inputs to the predicted output, but can be prohibitively expensive to implement in the high-dimensional global space of uncertain parameters. Therefore, we investigated hybrid methodologies wherein each module has the flexibility of using sampling- or PCE-based methods for capturing local uncertainties while maintaining accuracy in the global uncertainty analysis. For the latter case, we use a conditional PCE model which mitigates the curse of dimensionality associated with intrusive Galerkin or semi-intrusive pseudospectral methods. After formalizing the theoretical framework, we demonstrate our proposed method using a numerical viscous flow simulation and benchmark the performance against a solely Monte Carlo method and a solely spectral method.
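    The contrast between the two building blocks of such a hybrid scheme can be shown on a scalar toy problem: the sketch below estimates the mean and variance of a model with one standard normal input, first by Monte Carlo sampling and then by a pseudospectral PCE in the probabilists' Hermite basis. The model function and truncation order are arbitrary illustrative choices, not taken from the report.

        import numpy as np
        from math import factorial
        from numpy.polynomial.hermite_e import hermegauss, hermeval

        def f(x):  # placeholder model with one uncertain input x ~ N(0, 1)
            return np.exp(0.3 * x) + 0.5 * x**2

        # Monte Carlo reference
        rng = np.random.default_rng(0)
        y = f(rng.standard_normal(200_000))
        mc_mean, mc_var = y.mean(), y.var()

        # Pseudospectral PCE: coefficients c_k = E[f(x) He_k(x)] / k!
        order = 6
        nodes, w = hermegauss(order + 1)
        w = w / np.sqrt(2.0 * np.pi)  # normalize weights to the N(0,1) density
        c = [np.sum(w * f(nodes) * hermeval(nodes, [0.0] * k + [1.0])) / factorial(k)
             for k in range(order + 1)]
        pce_mean = c[0]
        pce_var = sum(factorial(k) * c[k]**2 for k in range(1, order + 1))
        print(mc_mean, pce_mean, mc_var, pce_var)  # the two pairs should agree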

  2. Quantification of the Service Life Extension and Environmental Benefit of Chloride Exposed Self-Healing Concrete.

    Science.gov (United States)

    Van Belleghem, Bjorn; Van den Heede, Philip; Van Tittelboom, Kim; De Belie, Nele

    2016-12-23

    Formation of cracks impairs the durability of concrete elements. Corrosion inducing substances, such as chlorides, can enter the matrix through these cracks and cause steel reinforcement corrosion and concrete degradation. Self-repair of concrete cracks is an innovative technique which has been studied extensively during the past decade and which may help to increase the sustainability of concrete. However, the experiments conducted until now did not allow for an assessment of the service life extension possible with self-healing concrete in comparison with traditional (cracked) concrete. In this research, a service life prediction of self-healing concrete was done based on input from chloride diffusion tests. Self-healing of cracks with encapsulated polyurethane precursor formed a partial barrier against immediate ingress of chlorides through the cracks. Application of self-healing concrete was able to reduce the chloride concentration in a cracked zone by 75% or more. As a result, service life of steel reinforced self-healing concrete slabs in marine environments could amount to 60-94 years as opposed to only seven years for ordinary (cracked) concrete. Subsequent life cycle assessment calculations indicated important environmental benefits (56%-75%) for the ten CML-IA (Center of Environmental Science of Leiden University-Impact Assessment) baseline impact indicators which are mainly induced by the achievable service life extension.

  4. Deliverable 2. Methodology framework, Update. COBRA Cooperative Benefits for Road Authorities

    NARCIS (Netherlands)

    Faber, F.; Noort, M. van; Hopkin, J.; Ball, S.; Vermaat, P.; Nitsche, P.; Deix, S.

    2013-01-01

    The COBRA project aims to help road authorities to position themselves to optimally benefit from changes in the field of cooperative systems. It does so by providing insight on the costs and benefits of investments, both from a societal perspective and a business case perspective. These insights

  5. Development of the cost-benefit analysis methodology for FR cycle research and development

    Energy Technology Data Exchange (ETDEWEB)

    Shiotani, Hiroki; Shinoda, Yoshihiko; Ono, Kiyoshi [Japan Nuclear Cycle Development Inst., Oarai, Ibaraki (Japan). Oarai Engineering Center; Yasumatsu, Naoto [Nuclear Energy System Inc., Tokyo (Japan)

    2002-09-01

    A cost-benefit analysis system for Fast Reactor (FR) cycle research and development (R and D) has been developed. The benefits derived from FR cycle R and D, which are compared with the cost of the FR cycle R and D in the system, are environmental burden reduction, risk reduction, contribution to energy security and resource import reduction, as well as power generation cost reduction. Cost-benefit analyses for a typical FR cycle R and D program and the sensitivity analyses for some parameters have confirmed the validity of the system and the effects of the parameters. Different cost-benefit ratios were obtained from the analyses of the R and Ds for six FR cycle concepts in the first phase of the Feasibility Studies on commercialized FR cycle systems. Those analyses showed that a several-fold benefit will be derived from FR cycle R and D investment, but uncertainty remains in the parameters describing future society. (author)
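    The core arithmetic behind such a system is a discounted benefit-cost ratio. A minimal sketch follows, with an entirely hypothetical R and D cost stream and a benefit stream standing in for the monetized environmental, risk, energy-security and generation-cost effects.

        import numpy as np

        def benefit_cost_ratio(costs, benefits, rate=0.03):
            """Discounted B/C ratio; costs and benefits are annual streams."""
            t = np.arange(len(costs))
            disc = (1.0 + rate) ** -t
            return np.sum(np.asarray(benefits) * disc) / np.sum(np.asarray(costs) * disc)

        # 10 years of R and D spending followed by 20 years of accruing benefits
        costs = [100.0] * 10 + [0.0] * 20      # illustrative monetary units
        benefits = [0.0] * 10 + [60.0] * 20
        print(f"B/C = {benefit_cost_ratio(costs, benefits):.2f}")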

  6. Methodological approach to determine minor, considerable, and major treatment effects in the early benefit assessment of new drugs.

    Science.gov (United States)

    Skipka, Guido; Wieseler, Beate; Kaiser, Thomas; Thomas, Stefanie; Bender, Ralf; Windeler, Jürgen; Lange, Stefan

    2016-01-01

    At the beginning of 2011, the early benefit assessment of new drugs was introduced in Germany with the Act on the Reform of the Market for Medicinal Products (AMNOG). The Federal Joint Committee (G-BA) generally commissions the Institute for Quality and Efficiency in Health Care (IQWiG) with this type of assessment, which examines whether a new drug shows an added benefit (a positive patient-relevant treatment effect) over the current standard therapy. IQWiG is required to assess the extent of added benefit on the basis of a dossier submitted by the pharmaceutical company responsible. In this context, IQWiG was faced with the task of developing a transparent and plausible approach for operationalizing how to determine the extent of added benefit. In the case of an added benefit, the law specifies three main extent categories (minor, considerable, major). To restrict value judgements to a minimum in the first stage of the assessment process, an explicit and abstract operationalization was needed. The present paper is limited to the situation of binary data (analysis of 2 × 2 tables), using the relative risk as an effect measure. For the treatment effect to be classified as a minor, considerable, or major added benefit, the methodological approach stipulates that the (two-sided) 95% confidence interval of the effect must exceed a specified distance to the zero effect. In summary, we assume that our approach provides a robust, transparent, and thus predictable foundation to determine minor, considerable, and major treatment effects on binary outcomes in the early benefit assessment of new drugs in Germany. After a decision on the added benefit of a new drug by G-BA, the classification of added benefit is used to inform pricing negotiations between the umbrella organization of statutory health insurance and the pharmaceutical companies.
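    A sketch of this operationalization is given below: the two-sided 95% confidence interval of the relative risk is computed from a 2 x 2 table and its upper limit compared against category thresholds. The threshold values used here are placeholders for illustration only; the actual thresholds depend on the outcome category and are defined in the paper.

        from math import exp, log, sqrt

        def rr_confidence_interval(a, n1, c, n2, z=1.96):
            """95% CI of the relative risk; a/n1 events in the treatment arm,
            c/n2 events in the control arm."""
            rr = (a / n1) / (c / n2)
            se = sqrt(1/a - 1/n1 + 1/c - 1/n2)  # standard error of log(RR)
            return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

        def classify_added_benefit(ci_upper, thresholds=(0.95, 0.85, 0.75)):
            """The CI must keep a minimum distance to the zero effect RR = 1;
            threshold values here are purely illustrative."""
            minor, considerable, major = thresholds
            if ci_upper < major:
                return "major"
            if ci_upper < considerable:
                return "considerable"
            if ci_upper < minor:
                return "minor"
            return "added benefit not proven"

        rr, lo, hi = rr_confidence_interval(a=30, n1=500, c=60, n2=500)
        print(rr, (lo, hi), classify_added_benefit(hi))  # RR 0.50 -> "considerable"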

  7. Validation and evaluation of an HPLC methodology for the quantification of the potent antimitotic compound (+)-discodermolide in the Caribbean marine sponge Discodermia dissoluta.

    Science.gov (United States)

    Valderrama, Katherine; Castellanos, Leonardo; Zea, Sven

    2010-08-01

    The sponge Discodermia dissoluta is the source of the potent antimitotic compound (+)-discodermolide. The relatively abundant and shallow populations of this sponge in Santa Marta, Colombia, allow for studies to evaluate the natural and biotechnological supply options for (+)-discodermolide. In this work, an RP-HPLC-UV methodology for the quantification of (+)-discodermolide in sponge samples was tested and validated. Our protocol for extracting this compound from the sponge included lyophilization, exhaustive methanol extraction, partitioning using water and dichloromethane, purification of the organic fraction on RP-18 cartridges and finally retrieving the (+)-discodermolide in the methanol-water (80:20 v/v) fraction. This fraction was injected into an HPLC system with an Xterra RP-18 column and a detection wavelength of 235 nm. The calibration curve was linear, making it possible to calculate the limits of detection and quantification in these experiments. The intra-day and inter-day precision showed relative standard deviations lower than 5%. The accuracy, determined as the percentage recovery, was 99.4%. Nine samples of the sponge from the Bahamas, Bonaire, Curaçao and Santa Marta had concentrations of (+)-discodermolide ranging from 5.3 to 29.3 μg/g of wet sponge. This methodology is quick and simple, allowing for quantification in sponges from natural environments, in situ cultures or dissociated cells.

  8. FreedomCAR and vehicle technologies heavy vehicle program FY 2006. Benefits analysis : methodology and results - final report.

    Energy Technology Data Exchange (ETDEWEB)

    Singh, M.; Energy Systems; TA Engineering, Inc.

    2006-01-31

    This report describes the approach to estimating benefits and the analysis results for the Heavy Vehicle Technologies activities of the FreedomCAR and Vehicle Technologies (FCVT) Program of EERE. The scope of the effort includes: (1) Characterizing baseline and advanced technology vehicles for Class 3-6 and Class 7 and 8 trucks, (2) Identification of technology goals associated with the DOE EERE programs, (3) Estimating the market potential of technologies that improve fuel efficiency and/or use alternative fuels, (4) Determining the petroleum and greenhouse gas emissions reductions associated with the advanced technologies. In FY 05 the Heavy Vehicles program activity expanded its technical involvement to more broadly address various sources of energy loss as compared to focusing more narrowly on engine efficiency and alternative fuels. This broadening of focus has continued in the activities planned for FY 06. These changes are the result of a planning effort that occurred during FY 04 and 05. (Ref. 1) This narrative describes characteristics of the heavy truck market as they relate to the analysis, a description of the analysis methodology (including a discussion of the models used to estimate market potential and benefits), and a presentation of the benefits estimated as a result of the adoption of the advanced technologies. These benefits estimates, along with market penetrations and other results, are then modeled as part of the EERE-wide integrated analysis to provide final benefit estimates reported in the FY06 Budget Request.

  9. METHODOLOGICAL APPROACHES IN REALIZING AND APPLYING COST-BENEFIT ANALYSIS FOR THE INVESTMENT PROJECTS

    Directory of Open Access Journals (Sweden)

    Pelin Andrei

    2009-05-01

    Full Text Available Cost-benefit analysis represents the most frequent technique used for a rational allocation of resources. This modality of evaluating the expenditure programs is an attempt to measure the costs and gains of a community as a result of running the evaluated

  10. Solar thermal technologies benefits assessment: Objectives, methodologies and results for 1981

    Science.gov (United States)

    Gates, W. R.

    1982-07-01

    The economic and social benefits of developing cost competitive solar thermal technologies (STT) were assessed. The analysis was restricted to STT in electric applications for 16 high insolation/high energy price states. Three fuel price scenarios and three 1990 STT system costs were considered, reflecting uncertainty over fuel prices and STT cost projections. After considering the numerous benefits of introducing STT into the energy market, three primary benefits were identified and evaluated: (1) direct energy cost savings were estimated to range from zero to $50 billion; (2) oil imports may be reduced by up to 9 percent, improving national security; and (3) significant environmental benefits can be realized in air basins where electric power plant emissions create substantial air pollution problems. STT research and development was found to be unacceptably risky for private industry in the absence of federal support. The normal risks associated with investments in research and development are accentuated because the OPEC cartel can artificially manipulate oil prices and undercut the growth of alternative energy sources.

  11. What do social scientists know about the benefits of marriage? : A review of quantitative methodologies

    OpenAIRE

    David C. Ribar

    2004-01-01

    This study critically reviews quantitative methods that have been employed and evidence that has been gathered to assess the benefits of marriage and consequences of other family structures. The study begins by describing theoretical models of the determinants of different well-being outcomes and the role of family structure in producing those outcomes. It also discusses models of the determinants of marriage. The study then overviews specific statistical techniques that have been applied in ...

  12. Freedom car and vehicle technologies heavy vehicle program : FY 2007 benefits analysis, methodology and results -- final report.

    Energy Technology Data Exchange (ETDEWEB)

    Singh, M.; Energy Systems; TA Engineering

    2008-02-29

    This report describes the approach to estimating the benefits and analysis results for the Heavy Vehicle Technologies activities of the FreedomCar and Vehicle Technologies (FCVT) Program of EERE. The scope of the effort includes: (1) Characterizing baseline and advanced technology vehicles for Class 3-6 and Class 7 and 8 trucks, (2) Identifying technology goals associated with the DOE EERE programs, (3) Estimating the market potential of technologies that improve fuel efficiency and/or use alternative fuels, (4) Determining the petroleum and greenhouse gas emissions reductions associated with the advanced technologies. In FY 05 the Heavy Vehicles program activity expanded its technical involvement to more broadly address various sources of energy loss as compared to focusing more narrowly on engine efficiency and alternative fuels. This broadening of focus has continued in subsequent activities. These changes are the result of a planning effort that occurred during FY 04 and 05. (Ref. 1) This narrative describes characteristics of the heavy truck market as they relate to the analysis, a description of the analysis methodology (including a discussion of the models used to estimate market potential and benefits), and a presentation of the benefits estimated as a result of the adoption of the advanced technologies. The market penetrations are used as part of the EERE-wide integrated analysis to provide final benefit estimates reported in the FY07 Budget Request. The energy savings models are utilized by the FCVT program for internal project management purposes.

  13. A methodology for spacecraft technology insertion analysis balancing benefit, cost, and risk

    Science.gov (United States)

    Bearden, David Allen

    Emerging technologies are changing the way space missions are developed and implemented. Technology development programs are proceeding with the goal of enhancing spacecraft performance and reducing mass and cost. However, it is often the case that technology insertion assessment activities, in the interest of maximizing performance and/or mass reduction, do not consider synergistic system-level effects. Furthermore, even though technical risks are often identified as a large cost and schedule driver, many design processes ignore the effects of cost and schedule uncertainty. This research is based on the hypothesis that technology selection is a problem of balancing interrelated (and potentially competing) objectives. Current spacecraft technology selection approaches are summarized, and a Methodology for Evaluating and Ranking Insertion of Technology (MERIT) that expands on these practices to attack otherwise unsolved problems is demonstrated. MERIT combines the modern techniques of technology maturity measures, parametric models, genetic algorithms, and risk assessment (cost and schedule) in a unique manner to resolve very difficult issues including: user-generated uncertainty, relationships between cost/schedule and complexity, and technology "portfolio" management. While the methodology is sufficiently generic that it may in theory be applied to a number of technology insertion problems, this research focuses on application to the specific case of small spacecraft missions with firm performance goals, not-to-exceed costs, or hard schedule requirements. MERIT's contributions to the engineering community are its: unique coupling of the aspects of performance, cost, and schedule; assessment of system-level impacts of technology insertion; procedures for estimating uncertainties (risks) associated with advanced technology; and application of heuristics to facilitate informed system-level technology utilization decisions earlier in the conceptual design phase. MERIT extends the state of the art in technology

  14. Interconnection Assessment Methodology and Cost Benefit Analysis for High-Penetration PV Deployment in the Arizona Public Service System

    Energy Technology Data Exchange (ETDEWEB)

    Baggu, Murali; Giraldez, Julieta; Harris, Tom; Brunhart-Lupo, Nicholas; Lisell, Lars; Narang, David

    2015-06-14

    In an effort to better understand the impacts of high penetrations of photovoltaic (PV) generators on distribution systems, Arizona Public Service and its partners completed a multi-year project to develop the tools and knowledge base needed to safely and reliably integrate high penetrations of utility- and residential-scale PV. Building upon the APS Community Power Project-Flagstaff Pilot, this project investigates the impact of PV on a representative feeder in northeast Flagstaff. To quantify and catalog the effects of the estimated 1.3 MW of PV that will be installed on the feeder (both smaller units at homes and large, centrally located systems), high-speed weather and electrical data acquisition systems and digital 'smart' meters were designed and installed to facilitate monitoring and to build and validate comprehensive, high-resolution models of the distribution system. These models are being developed to analyze the impacts of PV on distribution circuit protection systems (including coordination and anti-islanding), predict voltage regulation and phase balance issues, and develop volt/VAr control schemes. This paper continues from a paper presented at the 2014 IEEE PVSC conference that described feeder model evaluation and high-penetration advanced scenario analysis, specifically feeder reconfiguration. This paper presents results from Phase 5 of the project. Specifically, it discusses tool automation, the interconnection assessment methodology, and cost-benefit analysis.

  15. Methodological modifications on quantification of phosphatidylethanol in blood from humans abusing alcohol, using high-performance liquid chromatography and evaporative light scattering detection

    Directory of Open Access Journals (Sweden)

    Aradottir Steina

    2005-09-01

    Full Text Available Abstract Background Phosphatidylethanol (PEth) is an abnormal phospholipid formed slowly in cell membranes by a transphosphatidylation reaction from phosphatidylcholine in the presence of ethanol, catalyzed by the enzyme phospholipase D. PEth in blood is a promising new marker of ethanol abuse owing to its high specificity and sensitivity. None of the biological markers currently used in clinical routine are sensitive and specific enough for the diagnosis of alcohol abuse. The method for PEth analysis includes lipid extraction of whole blood, a one-hour HPLC separation of lipids and evaporative light scattering detection (ELSD) of PEth. Results Methodological improvements are presented which comprise a simpler extraction procedure, the use of phosphatidylbutanol as internal standard and a new algorithm for evaluation of unknown samples. It is further demonstrated that equal test results are obtained with blood collected in standard test tubes with EDTA as with the previously used heparinized test tubes. The PEth content in blood samples is stable for three weeks in the refrigerator. Conclusion The methodological changes make the method more suitable for routine laboratory use, lower the limit of quantification (LOQ) and improve precision.

  16. Benefit-risk assessment in a post-market setting: a case study integrating real-life experience into benefit-risk methodology.

    Science.gov (United States)

    Hallgreen, Christine E; van den Ham, Hendrika A; Mt-Isa, Shahrul; Ashworth, Simon; Hermann, Richard; Hobbiger, Steve; Luciani, Davide; Micaleff, Alain; Thomson, Andrew; Wang, Nan; van Staa, Tjeerd P; Downey, Gerald; Hirsch, Ian; Hockley, Kimberley; Juhaeri, Juhaeri; Metcalf, Marilyn; Mwangi, Jeremiah; Nixon, Richard; Peters, Ruth; Stoeckert, Isabelle; Waddingham, Ed; Tzoulaki, Ioanna; Ashby, Deborah; Wise, Lesley

    2014-09-01

    Difficulties may be encountered when undertaking a benefit-risk assessment for an older product with well-established use but with a benefit-risk balance that may have changed over time. This case study investigates this specific situation by applying a formal benefit-risk framework to assess the benefit-risk balance of warfarin for primary prevention of patients with atrial fibrillation. We used the qualitative framework BRAT as the starting point of the benefit-risk analysis, bringing together the relevant available evidence. We explored the use of a quantitative method (stochastic multi-criteria acceptability analysis) to demonstrate how uncertainties and preferences on multiple criteria can be integrated into a single measure to reduce cognitive burden and increase transparency in decision making. Our benefit-risk model found that warfarin is favourable compared with placebo for the primary prevention of stroke in patients with atrial fibrillation. This favourable benefit-risk balance is fairly robust to differences in preferences. The probability of a favourable benefit-risk for warfarin against placebo is high (0.99) in our model despite the high uncertainty of randomised clinical trial data. In this case study, we identified major challenges related to the identification of relevant benefit-risk criteria and taking into account the diversity and quality of evidence available to inform the benefit-risk assessment. The main challenges in applying formal methods for medical benefit-risk assessment for a marketed drug are related to outcome definitions and data availability. Data exist from many different sources (both randomised clinical trials and observational studies), and the variability in the studies is large. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Application of analytic methodologies for image quantification in neuroendocrine tumor therapy with {sup 177}Lu-DOTA

    Energy Technology Data Exchange (ETDEWEB)

    Kubo, T.T.A.; Oliveira, S.M.V. [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Marco, L.; Mamede, M., E-mail: tadeukubo@gmail.com [Instituto Nacional do Cancer, Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Neuroendocrine tumors have an annual incidence of 1 to 2 cases per one hundred thousand inhabitants. Treatment with {sup 177}Lu-DOTA-octreotate in 3 or 4 cycles has been effective in controlling disease progression and, in some cases, promoting tumor remission. To estimate radiation side effects in healthy organs, image quantification techniques have been disseminated for individualized patient dosimetry. In this paper, image data processing methods are presented to allow comparisons between different image conjugate views, combined with attenuation correction and system sensitivity. Images were acquired 24, 72 and 192 h after administration of 74 GBq of {sup 177}Lu-DOTA using a dual-head gamma camera detection system, and they were evaluated with ImageJ software. Four female patients underwent two cycles of treatment. The kidneys, liver and whole-body regions of interest were separately assessed by 4 techniques for the counts method and 12 techniques for the pixel intensity method, considering the main photopeak alone and aided by the attenuation correction map and windows adjacent to the photopeak energy. The pixel intensity method was combined with a mathematical correction for pixels with null value. The results obtained by the two methods were strongly correlated (r>0.9) (p<0.001). The paired t-test accepted the null hypothesis of compatibility between the two methods (with and without the attenuation correction map) (p<0.05), but rejected it when the adjacent windows were combined. No significant tumor reduction (p>0.05) was found between the treatment cycles. In conclusion, the pixel intensity method is faster and allows macros, minimizing operator error, and may optimize dosimetry in tumor therapies with {sup 177}Lu-DOTA-octreotate. (author)
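    The conjugate-view (geometric mean) step underlying this kind of planar quantification can be sketched as follows. The single-factor attenuation model and all numerical inputs are illustrative assumptions, not the values used in the study.

        import numpy as np

        def conjugate_view_activity(counts_ant, counts_post, mu, thickness, sensitivity):
            """Geometric-mean activity estimate from opposed planar views.

            counts_ant, counts_post : background-corrected ROI count rates [counts/s]
            mu          : effective linear attenuation coefficient [1/cm]
            thickness   : patient thickness along the projection axis [cm]
            sensitivity : system sensitivity [counts/s per MBq]
            """
            geometric_mean = np.sqrt(counts_ant * counts_post)
            attenuation = np.exp(-mu * thickness / 2.0)  # whole-body factor
            return geometric_mean / (attenuation * sensitivity)

        # hypothetical kidney ROI: 1200 and 800 counts/s, mu = 0.12/cm,
        # 22 cm thickness, 10 counts/s per MBq -> activity in MBq
        print(conjugate_view_activity(1200.0, 800.0, 0.12, 22.0, 10.0))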

  18. Development and validation of methodologies for the quantification of phytosterols and phytosterol oxidation products in cooked and baked food products.

    Science.gov (United States)

    Menéndez-Carreño, María; Knol, Diny; Janssen, Hans-Gerd

    2016-01-01

    Gas chromatography-mass spectrometry (GC-MS) methodologies for the analysis of the main phytosterols (PS) and phytosterol oxidation products (POPs) present in 19 different foodstuffs cooked or baked using margarines with or without added plant sterols are presented. Various methods for fat extraction were evaluated to allow the GC-MS analysis of large numbers of prepared vegetable, fish and meat products, egg and bakery items in a practically feasible manner. The optimized methods resulted in good sensitivity and allowed the analysis of both PS and POPs in the broad selection of foods at a wide range of concentrations. Calibration curves for both PS and POPs showed correlation coefficients (R2) better than 0.99. Detection limits were below 0.24 mg kg(-1) for PS and 0.02 mg kg(-1) for POPs, respectively. Average recovery data were between 81% and 105.1% for PS and between 65.5% and 121.8% for POPs. Good results were obtained for within- and between-day repeatability, with most values below 10%. Entire sample servings were analyzed, avoiding problems with inhomogeneity and making the method an exact representation of the typical use of the food by the consumer.
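    For calibration-based figures of merit like those reported here, the detection and quantification limits are commonly derived from the regression residuals as 3.3 sigma/S and 10 sigma/S (ICH-style). The sketch below shows that computation on hypothetical calibration data; it is not the authors' raw data.

        import numpy as np

        # hypothetical calibration: concentration (mg/kg) vs detector response
        conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])
        resp = np.array([0.8, 4.1, 8.3, 16.2, 41.0, 82.5])

        slope, intercept = np.polyfit(conc, resp, 1)
        residuals = resp - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)  # SD of the regression residuals

        lod = 3.3 * sigma / slope
        loq = 10.0 * sigma / slope
        r2 = np.corrcoef(conc, resp)[0, 1] ** 2
        print(f"R2 = {r2:.4f}, LOD = {lod:.3f} mg/kg, LOQ = {loq:.3f} mg/kg")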

  19. Optimization of a cloud point extraction procedure with response surface methodology for the quantification of dissolved iron in produced water from the petroleum industry using FAAS.

    Science.gov (United States)

    Gondim, Tamyris A; Guedes, Jhonyson A C; Ribeiro, Livia P D; Lopes, Gisele S; Matos, Wladiana O

    2017-01-30

    The characterization of inorganic elements in produced water (PW) samples is a difficult task because of the complexity of the matrix. This work presents a methodology for the quantification of dissolved Fe in PW from the oil industry by flame atomic absorption spectrometry (FAAS) after cloud point extraction (CPE). The procedure is based on CPE using PAN as the complexing agent and Triton X-114 as the surfactant. The best conditions for the Fe extraction parameters were studied using a Box-Behnken design. The proposed method presented an LOQ of 0.010 μg mL(-1) and an LOD of 0.003 μg mL(-1). The precision of the method was evaluated in terms of repeatability, giving a coefficient of variation of 2.54%. The accuracy of the method was assessed by recovery experiments with spiked Fe, which showed a recovery of 103.28%. The method was applied with satisfactory performance to the determination of Fe by FAAS in PW samples.
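    The response-surface step of such an optimization can be sketched as a least-squares fit of a full quadratic model to the design points, followed by solving for the stationary point. The coded design and recovery values below are invented for illustration and do not reproduce the paper's Box-Behnken data.

        import numpy as np

        # coded levels of two hypothetical factors (e.g., surfactant amount
        # and incubation temperature) and the measured response (recovery, %)
        x1 = np.array([-1, -1, 1, 1, -1, 1, 0, 0, 0, 0, 0], dtype=float)
        x2 = np.array([-1, 1, -1, 1, 0, 0, -1, 1, 0, 0, 0], dtype=float)
        y = np.array([72, 80, 78, 88, 79, 86, 77, 89, 95, 94, 96], dtype=float)

        # fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        b, *_ = np.linalg.lstsq(X, y, rcond=None)

        # stationary point: [2*b11, b12; b12, 2*b22] @ x = -[b1, b2]
        H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
        print("optimum (coded levels):", np.linalg.solve(H, -b[1:3]))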

  20. A Methodology to Assess the Benefit of Operational or Tactic Adjustments to Reduce Marine Corps Fuel Consumption

    Science.gov (United States)

    2015-12-01

    A methodology was developed to assess potential improvements to operational reach in the context of a Marine Expeditionary Unit (MEU) operation. At the heart of the methodology was a discrete event model developed to simulate the conditions of a close air support (CAS) operation and ground combat.

  1. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    Energy Technology Data Exchange (ETDEWEB)

    Laborda, Francisco, E-mail: flaborda@unizar.es; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-21

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  2. Is law enforcement of drug-impaired driving cost-efficient? An explorative study of a methodology for cost-benefit analysis.

    Science.gov (United States)

    Veisten, Knut; Houwing, Sjoerd; Mathijssen, M P M René; Akhtar, Juned

    2013-03-01

    Road users driving under the influence of psychoactive substances may be at much higher relative risk (RR) in road traffic than the average driver. Legislation banning blood alcohol concentrations above certain threshold levels, combined with roadside breath-testing for alcohol, has been in place for decades in many countries, but new legislation and testing of drivers for drug use have recently been implemented in some countries. In this article we present a methodology for cost-benefit analysis (CBA) of increased law enforcement of roadside drug screening. This is an analysis of the profitability for society, where the costs of control are weighed against the reduction in injuries expected from fewer drugged drivers on the roads. We specify assumptions regarding costs and the effect of the specificity of the drug screening device, and quantify a deterrence effect related to the sensitivity of the device, yielding the benefit estimates. Three European countries with different current enforcement levels were studied, yielding benefit-cost ratios in the approximate range of 0.5-5 for a tripling of current levels of enforcement, with costs of about 4000 EUR per conviction and in the range of 1.5 to 13 million EUR per prevented fatality. The applied methodology for CBA involved a simplistic behavioural response to enforcement increase and control efficiency. Although this methodology should be developed further, it is clearly indicated that the cost-efficiency of increased law enforcement of drug driving offences is dependent on the baseline situation of drug use in traffic and on the current level of enforcement, as well as the RR and prevalence of drugs in road traffic. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Four years of early benefit assessment of new drugs in Germany: a qualitative study on methodological requirements for quality of life data.

    Science.gov (United States)

    Blome, Christine; Augustin, Matthias; Metin, Hidayet; Lohrberg, David

    2017-03-01

    Since 2011, an early benefit assessment (EBA) of new drugs has constrained free price setting in Germany. According to the Pharmaceutical Market Restructuring Act (AMNOG), pharmaceutical companies are obliged to demonstrate the added benefit of new drugs over a comparative treatment. Benefit is usually evaluated by the Institute for Quality and Efficiency in Health Care (IQWiG). The final appraisal is made by the Federal Joint Committee, Germany's highest-ranking decision body in the health sector, triggering drug price negotiations between companies and statutory health insurance funds. One of four evaluation criteria is quality of life (QoL). QoL outcomes have, however, only rarely been pivotal in EBAs. This study determined the methodological requirements for QoL measurement and data presentation in the EBA. In a qualitative content analysis, documents of all EBAs completed by 2014 were searched for the term QoL. Relevant passages of all EBAs from 2011-2013 were independently extracted and reduced to key content by two researchers. Recurring patterns were identified and verified through comparison with the EBAs of 2014. We identified a range of requirements regarding QoL assessment, analysis, presentation, and interpretation, which go beyond the official regulations. Disease-specific questionnaires are preferred and have to be validated according to certain standards and in the respective patient group. Effects must exceed the minimal important difference, which in turn must be validated in compliance with specific requirements. Often, instruments were not accepted as QoL measures, sometimes inconsistently across EBAs. Another frequent reason for non-acceptance of QoL data was that more than 30% of randomized patients could not be analyzed due to missing data. Non-compliance with the methodological requirements for QoL evidence impairs the chances of a positive benefit evaluation.

  4. Extraction Methodological Contributions Toward Ultra-Performance Liquid Chromatography-Time-of-Flight Mass Spectrometry: Quantification of Free GB from Various Food Matrices

    Science.gov (United States)

    2016-02-01

    detection and emphasize its importance. The mere existence of these molecules in either the environment or the food supply could indicate a compliance...

  5. Methodology to estimate the cost of the severe accidents risk / maximum benefit; Metodologia para estimar el costo del riesgo de accidentes severos / beneficio maximo

    Energy Technology Data Exchange (ETDEWEB)

    Mendoza, G.; Flores, R. M.; Vega, E., E-mail: gozalo.mendoza@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2016-09-15

    For programs and activities to manage aging effects, any changes to plant operations, inspections, maintenance activities, systems and administrative control procedures during the renewal period that could impact the environment should be characterized and designed to manage the effects of aging as required by 10 CFR Part 54. Environmental impacts significantly different from those described in the final environmental statement for the current operating license should be described in detail. When complying with the requirements of a license renewal application, the Severe Accident Mitigation Alternatives (SAMA) analysis is contained in a supplement to the environmental report of the plant that meets the requirements of 10 CFR Part 51. In this paper, the methodology for estimating the cost of severe accident risk is established and discussed. This methodology is then used to identify and select alternatives for severe accident mitigation, which are analyzed to estimate the maximum benefit that an alternative could achieve if it eliminated all risk. The cost of severe accident risk is estimated using the regulatory analysis techniques of the US Nuclear Regulatory Commission (NRC). The ultimate goal of implementing the methodology is to identify candidates for SAMA that have the potential to reduce severe accident risk and to determine whether the implementation of each candidate is cost-effective. (Author)

  6. Quantification of fatty acids in the muscle of Antarctic fish Trematomus bernacchii by gas chromatography-mass spectrometry: Optimization of the analytical methodology.

    Science.gov (United States)

    Truzzi, C; Illuminati, S; Annibaldi, A; Antonucci, M; Scarponi, G

    2017-04-01

    This work presents data on the quantification of fatty acids (FAs, in terms of mass unit per tissue weight) in the muscle of Trematomus bernacchii, a key species in Antarctica, often used as a bioindicator in contamination studies. Modifications in fatty acid content should be considered a useful biomarker for studying how contaminants affect Antarctic biota. Until now, very few studies have quantified fatty acids in the muscle of T. bernacchii, and only as the percentage of a single fatty acid relative to total lipids. To quantify the fatty acids, we used an analytical method based on a fast microwave-assisted extraction of lipids from a lyophilized sample, a base-catalyzed trans-esterification of the lipid extract to obtain fatty acid methyl esters (FAMEs), and separation and identification of the FAMEs by gas chromatography-mass spectrometry. With the optimized and validated method, a fast and accurate separation of FAMEs was performed in 43 min. Linearity was checked up to about 320 μg mL(-1); the limits of detection and quantification are in the ranges 4-22 μg mL(-1) and 13-66 μg mL(-1), respectively. The optimized method showed good accuracy and precision. The major fatty acids were 14:0, 16:0, 16:1n7, 18:1n9, 18:1n7, 20:1n9, 20:5n3 and 22:6n3. The quantified FAs account for about 47 mg g(-1) tissue dry weight (dw), with 9.1 ± 0.1 mg g(-1) dw of saturated FAs, 25.5 ± 0.1 mg g(-1) dw of mono-unsaturated FAs, and 12.2 ± 0.1 mg g(-1) dw of poly-unsaturated FAs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Optimization of cloud point extraction procedure with response surface methodology for quantification of iron by means of flame atomic absorption spectrometry

    Directory of Open Access Journals (Sweden)

    Abdolmohammad-Zadeh Hossein

    2013-01-01

    Full Text Available A simple micelle-mediated phase separation method has been developed for the pre-concentration of trace levels of iron as a step prior to determination by flame atomic absorption spectrometry (FAAS). The method is based on the cloud point extraction (CPE) of iron using the non-ionic surfactant polyethyleneglycol mono-p-nonylphenylether (PONPE 7.5) without adding any chelating agent. Several variables affecting the extraction efficiency were studied and optimized utilizing a central composite design (CCD) and a three-level full factorial design. Under the optimum conditions, the limit of detection (LOD), limit of quantification (LOQ) and pre-concentration factor were 1.5 μg L-1, 5.0 μg L-1 and 100, respectively. The relative standard deviation (RSD) for six replicate determinations at the 50 μg L-1 Fe(III) level was 1.97%. The calibration graph was linear in the range of 5-100 μg L-1, with a correlation coefficient of 0.9921. The developed method was validated by the analysis of two certified reference materials and applied successfully to the determination of trace amounts of Fe(III) in water and rice samples.

  8. Lung involvement quantification in chest radiographs; Quantificacao de comprometimento pulmonar em radiografias de torax

    Energy Technology Data Exchange (ETDEWEB)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M., E-mail: giacomini@ibb.unesp.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2014-12-15

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for the quantification of chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for the quantification of lung damage caused by TB through chest radiographs. An algorithm was developed in Matlab for the computational processing of the exams, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit for the patient and cost-benefit ratio for the institution. (author)
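    The Bland-Altman check used to compare the two quantification methods reduces to the mean difference and its 95% limits of agreement; a minimal sketch with hypothetical paired measurements follows.

        import numpy as np

        def bland_altman(method_a, method_b):
            """Mean bias and 95% limits of agreement between paired methods."""
            diff = np.asarray(method_a, float) - np.asarray(method_b, float)
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            return bias, bias - half_width, bias + half_width

        # hypothetical lesion-volume estimates (cm^3) for the same patients
        bias, lower, upper = bland_altman([110, 95, 140, 80, 60],
                                          [102, 99, 131, 84, 55])
        print(f"bias = {bias:.1f}, limits of agreement = [{lower:.1f}, {upper:.1f}]")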

  9. Grid of the Future: Quantification of Benefits from Flexible Energy Resources in Scenarios With Extra-High Penetration of Renewable Energy

    Energy Technology Data Exchange (ETDEWEB)

    Bebic, Jovan [General Electric International, Inc., Schenectady, NY (United States). Energy Consulting; Hinkle, Gene [General Electric International, Inc., Schenectady, NY (United States). Energy Consulting; Matic, Slobodan [General Electric International, Inc., Schenectady, NY (United States). Energy Consulting; Schmitt, William [General Electric International, Inc., Schenectady, NY (United States). Energy Consulting

    2015-01-15

    The main objective of this study is to quantify the entitlement for system benefits attainable by pervasive application of flexible energy resources in scenarios with extra-high penetration of renewable energy. The quantified benefits include savings in thermal energy and reductions in CO2 emissions. Both are primarily a result of the displacement of conventional thermal generation by renewable energy production, but there are secondary improvements that arise from lowering operating reserves, removing transmission constraints, and partially removing energy-delivery losses due to energy production by distributed solar. The flexible energy resources in the context of this study include energy storage and adjustable loads. The flexibility of both was constrained to a time horizon of one day. In the case of energy storage this means that the state of charge is restored to the starting value at the end of each day, while for loads it means that the daily energy consumed is held constant. Extra-high penetration of renewable energy in the context of this study means a level of penetration resulting in a significant number of hours where the instantaneous power output from renewable resources, added to the power output from the baseload nuclear fleet, surpasses the instantaneous power consumption by the load.

  10. Human error probability quantification using fuzzy methodology in nuclear plants; Aplicacao da metodologia fuzzy na quantificacao da probabilidade de erro humano em instalacoes nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Claudio Souza do

    2010-07-01

    This work obtains Human Error Probability (HEP) estimates for operator actions in response to hypothesized emergency situations at the IEA-R1 research reactor at IPEN. An evaluation of Performance Shaping Factors (PSF) was also carried out in order to classify them according to their level of influence on the operators' actions and to determine the actual states of these PSFs in the plant. Both the HEP estimation and the PSF evaluation were based on specialist evaluation using interviews and questionnaires. The specialist group was composed of selected IEA-R1 operators. The representation of the specialists' knowledge as linguistic variables and the group evaluation values were obtained through fuzzy logic and fuzzy set theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative for use in Human Reliability Analysis (HRA). (author)
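    The fuzzy machinery involved can be sketched in a few lines: triangular membership functions for linguistic HEP terms, activation of each term by aggregated expert judgements, and centroid defuzzification to a single crisp probability. The log10 scale, the terms, and the activation degrees below are illustrative assumptions, not the study's elicited values.

        import numpy as np

        def triangular(x, a, b, c):
            """Triangular membership with support [a, c] and peak at b."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        x = np.linspace(-4.0, -1.0, 301)  # hypothetical log10(HEP) axis
        low = triangular(x, -4.0, -3.5, -3.0)
        moderate = triangular(x, -3.5, -2.5, -1.5)
        high = triangular(x, -2.0, -1.5, -1.0)

        # suppose the group evaluation activates each term to these degrees
        activated = np.maximum.reduce([np.minimum(low, 0.2),
                                       np.minimum(moderate, 0.8),
                                       np.minimum(high, 0.3)])

        # centroid defuzzification on the uniform grid -> crisp log10(HEP)
        crisp = (activated * x).sum() / activated.sum()
        print(f"HEP ~ 10^{crisp:.2f} = {10**crisp:.2e}")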

  11. DNA-based methodologies for the quantification of live and dead cells in formulated biocontrol products based on Pantoea agglomerans CPA-2.

    Science.gov (United States)

    Soto-Muñoz, Lourdes; Torres, Rosario; Usall, Josep; Viñas, Inmaculada; Solsona, Cristina; Teixidó, Neus

    2015-10-01

    Pantoea agglomerans strain CPA-2 is an effective biocontrol agent (BCA) against the major postharvest pathogens of pome and citrus fruits. Dehydration, such as freeze-drying, spray-drying and fluidized-bed drying, is one of the best ways to formulate BCAs. In this work, the survival of CPA-2 cells after formulation was determined by dilution plating and by molecular methods, namely qPCR alone and combined with a sample pretreatment with propidium monoazide dye (PMA-qPCR), and the counts were used to calculate treatment concentrations in efficacy trials on postharvest oranges. No significant differences in CPA-2 survival were observed between dilution plating and PMA-qPCR after either the freeze-drying or the fluidized-bed drying process; however, a significant difference was observed in the spray-dried product when comparing all quantification methods. Differences of 0.48 and 2.17 log10 CFU or cells per g dw were observed between PMA-qPCR and qPCR or dilution plating, respectively. According to our study, dilution plating was shown to be an unreliable tool for monitoring the survival of CPA-2 after spray-drying. In contrast, the combination of PMA and qPCR enabled a quick and unequivocal methodology to enumerate viable and VBNC (viable but non-culturable) CPA-2 cells under drying stress conditions. Efficacy trials showed that, after 3 days, the spray-dried formulation rehydrated with 10% non-fat skimmed milk (NFSM) was as effective as fresh cells in controlling Penicillium digitatum on oranges. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Uncertainty Quantification in Aerodynamics Simulations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  13. Case Studies Conducted in China Based on ISO Economic Benefits Assessment Methodology of Standards; 基于ISO标准经济效益评估方法在中国开展的案例研究

    Institute of Scientific and Technical Information of China (English)

    付强; 王益谊; 王丽君; 刘辉

    2013-01-01

    With the support of Roland Berger Strategy Consultants, ISO developed a methodology for the assessment of the economic benefits of standards, with the objective of quantifying the impact of standards on organizations and comparing the assessment results across organizations. This paper systematically studies the ISO assessment methodology and gives a brief introduction to the ISO research project on the economic benefits of standards and its preliminary conclusions. Based on case studies applying the ISO assessment methodology in China, this paper offers thoughts on how to improve the methodology in light of China's national conditions.

  14. Ferrite Quantification Methodologies for Duplex Stainless Steel

    Directory of Open Access Journals (Sweden)

    Arnaldo Forgas Júnior

    2016-07-01

    Full Text Available In order to quantify ferrite content, three techniques, XRD, ferritoscope and optical metallography, were applied to a duplex stainless steel UNS S31803 solution-treated for 30 min at 1,000, 1,100 and 1,200 °C, and the results were compared to the phase equilibrium predicted by ThermoCalc® simulation. As expected, the microstructure is composed only of austenite and ferrite, and the ferrite content increases as the solution treatment temperature increases. The microstructure presents preferred grain orientation along the rolling direction, even for a sample solution-treated for 30 min at 1,200 °C. For all solution treatment temperatures, the ferrite volume fractions obtained by XRD measurements were higher than those obtained by the other two techniques and by ThermoCalc® simulation, probably due to the texturing effect of the previous rolling process. Values obtained by quantitative metallography appear more reliable, as it is a direct measurement method, but the ferritoscope technique should be considered mainly for on-site measurement.
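    For the XRD measurements, a ferrite volume fraction is typically obtained by direct comparison of normalized integrated intensities (cf. ASTM E975 for two-phase steels); a minimal sketch with invented peak intensities and R factors follows. Averaging over several reflections mitigates, but does not remove, the texture effect noted above.

        import numpy as np

        def ferrite_fraction(i_alpha, r_alpha, i_gamma, r_gamma):
            """Ferrite fraction from integrated intensities I and theoretical
            scattering factors R of ferrite (alpha) and austenite (gamma) peaks."""
            norm_alpha = np.mean(np.asarray(i_alpha) / np.asarray(r_alpha))
            norm_gamma = np.mean(np.asarray(i_gamma) / np.asarray(r_gamma))
            return norm_alpha / (norm_alpha + norm_gamma)

        # hypothetical intensities for two ferrite and two austenite reflections
        print(ferrite_fraction([1500.0, 900.0], [100.0, 60.0],
                               [1100.0, 700.0], [95.0, 58.0]))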

  15. Public health benefits of reducing air pollution in Shanghai: a proof-of-concept methodology with application to BenMAP.

    Science.gov (United States)

    Voorhees, A Scott; Wang, Jiandong; Wang, Cuicui; Zhao, Bin; Wang, Shuxiao; Kan, Haidong

    2014-07-01

    In recent years, levels of particulate matter (PM) air pollution in China have been relatively high, exceeding China's Class II standards in many cities and impacting public health. This analysis takes Chinese health impact functions and underlying health incidence, applies 2010-2012 modeled and monitored PM air quality data, and estimates avoided cases of mortality and morbidity in Shanghai, assuming achievement of China's Class II air quality standards. In Shanghai, the estimated avoided all-cause mortality due to PM10 ranged from 13 to 55 cases per day and from 300 to 800 cases per year. The estimated avoided impact on hospital admissions due to PM10 ranged from 230 to 580 cases per day and from 5400 to 7900 per year. The estimated avoided impact on all-cause mortality due to PM2.5 ranged from 6 to 26 cases per day and from 39 to 1400 per year. The estimated impact on all-cause mortality of a year's exposure to an annual or monthly mean PM2.5 concentration ranged from 180 to 3500 per year. In Shanghai, the avoided cases of all-cause mortality had an estimated monetary value ranging from 170 million yuan (1 US dollar = 4.2 yuan, purchasing power parity) to 1200 million yuan. Avoided hospital admissions had an estimated value from 20 to 43 million yuan. Avoided emergency department visits had an estimated value from 5.6 million to 15 million yuan. Avoided outpatient visits had an estimated value from 21 million to 31 million yuan. In this analysis, available data were adequate to estimate avoided health impacts and assign monetary value. Sufficient supporting documentation was available to construct and format data sets for use in the United States Environmental Protection Agency's health and environmental assessment model, known as the Environmental Benefits Mapping and Analysis Program - Community Edition ("BenMAP-CE").
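    Analyses of this kind rest on a log-linear health impact function. The sketch below shows the standard BenMAP-style form with purely illustrative inputs; the coefficient, baseline incidence, and population are not the values used for Shanghai.

        import numpy as np

        def avoided_cases(beta, delta_c, incidence, population):
            """Log-linear health impact function:
            delta_y = y0 * (1 - exp(-beta * delta_C)) * Pop

            beta       : concentration-response coefficient [per ug/m3]
            delta_c    : air quality improvement [ug/m3]
            incidence  : baseline incidence [cases per person per year]
            population : exposed population
            """
            return incidence * (1.0 - np.exp(-beta * delta_c)) * population

        # illustrative only: beta = 0.0004 per ug/m3, a 20 ug/m3 PM10 reduction,
        # baseline mortality 6 per 1000 per year, 23 million residents
        print(f"avoided deaths/year ~ {avoided_cases(4e-4, 20.0, 0.006, 23e6):.0f}")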

  16. Research Methodology

    CERN Document Server

    Rajasekar, S; Philomination, P

    2006-01-01

    In this manuscript various components of research are listed and briefly discussed. The topics considered in this write-up cover a part of the research methodology paper of Master of Philosophy (M.Phil.) course and Doctor of Philosophy (Ph.D.) course. The manuscript is intended for students and research scholars of science subjects such as mathematics, physics, chemistry, statistics, biology and computer science. Various stages of research are discussed in detail. Special care has been taken to motivate the young researchers to take up challenging problems. Ten assignment works are given. For the benefit of young researchers a short interview with three eminent scientists is included at the end of the manuscript.

  17. Estimation of Social Benefits in Cost-benefit Analysis

    OpenAIRE

    Beáta Fodor

    2012-01-01

    While examining cost-benefit analyses related to public policy decisions in the Hungarian and international literature, this paper seeks to answer the question of which methodological principles should be used to determine benefit impacts. The Hungarian and English-language studies processed indicate that the theoretical and methodological questions around the determination of benefit impacts are not clear cut. The author has constructed a model that contains the ...

  18. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    ... multiple development goals can be reinforced by specific climate funding granted on the basis of multiple benefits and synergies, for instance through currently negotiated mechanisms such as Nationally Appropriate Mitigation Actions (NAMAs) (REDD+, Kissinger et al 2012). 3. Challenges to quantifying GHG information for the agricultural sector. The quantification of GHG emissions from agriculture is fundamental to identifying mitigation solutions that are consistent with the goals of achieving greater resilience in production systems, food security, and rural welfare. GHG emissions data are already needed for such varied purposes as guiding national planning for low-emissions development, generating and trading carbon credits, certifying sustainable agriculture practices, informing consumers' choices with regard to reducing their carbon footprints, assessing product supply chains, and supporting farmers in adopting less carbon-intensive farming practices. Demonstrating the robustness, feasibility, and cost effectiveness of agricultural GHG inventories and monitoring is a necessary technical foundation for including agriculture in the international negotiations under the United Nations Framework Convention on Climate Change (UNFCCC), and is needed to provide robust data and methodology platforms for global corporate supply-chain initiatives (e.g., SAFA, FAO 2012). Given such varied drivers for GHG reductions, there are a number of uses for agricultural GHG information, including (1) reporting and accounting at the national or company level, (2) land-use planning and management to achieve specific objectives, (3) monitoring and evaluating the impact of management, (4) developing a credible and thus tradable offset credit, and (5) research and capacity development. The information needs for these uses are likely to differ in the required level of certainty, scale of analysis, and need for comparability across systems or repeatability over time, and they may depend on whether

  19. Cost-benefit analysis of a ''clean car'': methodology and application to the electric vehicle; Analyse couts-benefices d'une voiture propre: methodologie et application a la voiture electrique

    Energy Technology Data Exchange (ETDEWEB)

    Rabl, A. [Ecole des Mines de Paris, Centre d' Energetique, 75 (France)

    2003-03-01

    This paper evaluates the benefits of a 'clean car' (electric vehicle, fuel cell vehicle...) compared to a conventional car (gasoline or diesel). To put the results in perspective, other approaches for reducing automotive air pollution are briefly discussed. The damage costs attributable to pollution are quantified by means of an impact pathway analysis, using the methodology of the ExternE (External Costs of Energy) project series of the EC. Atmospheric dispersion models for short (street), intermediate (city) and long (France and Europe) ranges are combined. The totality of impacts over the life cycle of a car is taken into account by using emission inventories obtained from life cycle assessments. The impact pathway analysis yields, as its key result, the cost per kg of a pollutant (for the site and conditions where it is emitted). The damage cost per km then follows as the product of the cost per kg and the emission per km. The discounted life cycle costs of the cars are evaluated and compared, using data for the Peugeot 106, which is available in all three versions (gasoline, diesel and electric). Specific results are shown in terms of cost per km for two utilizations: 25 km/day and 45 km/day. Two different points of view are presented: that of the individual (including taxes and subsidies, but without damage costs) and that of society (excluding taxes and subsidies, but with damage costs). The uncertainty range of the damage cost estimates is indicated. The results show that the damage costs due to the pollution of conventional cars are considerable, especially for older models (comparable to the cost of the fuel), and can justify major expenditures for their replacement by cleaner cars. However, the present cost of an electric car is so high that it may outweigh its environmental advantage, even in Paris where the benefits of an electric car are higher than anywhere else in the world (large health impacts because of large population, and electricity production
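    The key product described here, damage cost per km as cost per kg times emission per km summed over pollutants, reduces to a one-line computation. The emission factors and damage costs below are entirely hypothetical placeholders, not Rabl's ExternE values.

    ```python
    # Hypothetical emission factors (g/km) and damage costs (euro/kg):
    emissions_g_per_km = {"NOx": 0.5, "PM2.5": 0.03, "CO2": 180.0}
    damage_eur_per_kg = {"NOx": 10.0, "PM2.5": 150.0, "CO2": 0.02}

    damage_per_km = sum(
        (grams / 1000.0) * damage_eur_per_kg[p]  # kg/km * euro/kg
        for p, grams in emissions_g_per_km.items()
    )
    print(f"damage cost ~ {100 * damage_per_km:.2f} euro-cents per km")
    ```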

  20. Accelerating time to benefit

    DEFF Research Database (Denmark)

    Svejvig, Per; Geraldi, Joana; Grex, Sara

    Despite the ubiquitous pressure for speed, our approaches to accelerate projects remain constrained to the old-fashioned understanding of the project as a vehicle to deliver products and services, not value. This article explores an attempt to accelerate time to benefit. We describe and deconstruct...... the implementation of a large intervention undertaken in five project-based organizations in Denmark – the Project Half Double where the same project methodology has been applied in five projects, each of them in five distinct organizations in Denmark, as a bold attempt to realize double the benefit in half...... of the time. Although all cases valued speed and speed to benefit, and implemented most practices proposed by the methodology, only three of the five projects were more successful in decreasing time to speed. Based on a multi-case study comparison between these five different projects and their respective...

  1. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic because computer accuracy is limited. Inaccuracy can arise in different ways: for example, an error may be made when subtracting two numbers that are very close to each other, or when summing many numbers of very different magnitude. The basic objective of this paper is to find a procedure that eliminates the errors a computer makes when calculations close to its precision limit are executed. A highly reliable system is represented by a directed acyclic graph composed of terminal nodes (the highly reliable input elements), internal nodes representing subsystems, and edges that bind all of these nodes together. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes is implemented in MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to the graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires summation of many very different non-negative numbers, which may itself be a source of inaccuracy. That is why another algorithm, for the exact summation of such numbers, is designed in the paper. The summation procedure uses a special number system with base 2^32. The computational efficiency of the new computing methodology is compared with advanced simulation software, and various calculations on systems from the references are performed to emphasize the merits of the methodology.
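    The summation problem the paper solves can be illustrated with exact rational arithmetic, which any arbitrary-precision environment provides. The sketch below mimics the goal of the base-2^32 scheme (error-free summation of wildly different magnitudes), not the paper's actual algorithm.

    ```python
    from fractions import Fraction

    def exact_sum(values):
        """Sum floating-point numbers without round-off by working in exact
        rational arithmetic; every float is a dyadic rational, so Fraction(v)
        is lossless."""
        total = sum(Fraction(v) for v in values)
        return float(total), total

    # Many numbers of wildly different magnitude:
    vals = [1e16, 1.0, -1e16, 1e-8] * 1000
    naive = sum(vals)              # float accumulation loses the small terms
    exact, _ = exact_sum(vals)     # exact result: 1000.00001
    print(naive, exact)
    ```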

  2. Core benefits

    National Research Council Canada - National Science Library

    Keith, Brian W

    2010-01-01

    This SPEC Kit explores the core employment benefits of retirement, and life, health, and other insurance: benefits that are typically decided by the parent institution and are often subject to significant governmental regulation...

  3. Improving qPCR methodology for detection of foaming bacteria by analysis of broad-spectrum primers and a highly specific probe for quantification of Nocardia spp. in activated sludge.

    Science.gov (United States)

    Asvapathanagul, P; Olson, B H

    2017-01-01

    To develop broad-spectrum qPCR primers combined with a Nocardia genus-specific probe for the identification of a broad spectrum of Nocardia spp., and to analyse the effect of this primer and probe set on the ability to quantify Nocardia spp. in mixed DNA. The consequences of using a degenerate primer set and a species-specific probe for the genus Nocardia in qPCR assays were examined using DNA extracts of pure cultures and activated sludge. Mixed DNA extracts, in which the target organism Nocardia flavorosea ranged from 5 × 10^2 to 5 × 10^6 copies per reaction while the background organism's DNA (Mycobacterium bovis) was held at 5 × 10^6 copies per reaction, produced comparable cycle threshold fluorescence levels only when the N. flavorosea concentration was greater than or equal to that of the background organism. When N. flavorosea concentrations were lowered in increments of 1 log, while M. bovis was held constant at 5 × 10^6 copies per reaction, all assays demonstrated delayed cycle threshold values, with a maximum 34.6-fold decrease in cycle threshold at a ratio of 10^6 M. bovis : 10^2 N. flavorosea copies per reaction. The data presented in this study indicate that increasing the ability of a primer set to capture a broad group of organisms can affect the accuracy of quantification even when a highly specific probe is used. This study examined several applications of molecular tools in complex communities, such as evaluating the effect of mispriming vs interference, and it elucidates the importance of understanding the community's genetic make-up for primer design. Degenerate primers are very useful in amplifying bacterial DNA across genera, but reduce the efficiency of qPCR reactions. Therefore, standards that address closely related background species must be used to obtain accurate qPCR results. © 2016 The Society for Applied Microbiology.
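    Quantification in such qPCR assays rests on a standard curve relating Ct to log10 copy number; the delayed Ct values reported above translate directly into underestimated copy numbers when the curve is inverted. A minimal sketch with hypothetical dilution-series data:

    ```python
    import numpy as np

    # Hypothetical standard-curve data: log10(copies/reaction) vs Ct.
    log10_copies = np.array([2, 3, 4, 5, 6], dtype=float)
    ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

    slope, intercept = np.polyfit(log10_copies, ct, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% efficient

    def quantify(ct_unknown):
        """Invert the standard curve to estimate copies per reaction."""
        return 10.0 ** ((ct_unknown - intercept) / slope)

    print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
    print(f"Ct = 25.0 -> {quantify(25.0):.2e} copies/reaction")
    ```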

  4. Validação de metodologia analítica e estudo de estabilidade para quantificação sérica de paracetamol Analytical methodology validation and stability study for serum quantification of acetaminophen

    Directory of Open Access Journals (Sweden)

    Viviane Cristina Sebben

    2010-04-01

    Paracetamol (acetaminophen) is currently one of the most widely used analgesic-antipyretic agents, mainly in children. However, both easy access to this medicine and the population's unawareness of its toxic effects have contributed to a rise in the number of intoxications caused by this drug. Measurement of serum acetaminophen confirms the diagnosis; the result not only has diagnostic reliability but also allows the risk of hepatotoxicity to be evaluated, indicating whether administration of the specific antidote N-acetylcysteine is warranted. The aim of this study is to present an analytical method for the determination of serum acetaminophen by spectrophotometric detection at 430 nm. MATERIALS AND METHODS: After sample deproteinization, acetaminophen (N-acetyl-p-aminophenol) reacts with sodium nitrite to form 2-nitro-4-acetaminophenol, which becomes yellowish in alkaline medium. For method validation, linearity, precision, accuracy, robustness, recovery and detection limits were evaluated according to ICH and ANVISA criteria. The stability study was carried out after freezing/thawing cycles, short-term storage, long-term storage under refrigeration and long-term storage under freezing. RESULTS: The method was linear from 20 to 300 mg/L. The detection and quantification limits were 3.6 mg/L and 20 mg/L, respectively. CONCLUSION: The method was precise, accurate and robust, and showed good recovery. The control samples were stable under all tested conditions. The method developed presented all the parameters necessary for quantifying acetaminophen in human plasma or serum samples in emergency analyses. Furthermore, it is a simple, fast and cost-effective technique.
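    The ICH-style validation figures reported here (linearity, detection and quantification limits) can be derived from calibration residuals. The sketch below assumes hypothetical absorbance readings at 430 nm; it is not the study's data.

    ```python
    import numpy as np

    # Hypothetical calibration data: concentration (mg/L) vs absorbance at 430 nm.
    conc = np.array([20, 50, 100, 150, 200, 300], dtype=float)
    absorbance = np.array([0.051, 0.128, 0.257, 0.379, 0.512, 0.760])

    slope, intercept = np.polyfit(conc, absorbance, 1)
    residuals = absorbance - (slope * conc + intercept)
    sd = residuals.std(ddof=2)    # residual standard deviation of the fit

    lod = 3.3 * sd / slope        # ICH detection limit
    loq = 10.0 * sd / slope       # ICH quantification limit
    print(f"LOD ~ {lod:.1f} mg/L, LOQ ~ {loq:.1f} mg/L")
    ```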

  5. Accelerating time to benefit

    DEFF Research Database (Denmark)

    Svejvig, Per; Geraldi, Joana; Grex, Sara

    Despite the ubiquitous pressure for speed, our approaches to accelerate projects remain constrained to the old-fashioned understanding of the project as a vehicle to deliver products and services, not value. This article explores an attempt to accelerate time to benefit. We describe and deconstruct...... of the time. Although all cases valued speed and speed to benefit, and implemented most practices proposed by the methodology, only three of the five projects were more successful in decreasing time to speed. Based on a multi-case study comparison between these five different projects and their respective...

  6. Tourism Methodologies

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different...... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt...

  7. THE BENEFITS OF NEUROECONOMICS

    Directory of Open Access Journals (Sweden)

    Michał Krawczyk

    2011-06-01

    This paper serves as a brief introduction to the methods, results and problems of the new interdisciplinary field of neuroeconomics (and its relatives). The focus is on the practical benefits that may result from it for the economics profession. These primarily involve the possibility of setting new promising research directions and providing novel tools that raise hopes of enabling direct observation of human preferences. The author also discusses the methodological and ethical challenges that neuroeconomics is, or will soon be, facing

  8. Co-benefits of private investment in climate change mitigation and adaptation in developing countries. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bystricky, E.; Gilbert, A.; Klaus, S.; Rordorf, J. [Ecofys Group, Utrecht (Netherlands); Ward, M. [GtripleC, Wellington (New Zealand)

    2010-11-15

    The aim of this report is to inform the international community of the potential benefits for development that can be gained from adding private sector finance to public finance for climate change mitigation and adaptation. Specifically, it considers whether, in addition to helping to reduce emissions, leveraging private finance through public-private financing mechanisms can result in other benefits that may not be achieved through public financing alone. These include, among others, access to electricity for the poorest communities from off-grid renewable electricity investments, new jobs, and the transfer and development of skills and expertise. An initial literature review suggests that there has been little quantification of the developmental co-benefits of private investment, and little methodology available to estimate the additional benefits that may result. The purpose of this document is to address this analytical gap. Without a clear understanding of the co-benefits, developing countries will continue to view private finance as less important than public finance. This may act as a barrier to their enjoying the developmental benefits of private investment. Section 2 defines co-benefits and their link to private sector finance. Section 3 presents the methodology needed to help quantify these co-benefits, and section 4 presents some numbers based on projects and case studies. Forestry and adaptation have been looked at specifically, with results presented in section 5. Co-benefits can also carry risks, and there may be pre-conditions for them to be realised, as discussed in section 6. Section 7 gives conclusions and further steps needed. Appendices A and B cover general aspects of methodology and job creation.

  9. A METHODOLOGY FOR DEVELOPING A ROADMAP TOWARDS LOCAL LOW-CARBON SOCIETY CONSIDERING IMPLEMENTATION COST

    Science.gov (United States)

    Gomi, Kei; Kim, Jaegyu; Matsuoka, Yuzuru

    We have developed a methodology for developing roadmaps towards a low-carbon society in local governments. A quantification tool called the "Backcasting Tool" (BCT) was developed. BCT estimates the implementation schedule of all policies and actions, considering their relationships, the financial constraints of the actors, and the co-benefits of the policies. The methodology was applied in Shiga prefecture, Japan, and a roadmap consisting of more than 240 policies was estimated, considering direct costs paid by the public and private sectors. As a result, the cumulative implementation cost was 7.3 trillion yen, of which the public sector bears 17%. The cumulative emission reduction was 101 MtCO2, and the average emission reduction cost was 73 thousand yen/tCO2.

  10. Socioeconomic benefits

    African Journals Online (AJOL)

    User

    perception on the benefits of shade trees in coffee production systems in southwestern part of Ethiopia. ... with growing coffee without shade tree plants that included stunted growth which ultimately ...... coffee producers' price risk. J. Inter.

  11. Medicaid Benefits

    Science.gov (United States)


  12. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  13. Cost-benefit analysis of space technology

    Science.gov (United States)

    Hein, G. F.; Stevenson, S. M.; Sivo, J. N.

    1976-01-01

    A discussion of the implications and problems associated with the use of cost-benefit techniques is presented. Knowledge of these problems is useful in structuring a decision-making process. A methodology of cost-benefit analysis is presented for the evaluation of space technology, and its use is demonstrated with an evaluation of ion thrusters for north-south stationkeeping aboard geosynchronous communication satellites. A critique of the concept of consumer surplus for measuring benefits is also presented.

  14. Introduction to uncertainty quantification

    CERN Document Server

    Sullivan, T J

    2015-01-01

    Uncertainty quantification is a topic of increasing practical importance at the intersection of applied mathematics, statistics, computation, and numerous application areas in science and engineering. This text provides a framework in which the main objectives of the field of uncertainty quantification are defined, and an overview of the range of mathematical methods by which they can be achieved. Complete with exercises throughout, the book will equip readers with both theoretical understanding and practical experience of the key mathematical and algorithmic tools underlying the treatment of uncertainty in modern applied mathematics. Students and readers alike are encouraged to apply the mathematical methods discussed in this book to their own favourite problems to understand their strengths and weaknesses, which also makes the text suitable for self-study. It is designed as an introduction to uncertainty quantification for senior undergraduate and graduate students with a mathematical or statistical back...

  15. Identification and Quantification of Protein Glycosylation

    Directory of Open Access Journals (Sweden)

    Ziv Roth

    2012-01-01

    Glycosylation is one of the most abundant posttranslational modifications of proteins, and accumulating evidence indicates that the vast majority of proteins in eukaryotes are glycosylated. Glycosylation plays a role in protein folding, interaction, stability, and mobility, as well as in signal transduction. Thus, by regulating protein activity, glycosylation is involved in the normal functioning of the cell and in the development of diseases. Indeed, in the past few decades there has been a growing realization of the importance of protein glycosylation, as aberrant glycosylation has been implicated in metabolic, neurodegenerative, and neoplastic diseases. Thus, the identification and quantification of protein-borne oligosaccharides have become increasingly important both in the basic sciences of biochemistry and glycobiology and in the applied sciences, particularly biomedicine and biotechnology. Here, we review the state-of-the-art methodologies for the identification and quantification of oligosaccharides, specifically of N- and O-glycosylated proteins.

  16. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system that countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  17. Multifractal methodology

    CERN Document Server

    Salat, Hadrien; Arcaute, Elsa

    2016-01-01

    Various methods have been developed independently to study the multifractality of measures in many different contexts. Although they all convey the same intuitive idea of giving a "dimension" to sets where a quantity scales similarly within a space, they are not necessarily equivalent on a more rigorous level. This review article aims at unifying the multifractal methodology by presenting the multifractal theoretical framework and principal practical methods, namely the moment method, the histogram method, multifractal detrended fluctuation analysis (MDFA) and modulus maxima wavelet transform (MMWT), with a comparative and interpretative eye.

  18. Risk assessment for benefits analysis: framework for analysis of a thyroid-disrupting chemical.

    Science.gov (United States)

    Axelrad, Daniel A; Baetcke, Karl; Dockins, Chris; Griffiths, Charles W; Hill, Richard N; Murphy, Patricia A; Owens, Nicole; Simon, Nathalie B; Teuschler, Linda K

    Benefit-cost analysis is of growing importance in developing policies to reduce exposures to environmental contaminants. To quantify health benefits of reduced exposures, economists generally rely on dose-response relationships estimated by risk assessors. Further, to be useful for benefits analysis, the endpoints that are quantified must be expressed as changes in incidence of illnesses or symptoms that are readily understood by and perceptible to the layperson. For most noncancer health effects and for nonlinear carcinogens, risk assessments generally do not provide the dose-response functions necessary for economic benefits analysis. This article presents the framework for a case study that addresses these issues through a combination of toxicology, epidemiology, statistics, and economics. The case study assesses a chemical that disrupts proper functioning of the thyroid gland, and considers the benefits of reducing exposures in terms of both noncancer health effects (hypothyroidism) and thyroid cancers. The effects are presumed to be due to a mode of action involving interference with thyroid-pituitary functioning that would lead to nonlinear dose response. The framework integrates data from animal testing, statistical modeling, human data from the medical and epidemiological literature, and economic methodologies and valuation studies. This interdisciplinary collaboration differs from the more typical approach in which risk assessments and economic analyses are prepared independently of one another. This framework illustrates particular approaches that may be useful for expanded quantification of adverse health effects, and demonstrates the potential of such interdisciplinary approaches. Detailed implementation of the case study framework will be presented in future publications.
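    The framework described here hinges on a nonlinear dose-response function whose output feeds an economic valuation step. The sketch below is a generic stand-in: a Hill-type curve with entirely hypothetical parameters and a hypothetical unit value per avoided case, not the case study's actual dose-response or valuation functions.

    ```python
    import math

    def hill_response(dose, top, ec50, n):
        """Generic nonlinear (Hill-type) dose-response curve; stands in for
        the threshold-like thyroid-pituitary mode of action described above."""
        return top * dose**n / (ec50**n + dose**n)

    def annual_benefit(dose_before, dose_after, population, value_per_case):
        """Monetized benefit of an exposure reduction: change in incidence
        times exposed population times unit value (e.g. willingness to pay
        or cost of illness). All parameters here are hypothetical."""
        delta_risk = (hill_response(dose_before, 0.01, 5.0, 2.0)
                      - hill_response(dose_after, 0.01, 5.0, 2.0))
        return delta_risk * population * value_per_case

    print(f"benefit ~ ${annual_benefit(4.0, 2.0, 1_000_000, 50_000):,.0f}/yr")
    ```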

  19. Validación de la metodología analítica para cuantificación de Selenio en alimentos de la canasta básica del costarricense Validation of analytical methodology for quantification of selenium in the food basket of the Costa Rican

    Directory of Open Access Journals (Sweden)

    Paulina Silva Trejos

    2011-06-01

    Costa Rican basic consumption by atomic absorption spectroscopy with hydride generation. Materials and methods: The foods were processed according to the pattern of consumption in Costa Rica. Once cooked, samples were taken for moisture determination and for microwave digestion. Quantification was performed on the digested samples by atomic absorption spectroscopy with hydride generation, using 0.6% sodium borohydride in 0.5% sodium hydroxide as reducing agent and 10 mol/L hydrochloric acid. The samples and the standard solutions for the calibration curve were acidified with concentrated hydrochloric acid to pH 1 and heated to 70-90 °C for 15 minutes to ensure the +4 oxidation state. Validation of the methodology was carried out with selenium standard solutions traceable to NIST® and traceable certified reference materials. Discussion: Quantification was performed over the concentration range of 1.3 to 50 μg/L, with a correlation coefficient of 0.9984; the detection and quantification limits were (1.3 ± 0.2) μg/L and (2.2 ± 0.2) μg/L, respectively. Precision was evaluated in terms of repeatability and reproducibility; the results, calculated as standard deviations, were 0.7 μg/L and 0.9 μg/L, respectively. Trueness was determined using a NIST® certified standard, SRM 1846 Infant Formula; the bias was -2.5%. The foods containing detectable and measurable selenium were: white rice, early-maturing rice, rolled oats, sweet chili, mature peas, beef liver, egg yolk, lentils, tripe, bread, pasta, cream cheese, mozzarella cheese, tilapia fillet, peeled carrot

  20. External Costs and Benefits of Energy. Methodologies, Results and Effects on Renewable Energies Competitivity; Costes y Beneficios Externos de la Energia. Metodologias, Resultados e Influencia sobre la Competitividad de las Energias Renovables

    Energy Technology Data Exchange (ETDEWEB)

    Saez, R.; Cabal, H.; Varela, M. [CIEMAT. Madrid (Spain)

    1999-09-01

    This study attempts to give a summarised view of the concept of externality in energy production, and of the social and economic usefulness of its evaluation and consideration as support for political decision-making in matters of environmental regulation, technology selection for new plants, priority setting in energy plans, etc. The most relevant environmental externalities are described, such as the effects on health, ecosystems, materials and climate, as well as some of the socioeconomic externalities such as employment, growth of GDP and the reduction and depletion of energy resources. The different methodologies used in recent years are reviewed, as are the principal results obtained in the most relevant international studies on this topic. The European study 'National Implementation of the ExternE Methodology in the EU' deserves special mention; the results obtained are presented in Table 2 of this study. The results obtained in the evaluation of the environmental externalities of the Spanish electrical system, as a function of the fuel cycle, are also summarised. In this last case the results are more approximate, since they were obtained by extrapolation from those for ten representative plants geographically distributed across the Peninsula. Finally, the influence that internalising the external costs of conventional energies can have on the competitiveness and market position of renewable energies, which cause fewer environmental effects and therefore produce much smaller external costs, is analysed. The mechanisms of internalisation, and whether or not external costs should be incorporated into the price of energy, are also discussed. (Author) 30 refs.

  1. Methodological advances

    Directory of Open Access Journals (Sweden)

    Lebreton, J.-D.

    2004-06-01

    The study of population dynamics has long depended on methodological progress. Among many striking examples, continuous time models for populations structured in age (Sharpe & Lotka, 1911) were made possible by progress in the mathematics of integral equations. Therefore the relationship between population ecology and mathematical and statistical modelling in the broad sense raises a challenge in interdisciplinary research. After the impetus given in particular by Seber (1982), the regular biennial EURING conferences became a major vehicle to achieve this goal. It is thus not surprising that EURING 2003 included a session entitled "Methodological advances". Even at the risk of heterogeneity in the topics covered and of overlap with other sessions, such a session was a logical way of ensuring that recent and exciting new developments were made available for discussion, further development by biometricians and use by population biologists. The topics covered included several to which full sessions were devoted at EURING 2000 (Anderson, 2001), such as individual covariates, Bayesian methods, and multi-state models. Some other topics (heterogeneity models, exploited populations and integrated modelling) had been addressed by contributed talks or posters. Their presence among "methodological advances", as well as in other sessions of EURING 2003, was intended as a response to their rapid development and potential relevance to biological questions. We briefly review all talks here, including those not published in the proceedings. In the plenary talk, Pradel et al. (in prep.) developed GOF tests for multi-state models. Until recently, the only goodness-of-fit procedures for multistate models were ad hoc, and non-optimal, involving the use of standard tests for single-state models (Lebreton & Pradel, 2002). Pradel et al. (2003) proposed a general approach based in particular on mixtures of multinomial distributions. Pradel et al. (in prep.) showed

  2. Who benefits?

    DEFF Research Database (Denmark)

    Hjorth, Frederik Georg

    2016-01-01

    Cross-border welfare rights for citizens of European Union member states are intensely contested, yet there is limited research into voter opposition to such rights, sometimes denoted ‘welfare chauvinism’. We highlight an overlooked aspect in scholarly work: the role of stereotypes about...... beneficiaries of cross-border welfare. We present results from an original large-scale survey experiment (N=2525) among Swedish voters, randomizing exposure to cues about recipients' country of origin and family size. Consistent with a model emphasizing the role of stereotypes, respondents react to cues about...... recipient identity. These effects are strongest among respondents high in ethnic prejudice and economic conservatism. The findings imply that stereotypes about who benefits from cross-border welfare rights condition public support for those rights....

  3. Brief review of uncertainty quantification for particle image velocimetry

    Science.gov (United States)

    Farias, M. H.; Teixeira, R. S.; Koiller, J.; Santos, A. M.

    2016-07-01

    Metrological studies of particle image velocimetry (PIV) are recent in the literature, and attempts to evaluate the uncertainty quantification (UQ) of PIV velocity fields are in evidence. A short review of the main sources of uncertainty in PIV and the available methodologies for their quantification is therefore presented. In addition, the potential of some mathematical techniques coming from the area of geometric mechanics and control, which could interest the fluids UQ community, is highlighted. "We must measure what is measurable and make measurable what cannot be measured" (Galileo)

  4. Feminist methodologies and engineering education research

    Science.gov (United States)

    Beddoes, Kacey

    2013-03-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory. The paper begins with a literature review that covers a broad range of topics featured in the literature on feminist methodologies. Next, data from interviews with engineering educators and researchers who have engaged with feminist methodologies are presented. The ways in which feminist methodologies shape their research topics, questions, frameworks of analysis, methods, practices and reporting are each discussed. The challenges and barriers they have faced are then discussed. Finally, the benefits of further and broader engagement with feminist methodologies within the engineering education community are identified.

  5. Development of economic consequence methodology for process risk analysis.

    Science.gov (United States)

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
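    Of the loss shapes the abstract names, the inverted normal loss function is the simplest to sketch: zero loss at the process target, rising smoothly towards a maximum as the deviation grows. The parameters below (target, shape parameter, maximum loss) are hypothetical, and this is only one of the loss types the methodology integrates.

    ```python
    import math

    def inverted_normal_loss(x, target, gamma, max_loss):
        """Inverted normal loss function: relates a process deviation from
        its target to an economic loss, saturating at max_loss."""
        return max_loss * (1.0 - math.exp(-((x - target) ** 2) / (2.0 * gamma ** 2)))

    # Hypothetical scenario: reactor temperature deviating from a 350 K target.
    for temp in (350.0, 360.0, 380.0, 420.0):
        print(temp, round(inverted_normal_loss(temp, 350.0, 25.0, 1e6)))
    ```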

  6. Stereo-particle image velocimetry uncertainty quantification

    Science.gov (United States)

    Bhattacharya, Sayantan; Charonko, John J.; Vlachos, Pavlos P.

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data, and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric
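    The final combination step, planar uncertainty plus calibration-derived uncertainty, reduces to first-order root-sum-square propagation if the error sources are treated as uncorrelated. The sketch below is a generic stand-in for the paper's propagation equation, with hypothetical per-component values.

    ```python
    import numpy as np

    # Hypothetical per-component planar uncertainties (px) and the contribution
    # from stereo calibration/registration error mapped through the calibration
    # sensitivity -- illustrative values, not the paper's formulation.
    u_planar = np.array([0.060, 0.055, 0.090])        # u, v, w components
    u_from_calibration = np.array([0.020, 0.020, 0.045])

    # Root-sum-square combination assumes the error sources are uncorrelated.
    u_total = np.sqrt(u_planar**2 + u_from_calibration**2)
    print(u_total)
    ```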

  7. Cost benefit analysis for climate change adaption

    NARCIS (Netherlands)

    Ierland, van E.C.; Weikard, H.P.; Wesseler, J.H.H.; Groeneveld, R.A.; Ansink, E.J.H.; Bruin, de K.; Rietveld, P.; Bockarjova, M.; Hofkes, M.; Brouwer, R.; Dekker, T.

    2012-01-01

    The focus of this programme was on the development of decision-making tools based on cost-benefit analysis under uncertainty, for analysing adaptation and mitigation options related to spatial planning in the Netherlands. The full programme focused on the methodological issues for cost-benefit analysis

  8. Uncertainty Quantification in Climate Modeling

    Science.gov (United States)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
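    The Polynomial Chaos step mentioned above can be illustrated in one dimension. The sketch below fits a non-intrusive PCE surrogate with a probabilists' Hermite basis to a toy model of a single standard-normal parameter; the real CLM application involves many uncertain inputs and sparse, expensive runs, which this sketch does not capture.

    ```python
    import numpy as np
    from math import factorial
    from numpy.polynomial.hermite_e import hermevander

    def fit_pce(model, order=4, n_samples=2000, seed=0):
        """Non-intrusive polynomial chaos: regress sampled model outputs onto
        probabilists' Hermite polynomials of a standard-normal input xi."""
        rng = np.random.default_rng(seed)
        xi = rng.standard_normal(n_samples)
        basis = hermevander(xi, order)                # (n_samples, order + 1)
        coeffs, *_ = np.linalg.lstsq(basis, model(xi), rcond=None)
        return coeffs

    # Toy stand-in for an expensive model with one uncertain parameter:
    model = lambda xi: np.exp(0.3 * xi) + 0.1 * xi**2
    c = fit_pce(model)

    # Moments follow directly from the coefficients: E[f] = c_0 and, for the
    # (unnormalized) HermiteE basis, Var[f] = sum_k k! * c_k^2 over k >= 1.
    mean = c[0]
    var = sum(factorial(k) * c[k]**2 for k in range(1, len(c)))
    print(f"mean ~ {mean:.4f}, variance ~ {var:.4f}")
    ```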

  9. A remark on collective quantification

    NARCIS (Netherlands)

    Kontinen, J.; Szymanik, J.

    2008-01-01

    We consider collective quantification in natural language. For many years the common strategy in formalizing collective quantification has been to define the meanings of collective determiners, quantifying over collections, using certain type-shifting operations. These type-shifting operations, i.e.

  10. Exercise: Benefits of Exercise

    Medline Plus


  11. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    useful in quantifying disease severity, they require extensive clinical experience and carry a risk of subjectivity. We explore the opportunity to use in vivo near-infrared (NIR) spectra as an objective and noninvasive method for local disease severity assessment in 31 psoriasis patients in whom...... selected plaques were scored clinically. A partial least squares (PLS) regression model was used to analyze and predict the severity scores from the NIR spectra of psoriatic and uninvolved skin. The correlation between predicted and clinically assigned scores was R=0.94 (RMSE=0.96), suggesting that in vivo...... NIR provides accurate clinical quantification of psoriatic plaques. Hence, NIR may be a practical solution to clinical severity assessment of psoriasis, providing a continuous, linear, numerical value of severity....
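    A PLS regression of severity scores on spectra, with cross-validated R and RMSE of the kind reported above, can be sketched as follows; the data here are synthetic stand-ins, not the study's 31-patient NIR measurements.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Hypothetical stand-ins: 31 patients, 200 NIR wavelength channels,
    # and clinically assigned severity scores.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(31, 200))
    y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=31)   # synthetic scores

    pls = PLSRegression(n_components=3)
    y_pred = cross_val_predict(pls, X, y, cv=5).ravel()    # cross-validated
    r = np.corrcoef(y, y_pred)[0, 1]
    rmse = np.sqrt(np.mean((y - y_pred) ** 2))
    print(f"R = {r:.2f}, RMSE = {rmse:.2f}")
    ```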

  12. Semiautomatic quantification of angiogenesis.

    Science.gov (United States)

    Boettcher, Markus; Gloe, Torsten; de Wit, Cor

    2010-07-01

    Angiogenesis is of major interest in developmental biology and cancer research. Different experimental approaches are available to study angiogenesis, all of which require microscopy, image acquisition, and analysis. Problems encountered include the size of the structures, which requires the generation of composite images, and the difficulty of quantifying angiogenic activity reliably and rapidly. Most graphics software packages lack some of the functions required for easy, semiautomatic quantification of angiogenesis, so multiple software packages or expensive programs have to be used to cover all necessary functions. A software package (AQuaL) to analyze angiogenic activity was developed in Java and can therefore be used platform-independently. It includes image acquisition relying on the Java Media Framework and an easy-to-use image alignment tool. Multiple overlapping images can be aligned and saved, without limitations or loss of resolution, into a composite image; this requires only the selection of a single point representing a characteristic structure in adjacent images. Angiogenic activity can be quantified semiautomatically in composite images by assessing the area overgrown by cells after filtering and image binarization. In addition, tagging of capillary-like structures allows quantification of their length and branching pattern. Both methods deliver reliable, correlated data, as exemplified in the aortic ring angiogenesis assay. The developed software provides modular functions specifically targeted at quantifying angiogenesis. Whereas the area measurement is time-saving, the length measurement provides additional information about branching patterns, which is required for a qualitative differentiation of capillary growth. (c) 2010 Elsevier Inc. All rights reserved.
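    The area-based measure, the fraction of the plot overgrown by cells after binarization, is simple to sketch. The AQuaL tool itself is Java-based; the illustration below is in Python and uses a crude global threshold as an assumption, whereas the real pipeline filters the image first.

    ```python
    import numpy as np

    def overgrown_area_fraction(image, threshold=None):
        """Binarize a grayscale micrograph and report the fraction of the
        plot area covered by cell structures (the area-based measure)."""
        if threshold is None:
            threshold = image.mean()   # crude default; filter the image first
        return (image > threshold).mean()

    # Synthetic stand-in for a composite image: a bright region on black.
    img = np.zeros((512, 512))
    img[100:300, 50:400] = 1.0
    print(f"covered area: {overgrown_area_fraction(img):.1%}")
    ```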

  13. Cytomegalovirus quantification: where to next in optimising patient management?

    Science.gov (United States)

    Atkinson, Claire; Emery, Vincent C

    2011-08-01

    Over the years, quantification of cytomegalovirus (HCMV) load in blood has become a mainstay of clinical management, helping direct the deployment of antiviral therapy, assess response to therapy and highlight cases of drug resistance. The review focuses on a brief historical perspective of HCMV quantification and the ways in which viral load is being used to improve patient management, drawing on the published literature and also personal experience at the Royal Free Hospital. Quantification of HCMV is essential for efficient patient management. The ability to use real-time quantitative PCR to drive pre-emptive therapy has improved patient management after transplantation, although the threshold viral loads for deployment differ between laboratories; the field would benefit from access to a universal standard for quantification. We see that HCMV quantification will continue to be central to delivering individualised patient management and facilitating multicentre trials of new antiviral agents and vaccines in a variety of clinical settings. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays a great number of infant formulas enriched with prebiotics are available on the market; however, there are scarce data about their composition. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination to quantify the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the contents of FOS, GOS and GOS/FOS were in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) of up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics.

  15. Validación de la metodología de cuantificación del magnesio por espectroscopia de absorción atómica de llama en la canasta básica de Costa Rica Validation methodology of quantification of magnesium by atomic flame spectroscopy in the basic food basket of Costa Rica

    Directory of Open Access Journals (Sweden)

    Paulina Silva Trejos

    2010-06-01

    Objective: Analytical methodology was validated for the determination of magnesium in foods by atomic absorption flame spectroscopy. Materials and methods: Foods that are eaten cooked were prepared in the usual manner of consumption in Costa Rica, and samples were taken for moisture determination. Digestion of the samples was performed in a microwave oven, and the magnesium content was measured by atomic absorption flame spectroscopy. Validation of the methodology was carried out using magnesium standard solutions traceable to NIST® and traceable certified reference materials. Results: The linearity range was 0.021 to 0.65 mg/L, with a correlation coefficient of 0.992. The detection and quantification limits obtained were 0.021 mg/L and 0.037 mg/L, respectively. The calibration sensitivity was 0.688 A·L/mg and the analytical sensitivity was 215 L/mg. Precision was evaluated in terms of reproducibility; the RSDr value was 1.1%. Trueness was determined using three NIST® certified standards: SRM 1846 Infant Formula, with a reported magnesium value of (538 ± 29) mg/kg; RM 8414 Bovine Muscle Powder, with a reported magnesium value of (960 ± 95) mg/kg; and RM 8415 Whole Egg Powder, with a reported magnesium value of (305 ± 27) mg/kg by mass. In the three cases the bias obtained, on average, was -0.0014 mg/L. Discussion: 46 foods selected from the Costa Rican food basket were analyzed, fresh or cooked in the usual way of consumption, without added additives. Foods containing magnesium were: legumes (chickpeas, lentils, black and red beans), grains and cereals (white and pre-cooked rice), dairy products (whole milk powder, mozzarella cheese, fresh cheese), vegetables (sweet potato, garlic, sweet plantain), meats (tilapia, chicken breast, loin and beef liver), fruits

  16. Uncertainty Quantification in Hybrid Dynamical Systems

    CERN Document Server

    Sahai, Tuhin

    2011-01-01

    Uncertainty quantification (UQ) techniques are frequently used to ascertain output variability in systems with parametric uncertainty. Traditional algorithms for UQ are either system-agnostic and slow (such as Monte Carlo) or fast with stringent assumptions on smoothness (such as polynomial chaos and Quasi-Monte Carlo). In this work, we develop a fast UQ approach for hybrid dynamical systems by extending the polynomial chaos methodology to these systems. To capture discontinuities, we use a wavelet-based Wiener-Haar expansion. We develop a boundary layer approach to propagate uncertainty through separable reset conditions. We also introduce a transport theory based approach for propagating uncertainty through hybrid dynamical systems. Here the expansion yields a set of hyperbolic equations that are solved by integrating along characteristics. The solution of the partial differential equation along the characteristics allows one to quantify uncertainty in hybrid or switching dynamical systems. The above method...

  17. Uncertainty quantification in hybrid dynamical systems

    Science.gov (United States)

    Sahai, Tuhin; Pasini, José Miguel

    2013-03-01

    Uncertainty quantification (UQ) techniques are frequently used to ascertain output variability in systems with parametric uncertainty. Traditional algorithms for UQ are either system-agnostic and slow (such as Monte Carlo) or fast with stringent assumptions on smoothness (such as polynomial chaos and Quasi-Monte Carlo). In this work, we develop a fast UQ approach for hybrid dynamical systems by extending the polynomial chaos methodology to these systems. To capture discontinuities, we use a wavelet-based Wiener-Haar expansion. We develop a boundary layer approach to propagate uncertainty through separable reset conditions. We also introduce a transport theory based approach for propagating uncertainty through hybrid dynamical systems. Here the expansion yields a set of hyperbolic equations that are solved by integrating along characteristics. The solution of the partial differential equation along the characteristics allows one to quantify uncertainty in hybrid or switching dynamical systems. The above methods are demonstrated on example problems.

  18. Safety impact--the risk/benefits of functional foods.

    Science.gov (United States)

    Pascal, Gérard

    2009-12-01

    It is striking to see how much the approach to food risk analysis has evolved in recent years. For half a century, since the birth of risk assessment methodology in the food domain, only 'no appreciable health risk' was considered acceptable by the risk manager. This is the vocabulary used in the case of a voluntary, deliberate human action, such as the use of food additives (the definition of the ADI). In the case of risks not resulting from such an action, such as the presence of contaminants, the risk assessor allocates provisional tolerable daily, weekly or monthly intakes that are the basis for regulation. This vocabulary is in agreement with the objective of approaching as closely as possible the zero risk that a majority of consumers wish for. Some years ago, risk managers began insisting that assessors provide a quantitative risk evaluation as often as possible. More recently still, managers have wanted to decide on the basis of a balance of risk and benefit acceptable for management purposes. Finally, they hope that general principles and tools will become available for conducting a quantitative risk-benefit analysis for foods and food ingredients. What is possible in the case of functional foods (FF)? Based on the definition of FF proposed in the FUFOSE programme, one has to distinguish between different situations in order to assess the risk: that of a micro-component, that of a macro-component, or that of a whole food. These situations have been clearly described in the document resulting from FOSIE. The standardized methodology relevant to assessing micro-components is not well adapted to the assessment of whole foods; the concepts of substantial equivalence and history of safe use could be useful tools in this case. However, quantitative risk assessment remains a very difficult exercise. Although a process for the assessment of the health benefits of FF has been proposed as an outcome of the PASSCLAIM action, the quantification of this benefit needs adequate tools

  19. A Paradigm for Spreadsheet Engineering Methodologies

    CERN Document Server

    Grossman, Thomas A

    2008-01-01

    Spreadsheet engineering methodologies are diverse and sometimes contradictory. It is difficult for spreadsheet developers to identify a spreadsheet engineering methodology that is appropriate for their class of spreadsheet, with its unique combination of goals, type of problem, and available time and resources. There is a lack of well-organized, proven methodologies with known costs and benefits for well-defined spreadsheet classes. It is difficult to compare and critically evaluate methodologies. We present a paradigm for organizing and interpreting spreadsheet engineering recommendations. It systematically addresses the myriad choices made when developing a spreadsheet, and explicitly considers resource constraints and other development parameters. This paradigm provides a framework for evaluation, comparison, and selection of methodologies, and a list of essential elements for developers or codifiers of new methodologies. This paradigm identifies gaps in our knowledge that merit further research.

  20. Office Automation Cost/Benefit Evaluation: A Methodology.

    Science.gov (United States)

    1986-09-01

    called the "reactive effect of measurement" bias by Campbell (1957). Selltiz et al. (1959) wrote: If people feel that they are "guinea pigs" being ... "good impression" as described above by Selltiz. There are at least two reasons for this. First, the survey has been directed by the Deputy Under ... C4C, Subject: Administrative Modernization, July 20, 1979. Selltiz, C., and others, Research Methods in Social Relations, Holt, Rinehart & Winston

  1. Technical Report on Methodology: Cost Benefit Analysis and Policy Responses

    NARCIS (Netherlands)

    Pearce DW; Howarth A; MNV

    2001-01-01

    The economic assessment of priorities for a European environmental policy plan focuses on twelve identified Prominent European Environmental Problems such as climate change, chemical risks and biodiversity. The study, commissioned by the European Commission (DG Environment) to a European consortium

  2. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structure (reflecting the categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases be interpreted as lexical quantifiers as well. Thus Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that the scope remains constant in all derived alternations. The paper addresses the basic issues of these complex problems, while the detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  3. Uncertainty Quantification in Aeroelasticity

    Science.gov (United States)

    Beran, Philip; Stanford, Bret; Schrock, Christopher

    2017-01-01

    Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.

  4. Big Data Analytics Methodology in the Financial Industry

    Science.gov (United States)

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  5. Christiansen Revisited: Rethinking Quantification of Uniformity in Rainfall Simulator Studies

    Science.gov (United States)

    Green, Daniel; Pattison, Ian

    2016-04-01

    Rainfall simulators, whether based within a laboratory or field setting, are used extensively within a number of fields of research, including plot-scale runoff, infiltration and erosion studies, irrigation and crop management, and scaled investigations into urban flooding. Rainfall simulators offer a number of benefits, including the ability to create regulated and repeatable rainfall characteristics (e.g. intensity, duration, drop size distribution and kinetic energy) without relying on unpredictable natural precipitation regimes. Ensuring and quantifying spatially uniform simulated rainfall across the entirety of the plot area is of particular importance to researchers undertaking rainfall simulation. As a result, numerous studies have focused on the quantification and improvement of uniformity values. Several statistical methods for the assessment of rainfall simulator uniformity have been developed; however, the Christiansen Uniformity Coefficient (CUC) suggested by Christiansen (1942) is most frequently used. Despite this, there is no set methodology, and researchers can adapt or alter factors such as the quantity, spacing, distance and location of the measuring beakers used to derive CUC values. Because CUC values are highly sensitive to the resolution of the data, i.e. the number of observations taken, many densely distributed measuring containers subjected to the same experimental conditions may generate a significantly lower CUC value than fewer, more sparsely distributed measuring containers. Thus, the simulated rainfall under a higher-resolution sampling method could appear less uniform than under a coarser-resolution sampling method, despite being derived from the same initial rainfall conditions. Expressing entire plot uniformity as a single, simplified percentage value disregards valuable qualitative information about plot uniformity, such as the small-scale spatial distribution of rainfall over the plot surface and whether these…
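
    As an illustration of the sensitivity discussed above (not the authors' code), the minimal Python sketch below computes the Christiansen Uniformity Coefficient, CUC = 100 * (1 - sum|x_i - mean| / sum x_i); all catch-depth values are invented for the example:

        import numpy as np

        def christiansen_cuc(depths):
            # CUC (%) = 100 * (1 - sum(|x_i - mean|) / sum(x_i))
            x = np.asarray(depths, dtype=float)
            return 100.0 * (1.0 - np.abs(x - x.mean()).sum() / x.sum())

        # Denser sampling of the same simulated rainfall can lower the CUC:
        coarse = [10.2, 9.8, 10.1, 9.9]                       # 4 collectors
        dense = [10.2, 9.8, 10.1, 9.9, 8.5, 11.4, 9.2, 10.9]  # 8 collectors
        print(christiansen_cuc(coarse))  # 98.5
        print(christiansen_cuc(dense))   # 93.5: same rain, lower "uniformity"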

  6. Titan Science Return Quantification

    Science.gov (United States)

    Weisbin, Charles R.; Lincoln, William

    2014-01-01

    Each proposal for a NASA mission concept includes a Science Traceability Matrix (STM), intended to show that what is being proposed would contribute to satisfying one or more of the agency's top-level science goals. But the information traditionally provided cannot be used directly to quantitatively compare anticipated science return. We added numerical elements to NASA's STM and developed a software tool to process the data. We then applied this methodology to evaluate a group of competing concepts for a proposed mission to Saturn's moon, Titan.

  7. Environmental and Sustainability Education Policy Research: A Systematic Review of Methodological and Thematic Trends

    Science.gov (United States)

    Aikens, Kathleen; McKenzie, Marcia; Vaughter, Philip

    2016-01-01

    This paper reports on a systematic literature review of policy research in the area of environmental and sustainability education. We analyzed 215 research articles, spanning four decades and representing 71 countries, and which engaged a range of methodologies. Our analysis combines quantification of geographic and methodological trends with…

  9. Exercise: Benefits of Exercise

    Medline Plus

    Full Text Available Health Benefits: One of the healthiest things you can do for yourself is to be physically active on a regular basis. Studies have shown that exercise provides many health benefits and that older adults can gain a lot from regular physical activity.

  10. Benefits of Physical Activity

    Science.gov (United States)

    Physical activity has many health benefits, including benefits for your heart and lungs: physical activity strengthens your heart and improves lung function…

  11. Electronics Environmental Benefits Calculator

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Electronics Environmental Benefits Calculator (EEBC) was developed to assist organizations in estimating the environmental benefits of greening their purchase…

  12. Glycaemic index methodology.

    Science.gov (United States)

    Brouns, F; Bjorck, I; Frayn, K N; Gibbs, A L; Lang, V; Slama, G; Wolever, T M S

    2005-06-01

    The glycaemic index (GI) concept was originally introduced to classify different sources of carbohydrate (CHO)-rich foods, usually having an energy content of >80% from CHO, according to their effect on post-meal glycaemia. It was assumed to apply to foods that primarily deliver available CHO, causing hyperglycaemia. Low-GI foods were classified as being digested and absorbed slowly, and high-GI foods as being rapidly digested and absorbed, resulting in different glycaemic responses. Low-GI foods were found to induce benefits on certain risk factors for CVD and diabetes. Accordingly it has been proposed that GI classification of foods and drinks could be useful to help consumers make 'healthy food choices' within specific food groups. Classification of foods according to their impact on blood glucose responses requires a standardised way of measuring such responses. The present review discusses the most relevant methodological considerations and highlights specific recommendations regarding number of subjects, sex, subject status, inclusion and exclusion criteria, pre-test conditions, CHO test dose, blood sampling procedures, sampling times, test randomisation and calculation of the glycaemic response area under the curve. Altogether, these technical recommendations will help to implement or reinforce measurement of GI in laboratories and help to ensure the quality of results. Since there is current international interest in alternative ways of expressing glycaemic responses to foods, some of these methods are also discussed.
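
    The core calculation behind the GI protocol can be made concrete. The sketch below is a simplification of the full geometric method in the recommendations (which handles excursions below baseline segment by segment): here the incremental area under the curve (iAUC) is taken by the trapezoidal rule with negative increments clipped to zero, and all glucose values are invented:

        import numpy as np

        def incremental_auc(times, glucose):
            # Trapezoidal area above the fasting baseline; area below the
            # baseline is ignored (a common simplification).
            t = np.asarray(times, dtype=float)
            g = np.clip(np.asarray(glucose, dtype=float) - glucose[0], 0.0, None)
            return np.trapz(g, t)

        def glycaemic_index(t, test, reference):
            # GI = 100 * iAUC(test food) / iAUC(reference), same subject.
            return 100.0 * incremental_auc(t, test) / incremental_auc(t, reference)

        t = [0, 15, 30, 45, 60, 90, 120]              # minutes after the meal
        ref = [5.0, 7.2, 8.1, 7.6, 6.8, 5.9, 5.1]     # glucose drink, mmol/L
        test = [5.0, 6.1, 6.9, 6.7, 6.2, 5.7, 5.2]    # test food, mmol/L
        print(glycaemic_index(t, test, ref))          # ~65 for these numbers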

  13. Methodological Aspects Regarding The Organizational Stress Analysis

    Science.gov (United States)

    Irimie, Sabina; Pricope (Muntean), Luminiţa Doina; Pricope, Sorin; Irimie, Sabin Ioan

    2015-07-01

    This work presents methodological research on occupational stress analysis in the educational field, as part of a larger study. The objective is to identify significant relations between stressors and effects, that is, differences between the indicators of occupational stress among teaching staff in primary and gymnasium schools, taking note of each specific condition: the institution as an entity, the working community, the discipline being taught, the geographic and administrative district (urban/rural), and the quantification of the stress level.

  14. TRACG best estimate methodology applications

    Energy Technology Data Exchange (ETDEWEB)

    Hoang, H.

    2014-07-01

    The TRACG model simulates a multi-dimensional vessel and contains a flexible modular structure with control system capability. TRACG has undergone benchmarking qualifications with extensive testing and actual plant data. This best estimate methodology has been used widely in BWR safety analyses as well as in the qualification of the GEH advanced BWR designs. The application of TRACG methodology for loss of coolant accident (LOCA) analyses will provide realistic fuel bundle thermal responses. By taking advantage of these additional margins, the utility owner can justify the optimization of plant-specific emergency core cooling system performance requirements and justify certain equipment declared inoperable. The resulting benefits are improved plant capacity and reliability and improved equipment reliability and lifetime. (Author)

  15. Validation of analytical methodology by HPLC for quantification and stability evaluation of sodium pantoprazole

    Directory of Open Access Journals (Sweden)

    Renata Platcheck Raffin

    2007-08-01

    Full Text Available Pantoprazole is a proton pump inhibitor used in the treatment of digestive ulcers, gastro-esophageal reflux disease and in the eradication of Helicobacter pylori. In this work, an analytical method was developed and validated for the quantification of sodium pantoprazole by HPLC. The method was specific, linear, precise and accurate. In order to verify the stability of pantoprazole during dissolution assays, a pantoprazole solution in phosphate buffer pH 7.4 was kept at room temperature and protected from light for 22 days. Pantoprazole presented less than 5% degradation in 6 hours, and the half-life of the degradation was 124 h.
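
    A quick consistency check of the reported stability figures, assuming first-order degradation kinetics (our assumption for illustration; the paper's kinetic model may differ):

        import math

        t_half = 124.0                       # reported degradation half-life, hours
        k = math.log(2) / t_half             # first-order rate constant, 1/h
        degraded_6h = 100 * (1 - math.exp(-k * 6.0))
        print(degraded_6h)                   # ~3.3%, consistent with "< 5% in 6 h"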

  16. Methodologies for tracking learning paths

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth; Gilje, Øystein; Lindstrand, Fredrik

    2009-01-01

    …filmmakers: what furthers their interest and/or hinders it, and what learning patterns emerge. The aim of this article is to present and discuss issues regarding the methodology and methods of the study, such as developing a relationship with interviewees when conducting interviews online (using MSN). We suggest two considerations about using online interviews: how the interviewees value the given subject of conversation, and their familiarity with being online. Communicating online with the young filmmakers offers ease, because it is both practical and appropriates a meeting platform that is familiar to our participants.

  17. Organic and total mercury determination in sediments by cold vapor atomic absorption spectrometry: methodology validation and uncertainty measurements

    Directory of Open Access Journals (Sweden)

    Robson L. Franklin

    2012-01-01

    Full Text Available The purpose of the present study was to validate a method for organic Hg determination in sediment. The procedure for organic Hg was adapted from the literature: the organomercurial compounds were extracted with dichloromethane in acid medium, followed by destruction of the organic compounds with bromine chloride. Total Hg was determined according to USEPA methodology 3051A. Mercury quantification for both methodologies was then performed by CVAAS. Methodology validation was verified by analyzing certified reference materials for total Hg and methylmercury. The uncertainties for both methodologies were calculated. A quantification limit of 3.3 µg kg-1 was found for organic Hg by CVAAS.

  18. Uncertainty Quantification and Validation for RANS Turbulence Models

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2011-11-01

    Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models--including Baldwin-Lomax, Spalart-Allmaras, k- ɛ, k- ω, and v2- f--and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].
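
    To make the Bayesian calibration step concrete: the minimal sketch below calibrates a single coefficient of a toy surrogate model against synthetic "DNS" data using random-walk Metropolis sampling. It is a stand-in for the paper's approach, not its implementation, and every number in it is invented:

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic "DNS" data for a toy model y = a * x with a_true = 2.0.
        x = np.linspace(0.1, 1.0, 20)
        y_obs = 2.0 * x + rng.normal(0.0, 0.05, x.size)

        def log_post(a, sigma=0.05):
            # Gaussian likelihood, flat prior: log posterior up to a constant.
            r = y_obs - a * x
            return -0.5 * np.sum((r / sigma) ** 2)

        # Random-walk Metropolis over the uncertain model parameter a.
        a, samples = 1.0, []
        lp = log_post(a)
        for _ in range(20000):
            a_prop = a + rng.normal(0.0, 0.05)
            lp_prop = log_post(a_prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                a, lp = a_prop, lp_prop
            samples.append(a)

        post = np.array(samples[5000:])      # discard burn-in
        print(post.mean(), post.std())       # calibrated coefficient and its uncertainty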

  19. A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.

    MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), have demonstrated good performance, but not without drawbacks, as already discussed by the authors. On the other hand, a preliminary application of Genetic Algorithms (GA) to the peak detection problem encountered in MRS quantification using the Voigt line shape model has already been reported in the literature by the authors. This paper investigates a novel constrained genetic algorithm involving a generic, adaptively defined fitness function, which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals corrupted by noise, regarding its signal fitting capabilities. Although extensive experiments with real-world MRS signals are still necessary, the performance shown here illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.

  20. THE AGILE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Charul Deewan

    2012-09-01

    Full Text Available The technologies are numerous and software is the one which is most widely used. Some companies have their own customized methodology for developing their software, but the majority speak of two kinds of methodologies: Traditional and Agile methodologies. In this paper, we discuss some aspects of what Agile methodology is, how it can be used to get the best results from a project, and how to get it to work in an organization.

  1. QUANTIFICATION AND COSTING OF DOMESTIC ELECTRICITY GENERATION FOR ARMIDALE, NEW SOUTH WALES, AUSTRALIA UTILISING MICRO WIND TURBINES

    OpenAIRE

    Yasser Maklad

    2014-01-01

    In this study, a general overview of energy and renewable energy sources available in Australia was introduced, the household electricity situation in Australia was presented, and a focus on wind energy was adopted. A theoretical methodology for the quantification and costing of selected micro wind turbines was introduced. This methodology was applied to Armidale city, New South Wales (NSW), Australia as a case study. The methodology involved the use of a spreadsheet application and the HOMER software…

  2. Micro-RNA quantification using DNA polymerase and pyrophosphate quantification.

    Science.gov (United States)

    Yu, Hsiang-Ping; Hsiao, Yi-Ling; Pan, Hung-Yin; Huang, Chih-Hung; Hou, Shao-Yi

    2011-12-15

    A rapid quantification method for micro-RNA based on DNA polymerase activity and pyrophosphate quantification has been developed. The tested micro-RNA serves as the primer, unlike the DNA primer in all DNA sequencing methods, and the DNA probe serves as the template for DNA replication. After the DNA synthesis, pyrophosphate detection and quantification indicate the existence and quantity of the tested miRNA. Five femtomoles of the synthetic RNA could be detected. In 20-100 μg RNA samples purified from SiHa cells, hsa-miR-16 and hsa-miR-21 were measured at 0.34 fmol/μg RNA and 0.71 fmol/μg RNA, respectively, using the proposed assay. This simple and inexpensive assay takes less than 5 min after total RNA purification and preparation. The quantification is not affected by the pre-miRNA, which cannot serve as the primer for the DNA synthesis in this assay. The assay is general for the detection of a target RNA or DNA with a known matched DNA template probe, and could be widely used for the detection of small RNA, messenger RNA, RNA viruses, and DNA. Therefore, the method could be widely used in RNA and DNA assays.

  3. Language Policy and Methodology

    Science.gov (United States)

    Liddicoat, Antony J.

    2004-01-01

    The implementation of a language policy is crucially associated with questions of methodology. This paper explores approaches to language policy, approaches to methodology and the impact that these have on language teaching practice. Language policies can influence decisions about teaching methodologies either directly, by making explicit…

  4. Chromatic and anisotropic cross-recurrence quantification analysis of interpersonal behavior

    NARCIS (Netherlands)

    Cox, R.F.A; van der Steen, Stephanie; Guevara Guerrero, Marlenny; Hoekstra, Lisette; van Dijk, Marijn; Webber, Charles; Ioana, Cornel; Marwan, Norbert

    2016-01-01

    Cross-recurrence quantification analysis (CRQA) is a powerful nonlinear time-series method to study coordination and cooperation between people. This chapter concentrates on two methodological issues related to CRQA on categorical data streams, which are commonly encountered in the behavioral sciences.

  5. Sustainable Facility Development: Perceived Benefits and Challenges

    Science.gov (United States)

    Stinnett, Brad; Gibson, Fred

    2016-01-01

    Purpose: The purpose of this paper is to assess the perceived benefits and challenges of implementing sustainable initiatives in collegiate recreational sports facilities. Additionally, this paper intends to contribute to the evolving field of facility sustainability in higher education. Design/methodology/approach: The design included qualitative…

  6. Renewable Portfolio Standards: Costs and Benefits (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Bird, L.; Heeter, J.; Barbose, G.; Weaver, S.; Flores, F.; Kuskova-Burns, K.; Wiser, R.

    2014-10-01

    This report summarizes state-level RPS costs to date, and considers how those costs may evolve going forward given scheduled increases in RPS targets and cost containment mechanisms. The report also summarizes RPS benefits estimates, based on published studies for individual states and discusses key methodological considerations.

  8. MAMA Software Features: Quantification Verification Documentation-1

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  10. Economic benefits of metrology in manufacturing

    DEFF Research Database (Denmark)

    Savio, Enrico; De Chiffre, Leonardo; Carmignato, S.

    2016-01-01

    In streamlined manufacturing systems, the added value of inspection activities is often questioned, and metrology in particular is sometimes considered only as an avoidable expense. Documented quantification of the economic benefits of metrology is generally not available. This work presents concrete examples from industrial production, in which the added value of metrology in manufacturing is discussed and quantified. Case studies include: general manufacturing, forging, machining, and related metrology. The focus of the paper is on the improved effectiveness of metrology when used at the product and process design stages, as well as on the improved accuracy and efficiency of manufacturing through better measuring equipment and process chains with integrated metrology for process control.

  11. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    Science.gov (United States)

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomics reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
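
    As a hedged illustration of the normalization issues named above (AR gene length and total read count), and not the authors' pipeline, an RPKM-style abundance measure can be sketched as follows; the read counts and gene lengths are invented:

        def ar_gene_abundance(reads_mapped, gene_length_bp, total_reads):
            # Reads per kilobase of AR gene per million metagenomic reads:
            # corrects for both gene length and sequencing depth.
            return reads_mapped / (gene_length_bp / 1e3) / (total_reads / 1e6)

        # Same raw read count, different gene lengths -> different abundances:
        print(ar_gene_abundance(240, 1200, 8_000_000))  # 25.0
        print(ar_gene_abundance(240, 2400, 8_000_000))  # 12.5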

  12. Water Footprint Symposium: where next for water footprint and water assessment methodology?

    OpenAIRE

    Tillotson, MR; Liu, J.; Guan, D; Wu, P; X. Zhao; G. Zhang; Pfister, S.; Pahlow, M.

    2014-01-01

    Recognizing the need for a comprehensive review of the tools and metrics for the quantification and assessment of water footprints, and allowing for the opportunity for open discussion on the challenges and future of water footprinting methodology, an international symposium on water footprint was organized. The Water Footprint Symposium was held in December 2013 at the University of Leeds, UK. In particular, four areas were highlighted for discussion: water footprint and agriculture, quantification…

  13. Accurate Quantification of Lipid Species by Electrospray Ionization Mass Spectrometry — Meets a Key Challenge in Lipidomics

    Directory of Open Access Journals (Sweden)

    Kui Yang

    2011-11-01

    Full Text Available Electrospray ionization mass spectrometry (ESI-MS) has become one of the most popular and powerful technologies to identify and quantify individual lipid species in lipidomics. Meanwhile, quantitative analysis of lipid species by ESI-MS has also become a major obstacle to meeting the challenges of lipidomics. Herein, we discuss the principles, advantages, and possible limitations of different mass spectrometry-based methodologies for lipid quantification, as well as a few practical issues important for the accurate quantification of individual lipid species. Accordingly, accurate quantification of individual lipid species, one of the key challenges in lipidomics, can be practically met.

  14. Methodology for ranking restoration options

    Energy Technology Data Exchange (ETDEWEB)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128) co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to the optimisation of the protection of the populations being exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included and weighting factors for the different attributes have been determined by the use of scaling constants. (au)

  17. Medicare Hospice Benefits

    Science.gov (United States)

    This official government booklet from the Centers for Medicare & Medicaid Services includes information about Medicare hospice benefits: who is eligible for hospice care, what services are included in hospice…

  18. Benefits of Exercise

    Science.gov (United States)

    …start slowly, and find ways to fit more physical activity into your life. To get the most benefit… What are the health benefits of exercise? Regular exercise and physical activity may help you control your weight. Along with…

  19. Absolute quantification of myocardial blood flow.

    Science.gov (United States)

    Yoshinaga, Keiichiro; Manabe, Osamu; Tamaki, Nagara

    2016-07-21

    With the increasing availability of positron emission tomography (PET) myocardial perfusion imaging, the absolute quantification of myocardial blood flow (MBF) has become popular in clinical settings. Quantitative MBF provides an important additional diagnostic or prognostic information over conventional visual assessment. The success of MBF quantification using PET/computed tomography (CT) has increased the demand for this quantitative diagnostic approach to be more accessible. In this regard, MBF quantification approaches have been developed using several other diagnostic imaging modalities including single-photon emission computed tomography, CT, and cardiac magnetic resonance. This review will address the clinical aspects of PET MBF quantification and the new approaches to MBF quantification.

  20. RESONANCE SELF-SHIELDING EFFECT IN UNCERTAINTY QUANTIFICATION OF FISSION REACTOR NEUTRONICS PARAMETERS

    Directory of Open Access Journals (Sweden)

    GO CHIBA

    2014-06-01

    Full Text Available In order to properly quantify fission reactor neutronics parameter uncertainties, we have to use covariance data and sensitivity profiles consistently. In the present paper, we establish two consistent methodologies for uncertainty quantification: a self-shielded cross section-based consistent methodology and an infinitely-diluted cross section-based consistent methodology. With these methodologies and the covariance data of uranium-238 nuclear data given in JENDL-3.3, we quantify uncertainties of infinite neutron multiplication factors of light water reactor and fast reactor fuel cells. While an inconsistent methodology gives results which depend on the energy group structure of neutron flux and neutron-nuclide reaction cross section representation, both the consistent methodologies give fair results with no such dependences.
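
    The quantification itself rests on the standard "sandwich rule", var(k) = S^T M S, which combines a sensitivity profile S with a relative covariance matrix M of the nuclear data. A minimal sketch with invented three-group numbers (not the JENDL-3.3 data):

        import numpy as np

        def sandwich_uncertainty(S, M):
            # Relative standard deviation of a parameter (e.g. k-inf) from a
            # sensitivity profile S and relative covariance matrix M: S^T M S.
            S, M = np.asarray(S, float), np.asarray(M, float)
            return float(np.sqrt(S @ M @ S))

        S = np.array([-0.10, -0.25, -0.05])     # dk/k per dsigma/sigma, 3 groups
        M = np.array([[4e-4, 1e-4, 0.0 ],
                      [1e-4, 9e-4, 2e-4],
                      [0.0,  2e-4, 1e-4]])      # relative covariance (illustrative)
        print(sandwich_uncertainty(S, M))       # ~0.0084, i.e. ~0.84% on k-inf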

  3. The methodological defense of realism scrutinized.

    Science.gov (United States)

    Wray, K Brad

    2015-12-01

    I revisit an older defense of scientific realism, the methodological defense, a defense developed by both Popper and Feyerabend. The methodological defense of realism concerns the attitude of scientists, not philosophers of science. The methodological defense is as follows: a commitment to realism leads scientists to pursue the truth, which in turn is apt to put them in a better position to get at the truth. In contrast, anti-realists lack the tenacity required to develop a theory to its fullest. As a consequence, they are less likely to get at the truth. My aim is to show that the methodological defense is flawed. I argue that a commitment to realism does not always benefit science, and that there is reason to believe that a research community with both realists and anti-realists in it may be better suited to advancing science. A case study of the Copernican Revolution in astronomy supports this claim.

  4. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  5. Scenario development methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Eng, T. [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hudson, J. [Rock Engineering Consultants, Welwyn Garden City, Herts (United Kingdom); Stephansson, O. [Royal Inst. of Tech., Stockholm (Sweden). Div. of Engineering Geology; Skagius, K.; Wiborgh, M. [Kemakta, Stockholm (Sweden)

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are (a) Event tree analysis, (b) Influence diagrams and (c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs.

  6. LANGUAGE POLICY AND METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Antony J. Liddicoat

    2004-06-01

    Full Text Available The implementation of a language policy is crucially associated with questions of methodology. This paper explores approaches to language policy, approaches to methodology and the impact that these have on language teaching practice. Language policies can influence decisions about teaching methodologies either directly, by making explicit recommendations about the methods to be used in classroom practice, or indirectly, through the conceptualisation of language learning which underlies the policy. It can be argued that all language policies have the potential to influence teaching methodologies indirectly, and that those policies which make explicit recommendations about methodology are actually functioning on two levels. This allows for the possibility of conflict between the direct and indirect dimensions of the policy, resulting from an inconsistency between the explicitly recommended methodology and the underlying conceptualisation of language teaching and learning which informs the policy.

  7. Open verification methodology cookbook

    CERN Document Server

    Glasser, Mark

    2009-01-01

    Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novice…

  8. Comparison of biochemical and microscopic methods for quantification of mycorrhizal fungi in soil and roots

    Science.gov (United States)

    Arbuscular mycorrhizal fungi (AMF) are well-known plant symbionts which provide enhanced phosphorus uptake as well as other benefits to their host plants. Quantification of mycorrhizal biomass and root colonization has traditionally been performed by root staining and microscopic examination methods...

  9. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
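
    For the discretization-error component, Richardson extrapolation over a sequence of systematically refined grids is one standard estimator (our choice for illustration; the report's own estimator may differ). A sketch with invented grid results:

        import math

        def observed_order(f_coarse, f_medium, f_fine, r):
            # Observed order of accuracy from three grids with a constant
            # refinement ratio r: p = ln((f3 - f2) / (f2 - f1)) / ln(r).
            return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

        def discretization_error(f_medium, f_fine, r, p):
            # Richardson estimate of the remaining error in the fine-grid value.
            return (f_fine - f_medium) / (r ** p - 1.0)

        f_fine, f_med, f_coarse = 0.9713, 0.9700, 0.9650    # invented results
        p = observed_order(f_coarse, f_med, f_fine, r=2.0)
        print(p)                                            # ~1.94, near 2nd order
        print(discretization_error(f_med, f_fine, 2.0, p))  # ~4.6e-4 error estimate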

  10. Precise Quantification of Nanoparticle Internalization

    OpenAIRE

    Gottstein, Claudia; Wu, Guohui; Wong, Benjamin J.; Zasadzinski, Joseph Anthony

    2013-01-01

    Nanoparticles have opened new exciting avenues for both diagnostic and therapeutic applications in human disease, and targeted nanoparticles are increasingly used as specific drug delivery vehicles. The precise quantification of nanoparticle internalization is of importance to measure the impact of physical and chemical properties on the uptake of nanoparticles into target cells or into cells responsible for rapid clearance. Internalization of nanoparticles has been measured...

  11. Ensuring VGI Credibility in Urban-Community Data Generation: A Methodological Research Design

    Directory of Open Access Journals (Sweden)

    Jamie O'Brien

    2016-06-01

    Full Text Available In this paper we outline the methodological development of current research into urban community formations based on combinations of qualitative (volunteered) and quantitative (spatial analytical and geo-statistical) data. We outline a research design that addresses problems of data quality relating to credibility in volunteered geographic information (VGI) intended for Web-enabled participatory planning. Here we have drawn on a dual notion of credibility in VGI data, and propose a methodological workflow to address its criteria. We propose a 'super-positional' model of urban community formations, and report on the combination of quantitative and participatory methods employed to underpin its integration. The objective of this methodological phase of the study is to enhance confidence in the quality of data for Web-enabled participatory planning. Our participatory method has been supported by rigorous quantification of area characteristics, including the participant communities' demographic and socio-economic contexts. This participatory method provided participants with a ready and accessible format for observing and mark-making, which allowed the investigators to rapidly iterate a system design based on participants' responses to the workshop tasks. Participatory workshops have involved secondary school-age children in socio-economically contrasting areas of Liverpool (Merseyside, UK), which offers a test-bed for comparing communities' formations in comparative contexts, while bringing an under-represented section of the population into a planning domain whose experience may stem from public and non-motorised transport modalities. Data have been gathered through one-day participatory workshops featuring questionnaire surveys, local site analysis, perception mapping and brief textual descriptions. This innovative approach will support Web-based participation among stakeholding planners, who may benefit from well-structured, community…

  12. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    Science.gov (United States)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into territorial management in a practical and effective way. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in all of this process, but still few researchers are investing in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims at the development of an effective methodology for the assessment of as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. It is expected to attain the proper procedures in order to assess geodiversity…

  13. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly been adopting a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies could work for structural, procedural, or object-oriented applications, but fail to capture…

  14. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    …on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable for gaining a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss…

  15. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  16. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  18. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  1. Rapid Dialogue Prototyping Methodology

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Sojka, P.; Rajman, M.; Kopecek, I.; Melichar, M.; Pala, K.

    2004-01-01

    This paper is about the automated production of dialogue models. The goal is to propose and validate a methodology that allows the production of finalized dialogue models (i.e. dialogue models specific to given applications) in a few hours. The solution we propose for such a methodology, called the…

  2. Accurate proteome-wide protein quantification from high-resolution 15N mass spectra.

    Science.gov (United States)

    Khan, Zia; Amini, Sasan; Bloom, Joshua S; Ruse, Cristian; Caudy, Amy A; Kruglyak, Leonid; Singh, Mona; Perlman, David H; Tavazoie, Saeed

    2011-12-19

    In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods.

  3. The Service Learning Projects: Stakeholder Benefits and Potential Class Topics

    Science.gov (United States)

    Rutti, Raina M.; LaBonte, Joanne; Helms, Marilyn Michelle; Hervani, Aref Agahei; Sarkarat, Sy

    2016-01-01

    Purpose: The purpose of this paper is to summarize the benefits of including a service learning project in college classes and focusses on benefits to all stakeholders, including students, community, and faculty. Design/methodology/approach: Using a snowball approach in academic databases as well as a nominal group technique to poll faculty, key…

  4. Requirements and benefits of flow forecasting for improving hydropower generation

    NARCIS (Netherlands)

    Dong, X.; Dohmen-Janssen, C.M.; Booij, M.J.; Hulscher, S.J.M.H.

    2005-01-01

    This paper presents a methodology to identify the required lead time and accuracy of flow forecasting for improving hydropower generation of a reservoir, by simulating the benefits (in terms of electricity generated) obtained from the forecasting with varying lead times and accuracies. The benefit-l

  5. RNA quantification using gold nanoprobes - application to cancer diagnostics

    Directory of Open Access Journals (Sweden)

    Baptista Pedro V

    2010-02-01

    Full Text Available Molecular nanodiagnostics applied to cancer may provide rapid and sensitive detection of cancer-related molecular alterations, which would enable early detection even when those alterations occur only in a small percentage of cells. The use of gold nanoparticles derivatized with thiol-modified oligonucleotides (Au-nanoprobes) for the detection of specific nucleic acid targets has been gaining momentum as an alternative to more traditional methodologies. Here, we present an Au-nanoparticle-based approach for the molecular recognition and quantification of the BCR-ABL fusion transcript (mRNA), which is responsible for chronic myeloid leukemia (CML); to the best of our knowledge, this is the first time quantification of a specific mRNA directly in cancer cells is reported. This inexpensive and very easy to perform Au-nanoprobe-based method allows quantification of unamplified total human RNA and specific detection of the oncogene transcript. The sensitivity afforded by the Au-nanoprobes allows differential gene expression from 10 ng/μl of total RNA and takes less than 30 min to complete after total RNA extraction, minimizing RNA degradation. Also, at later stages, accumulation of malignant mutations may lead to resistance to chemotherapy and consequently poor outcome. Such a method, allowing fast and direct detection and quantification of the chimeric BCR-ABL mRNA, could speed up diagnostics and, if appropriate, revision of therapy. This assay may constitute a promising tool in early diagnosis of CML and could easily be extended to further target genes with proven involvement in cancer development.

  6. Salary or Benefits?

    OpenAIRE

    Oyer, Paul

    2004-01-01

    Employer-provided benefits are a large and growing share of compensation costs. In this paper, I consider three factors that can affect the value created by employer-sponsored benefits. First, firms have a comparative advantage (for example, due to scale economies or tax treatment) in purchasing relative to employees. This advantage can vary across firms based on size and other differences in cost structure. Second, employees differ in their valuations of benefits and it is costly for workers...

  7. Transit Benefit Program Data -

    Data.gov (United States)

    Department of Transportation — This data set contains information about any US government agency participating in the transit benefits program, funding agreements, individual participating Federal...

  8. Neural Networks Methodology and Applications

    CERN Document Server

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data, in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically oriented…

  9. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities are emerging for improved PSA by virtue of the dramatic developments in digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take advantage of these technological developments is dynamic PSA, in which conventional ET/FT models can capture time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, it seems to attract less interest from industrial and regulatory viewpoints. The authors expect this work can contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts; among them, DDET seems to be a backbone for most methodologies, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering…

  10. Unemployment Benefit Exhaustion

    DEFF Research Database (Denmark)

    Filges, Trine; Pico Geerdsen, Lars; Knudsen, Anne-Sofie Due

    2015-01-01

    This systematic review studied the impact of exhaustion of unemployment benefits on the exit rate out of unemployment and into employment prior to benefit exhaustion or shortly thereafter. Method: We followed Campbell Collaboration guidelines to prepare this review, and ultimately located 12...

  11. Hospital benefit segmentation.

    Science.gov (United States)

    Finn, D W; Lamb, C W

    1986-12-01

    Market segmentation is an important topic to both health care practitioners and researchers. The authors explore the relative importance that health care consumers attach to various benefits available in a major metropolitan area hospital. The purposes of the study are to test, and provide data to illustrate, the efficacy of one approach to hospital benefit segmentation analysis.

  12. Wellbeing or welfare benefits

    DEFF Research Database (Denmark)

    Handlos, Line Neerup; Kristiansen, Maria; Nørredam, Marie Louise

    2016-01-01

    This debate article debunks the myth that migrants are driven primarily by the size of welfare benefits in the host country when deciding where to migrate. We show that instead of welfare benefits, migrants are driven by a desire for safety, wellbeing, social networks and opportunities...

  13. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    ABSTRACT Design Methodology shall be seen as our understanding of how to design; it is an early (emerging late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...... Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishing and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching...... the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers....

  14. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations...... are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest....

  15. Orthographic Structuring of Human Speech and Texts Linguistic Application of Recurrence Quantification Analysis

    CERN Document Server

    Orsucci, F; Giuliani, A; Webber, C L; Zbilut, J P

    1997-01-01

    A methodology based upon recurrence quantification analysis is proposed for the study of orthographic structure of written texts. Five different orthographic data sets (20th century Italian poems, 20th century American poems, contemporary Swedish poems with their corresponding Italian translations, Italian speech samples, and American speech samples) were subjected to recurrence quantification analysis, a procedure which has been found to be diagnostically useful in the quantitative assessment of ordered series in fields such as physics, molecular dynamics, physiology, and general signal processing. Recurrence quantification was developed from recurrence plots as applied to the analysis of nonlinear, complex systems in the physical sciences, and is based on the computation of a distance matrix of the elements of an ordered series (in this case the letters constituting selected speech and poetic texts). From a strictly mathematical view, the results show the possibility of demonstrating invariance between diffe...
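
    As an illustration of the core computation, the sketch below builds the letter-by-letter recurrence matrix of a short text and derives two standard RQA measures, recurrence rate and determinism. It is a toy stand-in for the full procedure; the text and all choices are arbitrary.

        # Recurrence-rate sketch for an orthographic series: build the distance
        # (here, identity) matrix over the letters of a text and count recurrences.

        import numpy as np

        text = "the quick brown fox jumps over the lazy dog"
        seq = [c for c in text if c.isalpha()]
        n = len(seq)

        # Recurrence matrix: R[i, j] = 1 when the same letter recurs.
        R = np.array([[seq[i] == seq[j] for j in range(n)] for i in range(n)], dtype=int)

        # Recurrence rate (%REC): fraction of recurrent points off the main diagonal.
        rec_rate = (R.sum() - n) / (n * n - n)
        print(f"recurrence rate: {rec_rate:.3f}")

        # Determinism (%DET): share of recurrent points lying on diagonal lines
        # of length >= 2, a standard RQA measure of ordered structure.
        diag_points = 0
        for k in range(1, n):            # upper diagonals only (matrix is symmetric)
            run = 0
            for v in list(np.diagonal(R, offset=k)) + [0]:
                if v:
                    run += 1
                else:
                    if run >= 2:
                        diag_points += run
                    run = 0
        det = 2 * diag_points / max(R.sum() - n, 1)
        print(f"determinism: {det:.3f}")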

  16. "Absolute" quantification in magnetic resonance spectroscopy: validation of a clinical protocol in multiple sclerosis.

    Science.gov (United States)

    Bagory, Matthieu; Durand-Dubief, Françoise; Ibarrola, Danielle; Confavreux, Christian; Sappey-Marinier, Dominique

    2007-01-01

    MRS allows the measurement of cerebral metabolites, thus helping to characterize brain disease for diagnosis and follow-up. Metabolite quantification is usually based on metabolite ratios referenced to creatine. While the creatine concentration is assumed to be constant, it may vary in pathological processes. Therefore, an "absolute" concentration methodology is needed. The aim of this study is to validate a clinical "absolute" quantification protocol through the development of an external metabolic phantom, calibration and correction procedures, and the investigation of reproducibility issues. When phantom stability was investigated in short-term and long-term reproducibility studies, both Standard Deviations (SD) were in agreement with literature values. This "absolute" quantification method was applied to patients with Multiple Sclerosis (MS). The results show a significant decrease in both N-Acetyl Aspartate (NAA) and choline concentrations.
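
    The calibration step of such a protocol can be summarized as referencing the in-vivo signal to the phantom signal of known concentration. The sketch below shows this ratio with hypothetical relaxation and coil-loading correction factors; the study's actual correction chain is not detailed in the abstract, so every number here is an illustrative assumption.

        # Sketch of external-phantom "absolute" quantification: the in-vivo signal
        # is referenced to a phantom of known concentration, with hypothetical
        # correction factors for relaxation and coil loading.

        def absolute_concentration(s_vivo, s_phantom, c_phantom, f_relax=1.0, f_load=1.0):
            """Metabolite concentration in mM, referenced to the calibration phantom."""
            return (s_vivo / s_phantom) * c_phantom * f_relax * f_load

        naa = absolute_concentration(s_vivo=820.0, s_phantom=1000.0,
                                     c_phantom=12.5,   # known phantom concentration (mM)
                                     f_relax=1.08,     # T1/T2 relaxation correction (assumed)
                                     f_load=0.97)      # coil-loading correction (assumed)
        print(f"NAA ~ {naa:.1f} mM")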

  17. Methodologies of health impact assessment as part of an integrated approach to reduce effects of air pollution

    Energy Technology Data Exchange (ETDEWEB)

    Aunan, K.; Seip, H.M.

    1995-12-01

    Quantification of average frequencies of health effects on a population level is an essential part of an integrated assessment of pollution effects. Epidemiological studies seem to provide the best basis for such estimates. This paper gives an introduction to a methodology for health impact assessment and also the results from selected parts of a case study in Hungary. This case study is aimed at testing and improving the methodology for integrated assessment and focuses on energy production and consumption and the implications for air pollution. Using monitoring data from Budapest, the paper gives estimates of excess frequencies of respiratory illness, mortality and other health end-points. For a number of health end-points, particles may serve as a good indicator component. Stochastic simulation is used to illustrate the uncertainties embedded in the exposure-response functions applied. The paper uses the "bottom-up approach" to find cost-effective abatement strategies against pollution damages, where specific abatement measures such as emission standards for vehicles are explored in detail. It is concluded that in spite of large uncertainties in every step of the analysis, an integrated assessment of costs and benefits of different abatement measures is valuable, as it clarifies the main objectives of an abatement policy and explicitly describes the adverse impacts of different activities and their relative importance. 46 refs., 11 figs., 2 tabs.
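
    A minimal sketch of the stochastic-simulation step follows: the uncertain exposure-response slope is sampled and propagated to excess cases, yielding an uncertainty interval rather than a point estimate. All input values are illustrative assumptions, not the Hungarian case-study data.

        # Stochastic simulation of an exposure-response calculation:
        # excess cases = slope x exposure increment x baseline rate x population.

        import numpy as np

        rng = np.random.default_rng(0)
        population = 2_000_000                 # exposed urban population (assumed)
        delta_pm = 15.0                        # annual mean particle increment (ug/m3)
        baseline_rate = 0.008                  # baseline annual mortality rate (assumed)

        # Uncertain relative-risk slope per ug/m3 (lognormal around an assumed mean)
        slope = rng.lognormal(mean=np.log(0.0006), sigma=0.4, size=10_000)

        excess_deaths = slope * delta_pm * baseline_rate * population
        lo, med, hi = np.percentile(excess_deaths, [5, 50, 95])
        print(f"excess deaths/yr: median {med:.0f} (90% interval {lo:.0f}-{hi:.0f})")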

  18. Methodological pitfalls of the Unconscious Thought paradigm

    Directory of Open Access Journals (Sweden)

    David Marchiori

    2009-12-01

    Full Text Available According to Unconscious Thought Theory (UTT: Dijksterhuis and Nordgren, 2006), complex decisions are best made after a period of distraction assumed to elicit "unconscious thought". Over three studies, respectively offering a conceptual, an identical and a methodologically improved replication of Dijksterhuis et al. (2006), we reassessed UTT's predictions and dissected the decision task used to demonstrate these predictions. We failed to find any evidence for the benefits of unconscious decision-making. By contrast, we found some evidence that conscious deliberation can lead to better decisions. Further, we identified methodological weaknesses in the UTT decision task: (a) attribute weighting was neglected although attributes were seen as differing in importance; (b) the material was not properly counterbalanced; and (c) there was some confusion in the experimental instructions. We propose methodological improvements that address these concerns.

  19. Issues connected with indirect cost quantification: a focus on the transportation system

    Science.gov (United States)

    Křivánková, Zuzana; Bíl, Michal; Kubeček, Jan; Vodák, Rostislav

    2017-04-01

    Transportation and communication networks in general are vital parts of modern society. The economy relies heavily on transportation system performance: a number of people commute to work regularly, and stockpiles in many companies are being reduced as just-in-time production depends on the transportation network to supply resources on time. Natural hazards have the potential to disturb transportation systems. Earthquakes, flooding or landsliding are examples of high-energy processes capable of causing direct losses (i.e. physical damage to the infrastructure). We have focused on the quantification of the indirect costs of natural hazards, which are not easy to estimate. Indirect losses can also emerge as a result of low-energy meteorological hazards which only seldom cause direct losses, e.g. glaze ice or snowfall. Whereas evidence of repair work and general direct costs usually exists or can be estimated, indirect costs are much more difficult to identify, particularly when they are not covered by insurance agencies. Designating alternative routes (detours) is the most frequent response to blocked road links. Indirect costs can then be related to increased fuel consumption and additional operating costs. Detours usually result in prolonged travel times, so indirect cost quantification also has to cover the value of time; the costs of delay are, however, a nonlinear function of travel time. The existence of an alternative transportation pattern may also result in an increased number of traffic crashes. This topic has not been studied in depth, but an increase in traffic crashes has been reported when people suddenly changed their traffic modes, e.g. when air traffic was not possible. The lost user benefit from trips that were cancelled or suppressed is also difficult to quantify. Several approaches, based on post-event questionnaire surveys, have been applied to communities and companies affected by transportation accessibility
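
    A hedged sketch of such an indirect-cost calculation is given below: extra distance drives operating costs linearly, while delay cost grows nonlinearly with extra travel time, as argued above. The unit costs, the exponent and the traffic volume are assumptions for illustration only.

        # Illustrative indirect-cost estimate for a detour around a blocked link.
        # All parameter values are assumptions.

        def detour_cost(extra_km, extra_min, vehicles_per_day, days,
                        op_cost_per_km=0.35, value_of_time_per_min=0.25, delay_exp=1.3):
            veh = vehicles_per_day * days
            operating = veh * extra_km * op_cost_per_km
            # Nonlinear in time: long delays are valued more than proportionally.
            delay = veh * value_of_time_per_min * extra_min ** delay_exp
            return operating + delay

        print(f"EUR {detour_cost(extra_km=12, extra_min=18, vehicles_per_day=4000, days=30):,.0f}")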

  20. Quantification and Propagation of Nuclear Data Uncertainties

    Science.gov (United States)

    Rising, Michael E.

    The use of several uncertainty quantification and propagation methodologies is investigated in the context of the prompt fission neutron spectrum (PFNS) uncertainties and their impact on critical reactor assemblies. First, the first-order linear Kalman filter is used as a nuclear data evaluation and uncertainty quantification tool, combining available PFNS experimental data and a modified version of the Los Alamos (LA) model. The experimental covariance matrices, not generally given in the EXFOR database, are computed using the GMA methodology used by the IAEA, to establish more appropriate correlations within each experiment. Then, using systematics relating the LA model parameters across a suite of isotopes, the PFNS for both the uranium and plutonium actinides are evaluated, leading to a new evaluation including cross-isotope correlations. Next, an alternative evaluation approach, the unified Monte Carlo (UMC) method, is studied for the evaluation of the PFNS for the n(0.5 MeV)+Pu-239 fission reaction and compared to the Kalman filter. The UMC approach to nuclear data evaluation is implemented in a variety of ways to test convergence toward the Kalman filter results and to determine the nonlinearities present in the LA model. Ultimately, the UMC approach is shown to be comparable to the Kalman filter for a realistic data evaluation of the PFNS and is capable of capturing the nonlinearities present in the LA model. Next, the impact that the PFNS uncertainties have on important critical assemblies is investigated. Using the PFNS covariance matrices in the ENDF/B-VII.1 nuclear data library, the uncertainties of the effective multiplication factor, leakage, and spectral indices of the Lady Godiva and Jezebel critical assemblies are quantified. Principal component analysis of the PFNS covariance matrices shows that only 2-3 principal components are needed to retain the PFNS uncertainties. Then, using the polynomial chaos expansion (PCE) on the uncertain output
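
    The dimensionality-reduction step can be sketched as follows: eigendecompose the covariance matrix, keep the leading components, and sample perturbations in the reduced space. The covariance below is synthetic, standing in for the actual ENDF/B-VII.1 PFNS matrices.

        # PCA on a PFNS-like covariance matrix: keep the few principal
        # components that carry almost all of the variance.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 40                                   # energy groups (illustrative)
        x = np.linspace(0, 1, n)
        # Smooth synthetic covariance with strong inter-group correlation
        cov = 0.01 * np.exp(-(x[:, None] - x[None, :])**2 / 0.1)

        eigval, eigvec = np.linalg.eigh(cov)
        eigval, eigvec = eigval[::-1], eigvec[:, ::-1]     # descending order
        frac = np.cumsum(eigval) / eigval.sum()
        k = int(np.searchsorted(frac, 0.99) + 1)
        print(f"{k} principal components retain {frac[k-1]:.1%} of the variance")

        # Sample perturbed spectra from the reduced representation
        samples = eigvec[:, :k] @ (np.sqrt(eigval[:k])[:, None] * rng.standard_normal((k, 5)))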

  1. Detection and Quantification of Neurotransmitters in Dialysates

    OpenAIRE

    Zapata, Agustin; Chefer, Vladimir I.; Shippenberg, Toni S.; Denoroy, Luc

    2009-01-01

    Sensitive analytical methods are needed for the separation and quantification of neurotransmitters obtained in microdialysate studies. This unit describes methods that permit quantification of nanomolar concentrations of monoamines and their metabolites (high-pressure liquid chromatography with electrochemical detection), acetylcholine (HPLC coupled to an enzyme reactor), and amino acids (HPLC with fluorescence detection; capillary electrophoresis with laser-induced fluorescence detection).

  2. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes;

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than ...

  3. Simpler methods do it better: Success of Recurrence Quantification Analysis as a general purpose data analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Webber, Charles L., E-mail: cwebber@lumc.ed [Department of Cell and Molecular Physiology, Loyola University Medical Center, Maywood, IL (United States); Marwan, Norbert, E-mail: marwan@pik-potsdam.d [Potsdam Institute for Climate Impact Research (PIK), 14412 Potsdam (Germany); Facchini, Angelo, E-mail: a.facchini@unisi.i [Center the Study of Complex Systmes and Department of Information Enginering, University of Siena, 53100 Siena (Italy); Giuliani, Alessandro, E-mail: alessandro.giuliani@iss.i [Environment and Health Department, Istituto Superiore di Sanita, Roma (Italy)

    2009-10-05

    Over the last decade, Recurrence Quantification Analysis (RQA) has become a new standard tool in the toolbox of nonlinear methodologies. In this Letter we trace the history and utility of this powerful tool and cite some common applications. RQA continues to wend its way into numerous and diverse fields of study.

  4. Unemployment Benefit Exhaustion

    DEFF Research Database (Denmark)

    Filges, Trine; Pico Geerdsen, Lars; Knudsen, Anne-Sofie Due

    2015-01-01

    studies for final analysis and interpretation. Twelve studies could be included in the data synthesis. Results: We found clear evidence that the prospect of exhaustion of benefits results in a significantly increased incentive for finding work. Discussion: The theoretical suggestion that the prospect......This systematic review studied the impact of exhaustion of unemployment benefits on the exit rate out of unemployment and into employment prior to benefit exhaustion or shortly thereafter. Method: We followed Campbell Collaboration guidelines to prepare this review, and ultimately located 12...... of exhaustion of benefits results in an increased incentive for finding work has been confirmed empirically by measures from seven different European countries, the United States, and Canada. The results are robust in the sense that sensitivity analyses evidenced no appreciable changes in the results. We found...

  5. Benefits of Java

    Science.gov (United States)


  6. Benefits of CHP Partnership

    Science.gov (United States)

    Learn about the benefits of being an EPA CHP Partner, which include expert advice and answers to questions, CHP news, marketing resources, publicity and recognition, and being associated with EPA through a demonstrated commitment to CHP.

  7. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
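
    The core combination step of the methodology, weighting each location's shielding by the fraction of people there, reduces to a population-weighted average, as in the sketch below. The building categories, occupancy fractions and transmission factors are hypothetical placeholders, not values from the report.

        # Toy version of the combination step: a region-wide protection factor
        # is the population-weighted mix of where people are and how much
        # shielding each location provides.

        # (population fraction, transmission factor) per location; transmission
        # factor = fraction of the outdoor gamma dose received inside.
        locations = {
            "outdoors":            (0.05, 1.00),
            "wood-frame house":    (0.55, 0.40),
            "apartment, mid-rise": (0.30, 0.10),
            "basement":            (0.10, 0.05),
        }

        avg_transmission = sum(f * t for f, t in locations.values())
        print(f"population-averaged transmission: {avg_transmission:.3f}")
        print(f"equivalent protection factor:     {1/avg_transmission:.1f}")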

  8. Analysis of the point-counting and planimetric methods in the quantification of the biofilm of dentures: a study of methodological validation

    Directory of Open Access Journals (Sweden)

    Roseana Aparecida Gomes FERNANDES

    2002-03-01

    Full Text Available Two methods of quantification of denture biofilm (point-counting and planimetric) were tested and compared with the paper-weighing method and with the Prosthesis Hygiene Index. The internal surfaces of 62 complete dentures were stained and photographed, and the total area and the area covered with biofilm were projected on paper and contoured with pencil. The point-counting method (experimental 1) was carried out on a mesh of equidistant points. For the planimetric method (experimental 2), the areas of interest were measured by means of a digital planimeter. In the paper-weighing method (control 1), the areas of interest were cut out and weighed on a precision scale. In the determination of the Prosthesis Hygiene Index (control 2), the accumulation of biofilm was estimated by means of assigned scores. The results showed a percentage of agreement between the experimental methods and control 1 of 82% (point-counting) and 95% (planimetric), as well as a high degree of correlation (r = 0.98; r = 0.99) between the obtained values. When compared with control 2, there was agreement in 55% (point-counting) and 37% (planimetric) of the cases. The experimental methods may be useful in clinical studies to evaluate the efficacy of hygiene agents.
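
    In digital form, the point-counting method amounts to sampling the stained image on a regular grid and taking the hit fraction as the area estimate. The sketch below does this on a synthetic binary image; the grid spacing and coverage are arbitrary choices.

        # Point-counting sketch: overlay a regular grid on a binary image of the
        # denture surface and estimate the biofilm fraction from the hit count.

        import numpy as np

        rng = np.random.default_rng(2)
        image = rng.random((600, 800)) < 0.3      # True = biofilm pixel (synthetic)

        step = 20                                  # grid spacing in pixels
        grid = image[::step, ::step]               # one sample point per grid cell
        estimate = grid.mean()
        truth = image.mean()
        print(f"point-count estimate: {estimate:.3f}  (pixel truth: {truth:.3f})")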

  9. Water Footprint Symposium: where next for water footprint and water assessment methodology?

    NARCIS (Netherlands)

    Tillotson, M.R.; Kiu, J.; Guan, D.; Wu, P.; Zhao, Xu; Zhang, G.P.; Pfister, S.; Pahlow, M.

    2014-01-01

    Recognizing the need for a comprehensive review of the tools and metrics for the quantification and assessment of water footprints, and allowing for the opportunity for open discussion on the challenges and future of water footprinting methodology, an international symposium on water footprint was o

  10. Water Footprint Symposium: where next for water footprint and water assessment methodology?

    NARCIS (Netherlands)

    Tillotson, M.R.; Kiu, J.; Guan, D.; Wu, P.; Zhao, Xu; Zhang, Guoping; Pfister, S.; Pahlow, Markus

    2014-01-01

    Recognizing the need for a comprehensive review of the tools and metrics for the quantification and assessment of water footprints, and allowing for the opportunity for open discussion on the challenges and future of water footprinting methodology, an international symposium on water footprint was

  11. Benefits for handicapped children

    CERN Multimedia

    2003-01-01

    The introduction of long-term care benefits within the CERN Health Insurance Scheme requires the coordination of the benefits foreseen for handicapped children. Measures were adopted by the Management following the recommendation made by the Standing Concertation Committee on 26 March 2003. A document clarifying these measures is available on the Web at the following address: http://humanresources.web.cern.ch/humanresources/external/soc/Social_affairs/social_affairs.asp Social Affairs Service 74201

  12. Socio-economic research on fusion. SERF 1997-98. Macro Task E2: External costs and benefits. Task 2: Comparison of external costs

    Energy Technology Data Exchange (ETDEWEB)

    Schleisner, Lotte; Korhonen, Riitta

    1998-12-01

    This report is part of the SERF (Socio-Economic Research on Fusion) project, Macro Task E2, which covers External Costs and Benefits. The report is the documentation of Task 2, Comparison of External Costs. The aim of Task 2 has been to compare the external costs of fusion energy with those of other alternative energy generation technologies. In this task, identification and quantification of the external costs of wind energy and photovoltaics were performed by Risoe, while identification and quantification of the external costs of nuclear fission and fossil fuels were discussed by VTT. The methodology used for the assessment of the externalities of the selected fuel cycles is the one developed within the ExternE Project. First estimates of the externalities of fusion energy have been under examination in Macro Task E2. Externalities of fossil fuels and nuclear fission have already been evaluated in the ExternE project, and a vast amount of material for different sites in various countries is available; this material is used in the comparison. The renewables, wind energy and photovoltaics, are assessed separately. External costs of the various alternatives may change as new technologies are developed, and costs can to a large extent be avoided (e.g. acidifying impacts, but also global warming due to carbon dioxide emissions). Fusion technology can also experience major progress, and some important cost components can probably be avoided as early as 2050. (EG) 36 refs.

  13. Attitude Formation of Benefits Satisfaction: Knowledge and Fit of Benefits

    OpenAIRE

    Gery Markova, Foard Jones

    2011-01-01

    Using the theoretical framework of the Theory of Reasoned Action [6], we examine benefits satisfaction as an attitude formed by the beliefs about benefits (i.e., benefits knowledge) and the perceived value of these benefits (i.e., fit of benefits to individual needs). We use questionnaires to gather data from a random sample of 591 employees in a large county agency in the South-eastern United States. The data support that knowledge of benefits is associated with enhanced benefits satisfaction an...

  14. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    Science.gov (United States)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center of Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlation with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided unprecedented sensitivity toward GBNs and allowed conventional toxicology research to be enhanced by providing a direct correlation between uptake of GBNs at a single-cell level and cell viability status.

  15. Assessing the carbon benefit of saltmarsh restoration

    Science.gov (United States)

    Taylor, Benjamin; Paterson, David; Hanley, Nicholas

    2016-04-01

    The quantification of carbon sequestration rates in coastal ecosystems is required to better realise their potential role in climate change mitigation. Through accurate valuation this service can be fully appreciated, and this may help facilitate efforts to restore vulnerable ecosystems such as saltmarshes. Vegetated coastal ecosystems are suggested to account for approximately 50% of oceanic sedimentary carbon despite their 2% areal extent. Saltmarshes, conservatively estimated to store 430 ± 30 Tg C in surface sediment deposits, have experienced extensive decline in the recent past through processes such as land use change and coastal squeeze. Saltmarsh habitats offer a range of services that benefit society and the natural world, making their conservation meaningful and beneficial. The associated costs of restoration projects could, in part, be subsidised through payment for ecosystem services, specifically Blue carbon. Additional storage is generated through the (re)vegetation of mudflat areas, leading to an altered ecosystem state and function and providing benefits similar to those of natural saltmarsh areas. The Eden Estuary, Fife, Scotland has been a site of saltmarsh restoration since 2000, providing a temporal and spatial scale over which to evaluate these additional benefits. The study is being conducted to quantify the carbon benefit of restoration efforts and provide an insight into the evolution of this benefit through sites of different ages. Seasonal sediment deposition and settlement rates are measured across the estuary in mudflat, young planted saltmarsh, old planted saltmarsh and extant high marsh areas, with carbon values derived from loss-on-ignition organic content. Samples are taken across a tidal cycle on a seasonal basis, providing data on the effects of tidal influence, vegetation condition and climatic factors on sedimentation and carbon sequestration rates. These data will inform on the annual characteristics of sedimentary processes in the estuary and be

  16. Guidebook in using Cost Benefit Analysis and strategic environmental assessment for environmental planning in China

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Environmental planning in China may benefit from greater use of Cost Benefit Analysis (CBA) and Strategic Environmental Assessment (SEA) methodologies. We provide guidance on using these methodologies. Parts I and II present the principles behind the methodologies as well as their theoretical structure. Part III demonstrates the methodologies in action in a range of good-practice examples. The case studies and theoretical expositions are intended to teach by way of example as well as by understanding of the principles, and to help planners use the methodologies as correctly as possible. (auth)

  17. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal-oriented decomposition of the plant purpose into the means of achieving that purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follows directly....... The functional HAZOP methodology lends itself directly to implementation in a computer-aided reasoning tool to perform root cause and consequence analysis. Such a tool can facilitate finding causes and/or consequences far away from the site of the deviation. A functional HAZOP assistant is proposed...... and investigated in a HAZOP study of an industrial scale Indirect Vapor Recompression Distillation pilot Plant (IVaRDiP) at DTU-Chemical and Biochemical Engineering. The study shows that the functional HAZOP methodology provides a very efficient paradigm for facilitating HAZOP studies and for enabling reasoning...

  18. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching, writing speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher: as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l

  19. Methodology for research I.

    Science.gov (United States)

    Garg, Rakesh

    2016-09-01

    The conduct of research requires a systematic approach involving diligent planning and its execution as planned. It comprises various essential predefined components such as aims, population, conduct/technique, outcome and statistical considerations. These need to be objective, reliable and in a repeatable format. Hence, the understanding of the basic aspects of methodology is essential for any researcher. This is a narrative review and focuses on various aspects of the methodology for conduct of a clinical research. The relevant keywords were used for literature search from various databases and from bibliographies of the articles.

  20. Financial methodology for Brazilian market of small producers of oil and natural gas, based on Canadian and North American experiences in reserves quantification, evaluation and certification; Metodologia de financeiamento para pequenos produtores do mercado brasileiro de petroleo e gas natural, baseado nas experiencias canadense e americana na quantificacao, valoracao e certificacao de reservas

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Enrico Brunno Zipoli de Sousa e [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Programa de Pos-Graduacao em Geologia; Coelho, Jose Mario [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Dept. de Minas

    2008-07-01

    ANP (the National Agency of Petroleum, Natural Gas and Biofuels), through its auctions of exploratory blocks in the years following the break of PETROBRAS' monopoly by Law 9.478 of 1997, played an important role in opening the sector and in attaining self-sufficiency in petroleum. However, petroleum production in mature and marginal fields was left aside, since the initial interest in the first rounds was to attract the major companies - International Oil Companies (IOCs) - when ANP granted large offshore blocks. Mature fields are defined as fields in a phase of irreversible decline, and marginal fields are likewise defined by an economic concept, determined by business decisions and external economic factors (price of the oil, etc.). Canada and the USA, worldwide leaders in the petroleum and gas market, have policies that benefit small companies and keep them protected from the competition of the IOCs by assuring small companies finance through the guarantee of proved oil reserves. This paper assembles the Canadian and American experiences in regulation for small-company investment and compares them with the Brazilian financing options, which are restricted owing to Brazilian finance agents' unfamiliarity with oil and gas activity. (author)

  1. Accessible quantification of multiparticle entanglement

    CERN Document Server

    Cianciaruso, Marco; Adesso, Gerardo

    2015-01-01

    Entanglement is a key ingredient for quantum technologies and a fundamental signature of quantumness in a broad range of phenomena encompassing many-body physics, thermodynamics, cosmology, and life sciences. For arbitrary multiparticle systems, the quantification of entanglement typically involves hard optimisation problems, and requires demanding tomographical techniques. In this paper we show that such difficulties can be overcome by developing an experimentally friendly method to evaluate measures of multiparticle entanglement via a geometric approach. The method provides exact analytical results for a relevant class of mixed states of $N$ qubits, and computable lower bounds to entanglement for any general state. For practical purposes, the entanglement determination requires local measurements in just three settings for any $N$. We demonstrate the power of our approach to quantify multiparticle entanglement in $N$-qubit bound entangled states and other states recently engineered in laboratory using quant...

  2. The methodological cat

    Directory of Open Access Journals (Sweden)

    Marin Dinu

    2014-03-01

    Full Text Available Economics understands action as having the connotation of here and now, the proof being that it excessively uses, for explicative purposes, two limitations of sense: space is seen as the place with a private destination (through the cognitive dissonance of methodological individualism), and time is seen as the short term (through the dystopia of rational markets).

  3. Video: Modalities and Methodologies

    Science.gov (United States)

    Hadfield, Mark; Haw, Kaye

    2012-01-01

    In this article, we set out to explore what we describe as the use of video in various modalities. For us, modality is a synthesizing construct that draws together and differentiates between the notion of "video" both as a method and as a methodology. It encompasses the use of the term video as both product and process, and as a data collection…

  4. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  5. Quantification of the volumetric benefit of image-guided radiotherapy (I.G.R.T.) in prostate cancer: Margins and presence probability map; Benefice volumetrique de la radiotherapie guidee par l'image dans les cancers prostatiques: marges et cartographies de probabilite de presence

    Energy Technology Data Exchange (ETDEWEB)

    Cazoulat, G.; Crevoisier, R. de; Simon, A.; Louvel, G.; Manens, J.P.; Haigron, P. [Inserm, U642, 35 - Rennes (France); Rennes-1 Univ., 35 (France); Crevoisier, R. de; Louvel, G.; Manens, J.P.; Lafond, C. [Centre Eugene-Marquis, Dept. de Radiotherapie, 35 - Rennes (France)

    2009-09-15

    Purpose: To quantify the prostate and seminal vesicle (S.V.) anatomic variations in order to choose appropriate margins including intrapelvic anatomic variations, and to quantify the volumetric benefit of image-guided radiotherapy (I.G.R.T.). Patients and methods: Twenty patients, receiving a total dose of 70 Gy to the prostate, had a planning CT scan and eight weekly CT scans during treatment. Prostate and S.V. were manually contoured. Each weekly CT scan was registered to the planning CT scan according to three modalities: radiopaque skin marks, pelvic bone or prostate. For each patient, prostate and S.V. displacements were quantified, and 3-dimensional maps of prostate and S.V. presence probability were established. Volumes including minimal presence probabilities were compared between the three modalities of registration. Results: For the intrapelvic prostate displacements, systematic and random variations and maximal displacements for the entire population were 5 mm, 2.7 mm and 16.5 mm along the anteroposterior axis; 2.7 mm, 2.4 mm and 11.4 mm along the supero-inferior axis; and 0.5 mm, 0.8 mm and 3.3 mm laterally. Margins according to the van Herk recipe (to cover the prostate for 90% of the patients with the 95% isodose) were 8 mm, 8.3 mm and 1.9 mm, respectively. The 100% prostate presence probability volumes correspond to 37%, 50% and 61% according to the registration modality. For the S.V., these volumes correspond to 8%, 14% and 18% of the S.V. volume. Conclusions: Without I.G.R.T., 5 mm prostate posterior margins are insufficient and should be at least 8 mm to account for intrapelvic anatomic variations. Prostate registration almost doubles the 100% presence probability volume compared to skin registration. Deformation of the S.V. will require either dramatically increased margins (simple) or new planning (not realistic). (authors)
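
    The margin recipe cited above is the standard van Herk formula, M = 2.5*Sigma + 0.7*sigma, with Sigma the systematic and sigma the random standard deviation over the population. The sketch below applies it to the per-axis statistics quoted in the abstract; it closely reproduces the quoted supero-inferior and lateral margins, but the antero-posterior result differs from the quoted 8 mm, so the inputs or rounding used in the original analysis evidently differ and the code should be read as illustrative only.

        # Van Herk margin recipe: M = 2.5*Sigma + 0.7*sigma per axis.

        def van_herk_margin(systematic_mm, random_mm):
            return 2.5 * systematic_mm + 0.7 * random_mm

        # (systematic SD, random SD) in mm, as quoted in the abstract
        axes = {"antero-posterior": (5.0, 2.7),
                "supero-inferior":  (2.7, 2.4),
                "lateral":          (0.5, 0.8)}

        for name, (sig, rnd) in axes.items():
            print(f"{name}: {van_herk_margin(sig, rnd):.1f} mm")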

  6. A Risk-Based Sensor Placement Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ronald W [ORNL; Kulesz, James J [ORNL

    2006-08-01

    A sensor placement methodology is proposed to solve the problem of optimal location of sensors or detectors to protect population against the exposure to and effects of known and/or postulated chemical, biological, and/or radiological threats. Historical meteorological data are used to characterize weather conditions as wind speed and direction pairs with the percentage of occurrence of the pairs over the historical period. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate population at risk against standard exposure levels. Sensor locations are determined via a dynamic programming algorithm where threats captured or detected by sensors placed in prior stages are removed from consideration in subsequent stages. Moreover, the proposed methodology provides a quantification of the marginal utility of each additional sensor or detector. Thus, the criterion for halting the iterative process can be the number of detectors available, a threshold marginal utility value, or the cumulative detection of a minimum factor of the total risk value represented by all threats.
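
    The stage-wise logic described above can be illustrated with a greedy sketch: at each stage, place a sensor at the candidate site covering the most remaining weighted risk, then remove the threats it detects, recording each sensor's marginal utility. Threats, weights and coverage sets are toy values; the real method derives coverage from atmospheric transport and dispersion modelling rather than fixed sets.

        # Greedy stage-wise placement in the spirit of the dynamic-programming
        # scheme: threats captured at earlier stages are removed from later ones.

        # threat -> population-risk weight (occurrence-weighted consequence)
        risk = {"t1": 50.0, "t2": 30.0, "t3": 15.0, "t4": 5.0}
        # candidate site -> set of threats a sensor there would detect
        covers = {"A": {"t1", "t2"}, "B": {"t2", "t3", "t4"}, "C": {"t1", "t4"}}

        remaining = dict(risk)
        placed = []
        while remaining and len(placed) < 2:          # stop rule: sensor budget
            site = max(covers, key=lambda s: sum(remaining.get(t, 0.0) for t in covers[s]))
            gain = sum(remaining.get(t, 0.0) for t in covers[site])
            if gain == 0:
                break
            placed.append((site, gain))               # marginal utility of this sensor
            for t in covers[site]:
                remaining.pop(t, None)

        print(placed)          # [('A', 80.0), ('B', 20.0)]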

  7. Protein inference: A protein quantification perspective.

    Science.gov (United States)

    He, Zengyou; Huang, Ting; Liu, Xiaoqing; Zhu, Peijun; Teng, Ben; Deng, Shengchun

    2016-08-01

    In mass spectrometry-based shotgun proteomics, protein quantification and protein identification are two major computational problems. To quantify the protein abundance, a list of proteins must be firstly inferred from the raw data. Then the relative or absolute protein abundance is estimated with quantification methods, such as spectral counting. Until now, most researchers have been dealing with these two processes separately. In fact, the protein inference problem can be regarded as a special protein quantification problem in the sense that truly present proteins are those proteins whose abundance values are not zero. Some recent published papers have conceptually discussed this possibility. However, there is still a lack of rigorous experimental studies to test this hypothesis. In this paper, we investigate the feasibility of using protein quantification methods to solve the protein inference problem. Protein inference methods aim to determine whether each candidate protein is present in the sample or not. Protein quantification methods estimate the abundance value of each inferred protein. Naturally, the abundance value of an absent protein should be zero. Thus, we argue that the protein inference problem can be viewed as a special protein quantification problem in which one protein is considered to be present if its abundance is not zero. Based on this idea, our paper tries to use three simple protein quantification methods to solve the protein inference problem effectively. The experimental results on six data sets show that these three methods are competitive with previous protein inference algorithms. This demonstrates that it is plausible to model the protein inference problem as a special protein quantification task, which opens the door of devising more effective protein inference algorithms from a quantification perspective. The source codes of our methods are available at: http://code.google.com/p/protein-inference/.
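
    A minimal illustration of this framing follows: estimate each candidate protein's abundance by spectral counting (splitting shared peptides uniformly, one simple convention among several), then infer as present exactly those proteins with non-zero abundance. The peptide data are invented for the example.

        # Protein inference as a quantification problem: present = abundance > 0.

        from collections import defaultdict

        # peptide -> (spectral count, candidate proteins sharing that peptide)
        psms = {
            "PEPTIDEA": (12, ["P1"]),
            "PEPTIDEB": (4,  ["P1", "P2"]),   # degenerate peptide, shared evidence
            "PEPTIDEC": (0,  ["P3"]),         # candidate with no observed spectra
        }

        abundance = defaultdict(float)
        for count, proteins in psms.values():
            for p in proteins:
                abundance[p] += count / len(proteins)

        present = sorted(p for p, a in abundance.items() if a > 0)
        print(dict(abundance))   # {'P1': 14.0, 'P2': 2.0, 'P3': 0.0}
        print("inferred present:", present)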

  8. Spanish methodological approach for biosphere assessment of radioactive waste disposal.

    Science.gov (United States)

    Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C

    2007-10-01

    The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates.

  9. Utilité du partage des corpus pour l'analyse des interactions en ligne en situation d'apprentissage : un exemple d'approche méthodologique autour d'une base de corpus d'apprentissage Benefits of Sharing Corpora when Analyzing Online Interactions: an Example of Methodology Related to a Databank of Learning and Teaching Corpora.

    Directory of Open Access Journals (Sweden)

    Maud Ciekanski

    2010-12-01

    Full Text Available The study of online learning, whether aimed at understanding this form of situated human learning, at evaluating relevant pedagogical scenarios and settings, or at improving technological environments, requires the availability of interaction data from all participants in the learning situations. However, usually data are either inaccessible or of limited access to those who were not involved in the original project. Moreover, data are fragmented, and therefore decontextualized with respect to the original teaching/learning settings. Sometimes they are buried in a proprietary format within the technological environment. The consequence is that research lacks a scientific basis. In the literature, comparisons are often attempted between objects that are ill-defined and may in fact be different. The processes of scientific enquiry, such as re-analyzing, replicating, verifying, refuting or extending the original findings, are therefore disabled. To address this anomaly, we suggest creating and disseminating a new type of corpus, a contextualized learner corpus, entitled "LEarning and TEaching Corpus" (Letec). Such corpora include not only the data that correspond to the output of learner activity in online courses, but also their context. Sharing Letec corpora within the research community implies that: (1) corpora are formatted and structured according to a new model which is compatible with existing standards for corpora and for learning design specifications; (2) corpora are placed on a server offering cross-platform compatibility and free access; and (3) an ethics policy is formulated as well as copyright licences. This paper presents the answers brought by our Mulce project from a theoretical and methodological standpoint. We give examples extracted from two learning and teaching corpora (Simuligne and Copéas). We show how data structured

  10. On the Quantification of Incertitude in Astrophysical Simulation Codes

    Science.gov (United States)

    Hoffman, Melissa; Katz, Maximilian P.; Willcox, Donald E.; Ferson, Scott; Swesty, F. Douglas; Calder, Alan

    2017-01-01

    We present a pedagogical study of uncertainty quantification (UQ) due to epistemic uncertainties (incertitude) in astrophysical modeling using the stellar evolution software instrument MESA (Modules and Experiments for Stellar Astrophysics). We present a general methodology for UQ and examine the specific case of stars evolving from the main sequence to carbon/oxygen white dwarfs. Our study considers two epistemic variables: the wind parameters during the Red Giant and Asymptotic Giant branch phases of evolution. We choose uncertainty intervals for each variable, and use these as input to MESA simulations. Treating MESA as a "black box," we apply two UQ techniques, Cauchy deviates and Quadratic Response Surface Models, to obtain bounds for the final white dwarf masses. Our study is a proof of concept applicable to other computational problems to enable a more robust understanding of incertitude. This work was supported in part by the US Department of Energy under grant DE-FG02-87ER40317.
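
    Of the two techniques, the quadratic response surface model is the easier to sketch: sample the black box over the epistemic box, fit a full quadratic by least squares, and bound the surrogate over the box. The stand-in function and parameter intervals below are placeholders for actual MESA runs, not the study's values.

        # Quadratic response-surface sketch for epistemic interval bounds.

        import itertools
        import numpy as np

        def black_box(eta_rgb, eta_agb):            # stand-in for a MESA run
            return 0.55 + 0.08*eta_rgb - 0.05*eta_agb + 0.02*eta_rgb*eta_agb

        lo, hi = np.array([0.3, 0.3]), np.array([0.9, 0.9])   # assumed wind intervals
        rng = np.random.default_rng(3)
        X = lo + (hi - lo) * rng.random((30, 2))
        y = np.array([black_box(*x) for x in X])

        # Design matrix for a full quadratic in two variables
        def features(x):
            a, b = x
            return [1, a, b, a*a, b*b, a*b]

        coef, *_ = np.linalg.lstsq(np.array([features(x) for x in X]), y, rcond=None)

        # Bound the surrogate on a dense grid over the box (cheap: it is a polynomial)
        grid = np.array(list(itertools.product(np.linspace(lo[0], hi[0], 50),
                                               np.linspace(lo[1], hi[1], 50))))
        vals = np.array([features(g) for g in grid]) @ coef
        print(f"final WD mass bounds ~ [{vals.min():.3f}, {vals.max():.3f}] Msun")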

  11. Tannins quantification in barks of Mimosa tenuiflora and Acacia mearnsii

    Directory of Open Access Journals (Sweden)

    Leandro Calegari

    2016-03-01

    Full Text Available Due to their chemical complexity, there are several methodologies for the quantification of vegetable tannins. This work aims at quantifying both the tannin and non-tannin substances present in the barks of Mimosa tenuiflora and Acacia mearnsii by two different methods. From bark particles of both species, analytical solutions were produced using a steam-jacketed extractor. The solutions were analyzed by the Stiasny and hide-powder (non-chromed) methods. For both species, tannin levels were higher when analyzed by the hide-powder method, reaching 47.8% and 24.1% for A. mearnsii and M. tenuiflora, respectively. By the Stiasny method, the tannin levels were 39.0% for A. mearnsii and 15.5% for M. tenuiflora. Despite the better results presented by A. mearnsii, the bark of M. tenuiflora also showed great potential due to its considerable amount of tannin and the availability of the species in the Caatinga biome.

  12. How to produce EDMS requirements and cost-benefit data.

    Science.gov (United States)

    Kerkoulas, Patrice Schwegman

    2002-08-01

    Electronic document management systems (EDMS) have a profound impact on administrative operations of health care provider organizations. Thorough yet conservative system requirements and cost-benefit data can prove the necessity and priority of the EDMS. This case study-based article provides a methodology for all EDMS implementations, including the preparation of the vision and scope, business analysis, cost-benefit analysis, and system specification and project plan. These are illustrated with EDMS examples. To successfully minimize project risk, the article reviews the importance of phasing, standards, and integration, and it provides six detailed examples of this methodology.

  13. Teacher Retirement Benefits

    Science.gov (United States)

    Costrell, Robert; Podgursky, Michael

    2009-01-01

    The ongoing global financial crisis is forcing many employers, from General Motors to local general stores, to take a hard look at the costs of the compensation packages they offer employees. For public school systems, this will entail a consideration of fringe benefit costs, which in recent years have become an increasingly important component of…

  14. Exercise: Benefits of Exercise

    Medline Plus

    Full Text Available ... show that people with arthritis, heart disease, or diabetes benefit from regular exercise. Exercise also helps people ... or difficulty walking. To learn about exercise and diabetes, see "Exercise and Type 2 Diabetes" from Go4Life®, ...

  15. PENSION FUND BENEFITS SERVICE

    CERN Document Server

    Benefits Service

    2002-01-01

    Please note that from now on, our offices will be opened to members and beneficiaries on Tuesday, Wednesday and Thursday from 10 to 12 a.m. and from 3 to 5 p.m. We are otherwise available but by appointment only. Benefits Service 5-1-030 tel. 79194 / 72738

  16. PENSION FUND BENEFITS SERVICE

    CERN Multimedia

    Benefits Service

    2002-01-01

    Please note that from now on, our offices (5-1-030) will be opened to members and beneficiaries on Tuesday, Wednesday and Thursday from 10 to 12 a.m. and from 3 to 5 p.m. We are otherwise available but by appointment only. Benefits Service (tel. 79194 / 72738)

  17. Making benefit transfers work

    DEFF Research Database (Denmark)

    Bateman, I.J.; Brouwer, R.; Ferrini, S.

    We develop and test guidance principles for benefits transfers. These argue that when transferring across relatively similar sites, simple mean value transfers are to be preferred but that when sites are relatively dissimilar then value function transfers will yield lower errors. The paper also...

  18. Benefits at risk

    DEFF Research Database (Denmark)

    Lassen, Jesper; Sandøe, Peter

    2007-01-01

    Herbicide resistant GM plants have been promoted as a tool in the development of more environment-friendly agriculture. The environmental benefits here, however, depend not only on farmers' acceptance of GM crops as such, but also on their willingness to use herbicides in accordance with altered...

  19. Soil carbon, multiple benefits

    NARCIS (Netherlands)

    Milne, E.; Banwart, S.A.; Noellemeyer, E.; Abson, D.J.; Ballabio, C.; Bampa, F.; Bationo, A.; Batjes, N.H.; Bernoux, M.; Bhattacharyya, T.

    2015-01-01

    In March 2013, 40 leading experts from across the world gathered at a workshop, hosted by the European Commission, Directorate General Joint Research Centre, Italy, to discuss the multiple benefits of soil carbon as part of a Rapid Assessment Process (RAP) project commissioned by Scientific

  20. Public services, personal benefits

    NARCIS (Netherlands)

    Bob Kuhry; Evert Pommer; Jedid-Jah Jonker; John Stevens

    2006-01-01

    Original title: Publieke productie & persoonlijk profijt. This report looks in detail at the costs of public services (such as care, education, public administration and safety) and the benefits that citizens derive from the government expenditure involved in delivering those services. In

  1. The Benefits of Latin?

    Science.gov (United States)

    Holliday, Lisa R.

    2012-01-01

    Classicists have long claimed that the study of Latin has benefits that exceed knowledge of the language itself, and in the current economic times, these claims are made with urgency. Indeed, many contend that Latin improves English grammar and writing skills, cognitive abilities, and develops transferable skills necessary for success in the…

  2. Public services, personal benefits

    NARCIS (Netherlands)

    Bob Kuhry; Evert Pommer; Jedid-Jah Jonker; John Stevens

    2006-01-01

    Original title: Publieke productie & persoonlijk profijt. This report looks in detail at the costs of public services (such as care, education, public administration and safety) and the benefits that citizens derive from the government expenditure involved in delivering those services. In 2003,

  3. Inclusion: Who Really Benefits?

    Science.gov (United States)

    Wilson-Younger, Dylinda

    2009-01-01

    Since the reauthorization of 2003, schools across the nation are mandated to educate students within the regular educational environment. What impact does this merger have on students and teachers? Who really benefits from this merger of regular education and special education? This article discusses the attitudes of general education teachers…

  4. More Benefits of Automation.

    Science.gov (United States)

    Getz, Malcolm

    1988-01-01

    Describes a study that measured the benefits of an automated catalog and automated circulation system from the library user's point of view in terms of the value of time saved. Topics discussed include patterns of use, access time, availability of information, search behaviors, and the effectiveness of the measures used. (seven references)…

  5. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    Science.gov (United States)

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits given by the national authority for the waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository, but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or by statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful for generating random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN, and it represents a first approach to the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators.
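
    The propagation step can be sketched as follows: draw the unknown inputs (here a trace cobalt content and a fluence) from the mixed random vector, push them through a toy activation formula, and report percentiles of the resulting activity. The distributions, the use of the cross-section and the normalisation constant are illustrative assumptions, not CERN's actual calculation chain.

        # Monte Carlo propagation of input uncertainty to a (toy) activity.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 20_000

        # Trace cobalt mass fraction in steel: lognormal when unmeasured (ppm)
        co_ppm = rng.lognormal(mean=np.log(500), sigma=0.5, size=n)
        # Thermal-equivalent fluence seen by the part (cm^-2), uniform interval
        fluence = rng.uniform(1e14, 5e14, size=n)

        sigma_act = 37e-24            # Co-59 (n,gamma) cross-section, cm^2 (~37 b)
        k = 1.0e-2                    # lumped normalisation constant (toy)

        activity = k * co_ppm * 1e-6 * fluence * sigma_act   # arbitrary Bq/g scale
        p5, p50, p95 = np.percentile(activity, [5, 50, 95])
        print(f"specific activity: median {p50:.2e} (90% interval {p5:.2e}-{p95:.2e})")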

  6. Social Security and Medicare Benefits

    Data.gov (United States)

    Social Security Administration — Annual cash benefits and rehabilitation benefits paid in each year from the Old-Age and Survivors Insurance, and Disability Insurance Trust Funds, and benefits paid...

  7. Value of information: A roadmap to quantifying the benefit of structural health monitoring

    DEFF Research Database (Denmark)

    Straub, D.; Chatzi, E.; Bismut, E.

    2017-01-01

    The concept of value of information (VoI) enables quantification of the benefits provided by structural health monitoring (SHM) systems – in principle. Its implementation is challenging, as it requires an explicit modelling of the structural system’s life cycle, in particular of the decisions...

  8. Methodology, Meditation, and Mindfulness

    Directory of Open Access Journals (Sweden)

    Balveer Singh Sikh

    2016-04-01

    Full Text Available Understanding the nondualistic nature of mindfulness is a complex and challenging task, particularly when most clinical psychology draws from Western methodologies and methods. In this article, we argue that the integration of philosophical hermeneutics with Eastern philosophy and practices may provide a methodology and methods to research mindfulness practice. Mindfulness hermeneutics brings together the nondualistically aligned Western philosophies of Heidegger and Gadamer and selected Eastern philosophies and practices in an effort to bridge the gap between these differing worldviews. Based on the following: (1) fusion of horizons, (2) being in a hermeneutic circle, (3) understanding as intrinsic to awareness, and (4) the ongoing practice of meditation, a mindfulness hermeneutic approach was used to illuminate deeper understandings of mindfulness practice in ways that are congruent with its underpinning philosophies.

  9. METHODOLOGICAL BASES OF OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Lanskaya D. V.

    2014-09-01

    Full Text Available Outsourcing is investigated, within the institutional theory, as a means for a public corporation to gain stable and unique competitive advantages by attracting carriers of unique intellectual capital and using the social capital of specialized companies. Key researchers and events in the history of outsourcing are identified, and the existing approaches to defining the concept of outsourcing, together with the advantages and risks of applying outsourcing, are considered. It is established that the differences between outsourcing, subcontracting and cooperation lie not in the nature of the functional relations but in the depth of the economic terms and phenomena considered. The methodology of outsourcing is treated as part of the methodology of cooperation among the entrepreneurial innovative structures of the emerging knowledge-economy sector.

  10. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  11. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques and core tenets, described through a wide range of settings.

  12. Tobacco documents research methodology.

    Science.gov (United States)

    Anderson, Stacey J; McCandless, Phyra M; Klausner, Kim; Taketa, Rachel; Yerger, Valerie B

    2011-05-01

    Tobacco documents research has developed into a thriving academic enterprise since its inception in 1995. The technology supporting tobacco documents archiving, searching and retrieval has improved greatly since that time, and consequently tobacco documents researchers have considerably more access to resources than was the case when researchers had to travel to physical archives and/or electronically search poorly and incompletely indexed documents. The authors of the papers presented in this supplement all followed the same basic research methodology. Rather than leave the reader of the supplement to read the same discussion of methods in each individual paper, presented here is an overview of the methods all authors followed. In the individual articles that follow in this supplement, the authors present the additional methodological information specific to their topics. This brief discussion also highlights technological capabilities in the Legacy Tobacco Documents Library and updates methods for organising internal tobacco documents data and findings.

  13. WHEAT GRASS HEALTH BENEFITS

    Directory of Open Access Journals (Sweden)

    Akula Annapurna

    2013-10-01

    Full Text Available Nutraceutical is a food or food product that provides health and medical benefits, including the prevention and treatment of disease. Nutraceuticals are products typically claimed to prevent chronic diseases, improve health, delay the aging process, and increase life expectancy. Let us know something about one such nutraceutical. Wheatgrass, a commonly found herb in India, contains enzymes like protease, cytochrome, amylase, lipase, transhydrogenase and SOD (superoxide dismutase). Besides these enzymes, it also contains all the essential amino acids, especially alanine, aspartic acid, glutamic acid, arginine and serine, which are helpful in providing a good amount of protein in the body, which builds and repairs tissues. Wheatgrass contains chlorophyll and flavonoids in good amount. It also contains vitamins like vitamin A, vitamin C, and vitamin E and minerals like iron, calcium and magnesium. Chlorophyll has been shown to build red blood cells quickly, cure anemia, and normalise blood pressure by dilating the blood vessels. Chlorophyll has been shown to produce an unfavourable environment for bacterial growth in the body and is therefore effective in increasing the body's resistance to illness. Probably the most important benefit of wheatgrass is that it is a cancer-fighting agent. Many people strongly believe that the benefits of wheatgrass on cancer are real and that consuming wheatgrass can help in the treatment and even in the prevention of cancer. Wheatgrass produces an immunization effect against many dietary carcinogens. Additional benefits of wheatgrass are better complexion and a healthy glow. The slowing of graying hair is also a benefit believed to come from wheatgrass. We can grow wheatgrass in small cups, pots and trays very conveniently in our homes, so that we will have fresh juice and powder with minimum cost.

  14. Land evaluation methodology

    OpenAIRE

    Lustig, Thomas

    1998-01-01

    This paper reviews non-computerised and computerised land evaluation methods or methodologies, and recognises the difficulties of incorporating biophysical and socioeconomic factors from different levels. Therefore, this paper theorises an alternative land evaluation approach, which is tested and elaborated in an agricultural community in the North of Chile. The basis of the approach relies on holistic thinking and attempts to evaluate the potential for improving assumed unsustainable goat manage...

  15. Pipeline ADC Design Methodology

    OpenAIRE

    Zhao, Hui

    2012-01-01

    Demand for high-performance analog-to-digital converter (ADC) integrated circuits (ICs) with optimal combined specifications of resolution, sampling rate and power consumption has become dominant due to emerging applications in wireless communications, broadband transceivers, digital-intermediate frequency (IF) receivers and countless digital devices. This research is dedicated to developing a pipeline ADC design methodology with minimum power dissipation, while keeping relatively high speed an...

  16. Albert Einstein's Methodology

    OpenAIRE

    Weinstein, Galina

    2012-01-01

    This paper discusses Einstein's methodology. 1. Einstein characterized his work as a theory of principle and reasoned that beyond kinematics, the 1905 heuristic relativity principle could offer new connections between non-kinematical concepts. 2. Einstein's creativity and inventiveness and process of thinking; invention or discovery. 3. Einstein considered his best friend Michele Besso as a sounding board and his class-mate from the Polytechnic Marcel Grossman, as his active partner. Yet, Ein...

  18. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  19. Analytical methodologies based on LC–MS/MS for monitoring selected emerging compounds in liquid and solid phases of the sewage sludge

    Science.gov (United States)

    Boix, C.; Ibáñez, M.; Fabregat-Safont, D.; Morales, E.; Pastor, L.; Sancho, J.V.; Sánchez-Ramírez, J.E.; Hernández, F.

    2016-01-01

    In this work, two analytical methodologies based on liquid chromatography coupled to tandem mass spectrometry (LC–MS/MS) were developed for quantification of emerging pollutants identified in sewage sludge after a previous wide-scope screening. The target list included 13 emerging contaminants (EC): thiabendazole, acesulfame, fenofibric acid, valsartan, irbesartan, salicylic acid, diclofenac, carbamazepine, 4-aminoantipyrine (4-AA), 4-acetyl aminoantipyrine (4-AAA), 4-formyl aminoantipyrine (4-FAA), venlafaxine and benzoylecgonine. The aqueous and solid phases of the sewage sludge were analyzed making use of Solid-Phase Extraction (SPE) and UltraSonic Extraction (USE) for sample treatment, respectively. The methods were validated at three concentration levels: 0.2, 2 and 20 μg L⁻¹ for the aqueous phase, and 50, 500 and 2000 μg kg⁻¹ for the solid phase of the sludge. In general, the method was satisfactorily validated, showing good recoveries (70–120%) and precision (RSD < 20%). Regarding the limit of quantification (LOQ), it was below 0.1 μg L⁻¹ in the aqueous phase and below 50 μg kg⁻¹ in the solid phase for the majority of the analytes. The method applicability was tested by analysis of samples from a wider study on degradation of emerging pollutants in sewage sludge under anaerobic digestion. The key benefits of these methodologies are: • SPE and USE are appropriate sample procedures to extract selected emerging contaminants from the aqueous phase of the sewage sludge and the solid residue. • LC–MS/MS is highly suitable for determining emerging contaminants in both sludge phases. • To our knowledge, the main metabolites of dipyrone had not been studied before in sewage sludge. PMID:27222823
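
    As a small illustration of the validation arithmetic quoted above (recoveries of 70–120% and RSD < 20%), the sketch below computes recovery and precision from replicate spiked samples; the replicate values are invented for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical sketch of the validation arithmetic: recovery (%) and precision
# (RSD, %) computed from replicate spiked samples at one fortification level.
spiked_level = 2.0                                   # ug/L added to the matrix
measured = np.array([1.82, 1.95, 2.10, 1.88, 2.04])  # replicate determinations

recovery = measured.mean() / spiked_level * 100
rsd = measured.std(ddof=1) / measured.mean() * 100
print(f"recovery {recovery:.0f}% (target 70-120%), RSD {rsd:.1f}% (target <20%)")
```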

  20. Quantification and identification of components in solution mixtures from 1D proton NMR spectra using singular value decomposition.

    Science.gov (United States)

    Xu, Qiuwei; Sachs, Jeffrey R; Wang, Ting-Chuan; Schaefer, William H

    2006-10-15

    One-dimensional proton NMR spectra of complex solutions provide rich molecular information, but limited chemical shift dispersion creates peak overlap that often leads to difficulty in peak identification and analyte quantification. Modern high-field NMR spectrometers provide high digital resolution with improved peak dispersion. We took advantage of these spectral qualities and developed a quantification method based on linear least-squares fitting using singular value decomposition (SVD). The linear least-squares fitting of a mixture spectrum was performed on the basis of reference spectra from individual small-molecule analytes. Each spectrum contained an internal quantitative reference (e.g., DSS-d6 or other suitable small molecules) by which the intensity of the spectrum was scaled. Normalization of the spectrum facilitated quantification based on peak intensity using linear least-squares fitting analysis. This methodology provided quantification of individual analytes as well as chemical identification. The analysis of small-molecule analytes over a wide concentration range indicated the accuracy and reproducibility of the SVD-based quantification. To account for the contribution from residual protein, lipid or polysaccharide in solution, a reference spectrum showing the macromolecules or aggregates was obtained using a diffusion-edited 1D proton NMR analysis. We demonstrated this approach with a mixture of small-molecule analytes in the presence of macromolecules (e.g., protein). The results suggested that this approach should be applicable to the quantification and identification of small-molecule analytes in complex biological samples.
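
    A minimal sketch of the core idea follows, using synthetic Lorentzian reference spectra as a stand-in for real analyte spectra: the mixture spectrum is modelled as a linear combination of per-analyte references, and the relative concentrations are recovered with an SVD-backed least-squares solve (numpy's lstsq calls an SVD-based LAPACK routine). This illustrates the approach, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

n_points = 2048
axis = np.linspace(0, 10, n_points)            # chemical shift axis, ppm

def lorentzian(center, width=0.02):
    return width**2 / ((axis - center)**2 + width**2)

# Reference spectra of three analytes, each scaled to an internal standard.
refs = np.column_stack([
    lorentzian(2.1) + lorentzian(3.4),
    lorentzian(1.3) + lorentzian(3.5),          # overlaps analyte 1 near 3.4-3.5 ppm
    lorentzian(7.2),
])

true_conc = np.array([1.0, 0.4, 2.5])
mixture = refs @ true_conc + rng.normal(0, 0.01, n_points)   # noisy mixture spectrum

# np.linalg.lstsq solves min ||refs @ c - mixture|| via an SVD-based routine.
conc, *_ = np.linalg.lstsq(refs, mixture, rcond=None)
print(np.round(conc, 3))    # close to [1.0, 0.4, 2.5] despite peak overlap
```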

  1. Dynamic vapor sorption as a tool for characterization and quantification of amorphous content in predominantly crystalline materials.

    Science.gov (United States)

    Sheokand, Sneha; Modi, Sameer R; Bansal, Arvind K

    2014-11-01

    It is well established that pharmaceutical processing can cause disruption of the crystal structure, leading to generation of amorphous content in crystalline materials. The presence of even a small amount of amorphous form, especially on the surface of crystalline material, can affect processing, performance, and stability of a drug product. This necessitates the need to quantify, monitor, and control the amorphous form. Numerous analytical techniques have been reported for the quantification of amorphous phase, but issues of sensitivity, suitability, limit of detection, and quantitation pose significant challenges. The present review focuses on use of dynamic vapor sorption (DVS) for quantification of amorphous content in predominantly crystalline materials. The article discusses (1) theoretical and experimental considerations important for developing a quantification method, (2) methods used for quantification of amorphous content, (3) basis for selecting a suitable methodology depending on the properties of a material, and (4) role of various instrument and sample-related parameters in designing a protocol for quantification of amorphous content. Finally, DVS-based hyphenated techniques have been discussed as they can offer higher sensitivity for quantification of amorphous content.
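
    A common quantification strategy of the kind reviewed here is a calibration line built from physical mixtures of known amorphous content. The sketch below shows that arithmetic with invented response values; the actual DVS response variable and the linearity range depend on the material and protocol.

```python
import numpy as np

# Hypothetical DVS calibration: a sorption response (e.g., mass change at a
# given relative humidity) measured for standards of known amorphous content,
# a line fitted, and an unknown sample read off that line. Data are made up.
amorphous_pct = np.array([0.0, 1.0, 2.5, 5.0, 10.0])   # calibration standards
response = np.array([0.02, 0.13, 0.29, 0.55, 1.08])    # % mass change (invented)

slope, intercept = np.polyfit(amorphous_pct, response, 1)

unknown_response = 0.40
estimate = (unknown_response - intercept) / slope
print(f"estimated amorphous content: {estimate:.1f}%")
```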

  2. MAMA Software Features: Visual Examples of Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-20

    This document shows examples of the results from quantifying objects of certain sizes and types in the software. It is intended to give users a better feel for some of the quantification calculations, and, more importantly, to help users understand the challenges with using a small set of ‘shape’ quantification calculations for objects that can vary widely in shapes and features. We will add more examples to this in the coming year.

  3. Requirements and benefits of flow forecasting for improving hydropower generation

    OpenAIRE

    Dong, Xiaohua; Vrijling, J. K.; Dohmen-Janssen, Catarine M.; Ruigh, E.; Booij, Martijn J.; Stalenberg, B.; Hulscher, Suzanne J.M.H.; Van Gelder, P.H.A.J.M.; Verlaan, M.; Zijderveld, A; Waarts, P.

    2005-01-01

    This paper presents a methodology to identify the required lead time and accuracy of flow forecasting for improving the hydropower generation of a reservoir, by simulating the benefits (in terms of electricity generated) obtained from forecasting with varying lead times and accuracies. The benefit-lead time relationship was investigated only for perfect inflow forecasts, with a few selected forecasting lead times: 4 days, 10 days and 1 year. The water level and the release from the reservoir were ...

  4. Risk Quantification and Evaluation Modelling

    Directory of Open Access Journals (Sweden)

    Manmohan Singh

    2014-07-01

    Full Text Available In this paper the authors discuss risk quantification methods and the evaluation of risks and decision parameters to be used for ranking critical items, for prioritization of condition monitoring based risk and reliability centered maintenance (CBRRCM). As time passes, any equipment or product degrades to lower effectiveness and its rate of failure or malfunctioning increases, thereby lowering reliability. Thus, with the passage of time or a number of active tests or periods of work, the reliability of the product or system may fall to a low threshold value, below which the reliability should not be allowed to dip. Hence, it is necessary to fix a normal basis for determining the appropriate points in the product life cycle where predictive preventive maintenance may be applied, so that the reliability (the probability of successful functioning) can be enhanced, preferably to its original value, by reducing the failure rate and increasing the mean time between failures. This is very important for defence applications, where reliability is a prime concern. An attempt is made to develop a mathematical model for risk assessment and for ranking the items. Based on the likeliness coefficient β1 and risk coefficient β2, ranking of the sub-systems can be modelled and used for CBRRCM. Defence Science Journal, Vol. 64, No. 4, July 2014, pp. 378-384, DOI: http://dx.doi.org/10.14429/dsj.64.6366
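
    One plausible reading of the ranking step, as a sketch: each sub-system carries a likeliness coefficient β1 and a risk coefficient β2, and maintenance priority is ordered by a combination of the two. The product is used below for illustration; the paper's exact combination rule and coefficient values may differ.

```python
# Hypothetical ranking sketch in the spirit of the abstract; sub-system names
# and coefficient values are invented placeholders.
subsystems = {
    "gearbox":     {"beta1": 0.7, "beta2": 0.9},
    "hydraulics":  {"beta1": 0.5, "beta2": 0.6},
    "electronics": {"beta1": 0.9, "beta2": 0.4},
}

ranked = sorted(subsystems.items(),
                key=lambda kv: kv[1]["beta1"] * kv[1]["beta2"],
                reverse=True)

for name, c in ranked:
    print(f"{name:12s} priority score = {c['beta1'] * c['beta2']:.2f}")
```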

  5. Precise quantification of nanoparticle internalization.

    Science.gov (United States)

    Gottstein, Claudia; Wu, Guohui; Wong, Benjamin J; Zasadzinski, Joseph Anthony

    2013-06-25

    Nanoparticles have opened new exciting avenues for both diagnostic and therapeutic applications in human disease, and targeted nanoparticles are increasingly used as specific drug delivery vehicles. The precise quantification of nanoparticle internalization is of importance to measure the impact of physical and chemical properties on the uptake of nanoparticles into target cells or into cells responsible for rapid clearance. Internalization of nanoparticles has been measured by various techniques, but comparability of data between different laboratories is impeded by lack of a generally accepted standardized assay. Furthermore, the distinction between associated and internalized particles has been a challenge for many years, although this distinction is critical for most research questions. Previously used methods to verify intracellular location are typically not quantitative and do not lend themselves to high-throughput analysis. Here, we developed a mathematical model which integrates the data from high-throughput flow cytometry measurements with data from quantitative confocal microscopy. The generic method described here will be a useful tool in biomedical nanotechnology studies. The method was then applied to measure the impact of surface coatings of vesosomes on their internalization by cells of the reticuloendothelial system (RES). RES cells are responsible for rapid clearance of nanoparticles, and the resulting fast blood clearance is one of the major challenges in biomedical applications of nanoparticles. Coating of vesosomes with long chain polyethylene glycol showed a trend for lower internalization by RES cells.

  6. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D.Liu et al '17]. For modeling we used the TAU code, developed at DLR, Germany.

  7. Inverse problems and uncertainty quantification

    KAUST Repository

    Litvinenko, Alexander

    2013-12-18

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.
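
    The simplest member of the update family discussed here is the linear Bayesian update, which needs no Monte Carlo chain: the prior is shifted by a Kalman-type gain built from the covariance between the parameter and the model prediction. The toy forward model below is an assumption for illustration, not the paper's discretisation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal sketch (not the paper's code) of a *linear* Bayesian update: the
# prior parameter x is updated with a gain built from the covariances of x
# and the model prediction y = h(x), with no MCMC sampling loop.
n = 5000
x_prior = rng.normal(2.0, 1.0, n)              # prior samples of the parameter
y_pred = x_prior**2 + rng.normal(0, 0.5, n)    # forward model h(x) = x^2 plus noise

y_obs = 9.0                                    # the measurement

c_xy = np.cov(x_prior, y_pred)[0, 1]
c_yy = np.var(y_pred)
gain = c_xy / c_yy                             # scalar Kalman-type gain

x_post = x_prior + gain * (y_obs - y_pred)     # linear update of each sample
print(f"prior mean {x_prior.mean():.2f} -> posterior mean {x_post.mean():.2f}")
```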

  8. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  9. Climate and desertification: indicators for an assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Sciortino, M.; Caiaffa, E.; Fattoruso, G.; Donolo, R.; Salvetti, G.

    2009-07-01

    This work aims to define a methodology that, on the basis of commonly available surface climate records, assesses indicators of the increase or decrease of the extension of territories vulnerable to desertification and land degradation. The definition and quantification of environmental-policy-relevant indicators aim to improve the understanding and the decision-making processes in drylands. The results of this study show that since 1931 changes of climate have involved 90% of the territory of the Sicilian region, with stronger intensity in the internal areas of the Enna, Caltanissetta and Palermo provinces. (Author) 9 refs.

  10. PANSYSTEMS ANALYSIS: MATHEMATICS, METHODOLOGY,RELATIVITY AND DIALECTICAL THINKING

    Institute of Scientific and Technical Information of China (English)

    郭定和; 吴学谋; 冯向军; 李永礼

    2001-01-01

    Based on new analysis modes and new definitions with relative mathematization and simplification or strengthening forms for concepts of generalized systems, panderivatives, pansymmetry, panbox principle, pansystems relativity, etc., the framework and related principles of pansystems methodology and pansystems relativity are developed. Related contents include: pansystems with relatively universal mathematizing forms, 200 types of dualities, duality transformation, pansymmetry transformation, pansystems dialectics, the 8-domain method, pansystems mathematical methods, generalized quantification, the principles of approximation-transforming, pan-equivalence theorems, supply-demand analysis, thinking experiment, generalized gray systems, etc.

  11. University Benefits Survey. Part I (All Benefits Excluding Pensions).

    Science.gov (United States)

    University of Western Ontario, London.

    Results of a 1984 survey of benefits, excluding pensions, for 17 Ontario, Canada, universities are presented. Information is provided on the following areas: questions on general benefits, such as insurance plans, communication of benefits, proposed changes in benefits, provision of life and dismemberment insurance, and maternity leave policy;…

  12. Management of health care expenditure by soft computing methodology

    Science.gov (United States)

    Maksimović, Goran; Jović, Srđan; Jovanović, Radomir; Aničić, Obrad

    2017-01-01

    In this study, health care expenditure was managed by a soft computing methodology. The main goal was to predict gross domestic product (GDP) from several factors of health care expenditure. Soft computing methodologies were applied since GDP prediction is a very complex task. The performance of the proposed predictors was confirmed by the simulation results. According to the results, support vector regression (SVR) has better prediction accuracy than the other soft computing methodologies. The soft computing methods benefit from global optimization capabilities that avoid local-minimum issues.
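
    As a sketch of the SVR setup described (with synthetic stand-in data, since the study's health-expenditure factors are not reproduced here), scikit-learn's SVR can be trained on expenditure features and scored on held-out GDP values:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(9)

# Hypothetical data: four expenditure-related features and a GDP-like target.
X = rng.uniform(0, 1, size=(200, 4))
gdp = 3 * X[:, 0] + np.sin(4 * X[:, 1]) + rng.normal(0, 0.1, 200)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:150], gdp[:150])                       # train on the first 150 samples
rmse = np.sqrt(np.mean((model.predict(X[150:]) - gdp[150:]) ** 2))
print(f"hold-out RMSE: {rmse:.3f}")
```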

  13. Open access pricing methodology in economically adapted electric transmission systems

    Energy Technology Data Exchange (ETDEWEB)

    Rudnick, Hugh; Cura, Eliana; Palma, Rodrigo [Pontificia Univ. Catolica de Chile, Santiago (Chile). Dept. de Ingenieria Electrica

    1996-07-01

    Open access pricing methodologies are evaluated in a deregulated environment, as applied to an economically adapted electric transmission system over a ten-year time horizon. A transmission planning methodology using a genetic algorithm is used to determine the adapted system and the required investment over the horizon. A production cost simulation algorithm is utilized to determine the operation of the hydroelectric system and the resultant short-term marginal income. Different pricing methodologies to allocate the required supplement, as applied to the Chilean central interconnected electrical system, are evaluated: use of system, postage stamp and user benefit. The resultant payment allocations are assessed and their economic impact on participants is discussed. (author)

  14. Methodology for qualitative urban flooding risk assessment.

    Science.gov (United States)

    Leitão, João P; Almeida, Maria do Céu; Simões, Nuno E; Martins, André

    2013-01-01

    Pluvial or surface flooding can cause significant damage and disruption as it often affects highly urbanised areas. It is therefore essential to accurately identify consequences and assess the risks associated with such phenomena. The aim of this study is to present the results and investigate the applicability of a qualitative flood risk assessment methodology in urban areas. This methodology benefits from recent developments in urban flood modelling, such as the dual-drainage modelling concept, namely one-dimensional automatic overland flow network delineation tools (e.g. AOFD) and 1D/1D models incorporating both surface and sewer drainage systems. To assess flood risk, consequences were estimated using hydraulic model results, such as water velocities and water depths; likelihood was estimated based on the return period of historical rainfall events. To test the methodology, two rainfall events with return periods of 350 and 2 years observed in Alcântara (Lisbon, Portugal) were used, and three consequence dimensions were considered: affected public transportation services, affected properties and pedestrian safety. The most affected areas in terms of flooding were easily identified; the presented methodology was shown to be easy to implement and effective for assessing flood risk in urban areas, despite the common difficulties in obtaining data.
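
    A minimal sketch of such a qualitative scheme: consequence classes derived from modelled water depth and velocity are crossed with likelihood classes derived from the rainfall return period in a lookup matrix. All thresholds and class labels below are invented placeholders, not the values used for Alcântara.

```python
# Hypothetical qualitative risk matrix in the spirit of the methodology.
def consequence_class(depth_m: float, velocity_ms: float) -> int:
    hazard = depth_m * max(velocity_ms, 0.5)   # simple depth-velocity product
    if hazard < 0.1:
        return 0   # negligible
    if hazard < 0.5:
        return 1   # moderate (e.g., disrupted pedestrian routes)
    return 2       # severe (e.g., unsafe streets, flooded properties)

def likelihood_class(return_period_years: float) -> int:
    return 2 if return_period_years <= 2 else (1 if return_period_years <= 50 else 0)

RISK_MATRIX = [  # rows: likelihood class, columns: consequence class
    ["low",    "low",    "medium"],
    ["low",    "medium", "high"],
    ["medium", "high",   "high"],
]

# A cell flooded 0.4 m deep at 1 m/s by the 2-year event rates as high risk:
print(RISK_MATRIX[likelihood_class(2)][consequence_class(0.4, 1.0)])
```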

  15. A Structured Methodology for Spreadsheet Modelling

    CERN Document Server

    Knight, Brian; Rajalingham, Kamalesen

    2008-01-01

    In this paper, we discuss the problem of the software engineering of a class of business spreadsheet models. A methodology for structured software development is proposed, which is based on structured analysis of data, represented as Jackson diagrams. It is shown that this analysis allows a straightforward modularisation, and that individual modules may be represented with indentation in the block-structured form of structured programs. The benefits of the structured format are discussed, in terms of comprehensibility, ease of maintenance, and reduction in errors. The capability of the methodology to provide a modular overview of the model is described, and examples are given. The potential for a reverse-engineering tool to transform existing spreadsheet models is discussed.

  16. WHEAT GRASS HEALTH BENEFITS

    OpenAIRE

    Akula Annapurna

    2013-01-01

    Nutraceutical is a food or food product that provides health and medical benefits, including the prevention and treatment of disease. Nutraceuticals are products typically claimed to prevent chronic diseases, improve health, delay the aging process, and increase life expectancy. Let us know something about one such nutraceutical. Wheatgrass is a commonly found herb in India and contains enzymes like protease, cytochrome, amylase, lipase, transhydrogenase and SOD (super oxide dismutase). Besides the...

  17. Identification and quantification of the major constituents in Egyptian carob extract by liquid chromatography–electrospray ionization-tandem mass spectrometry

    Directory of Open Access Journals (Sweden)

    Asmaa Ibrahim Owis

    2016-01-01

    Full Text Available Background: Carob - Ceratonia siliqua L., commonly known as St John's-bread or locust bean, family Fabaceae - is one of the most useful native Mediterranean trees. There are no data about the chromatography methods performed by high performance liquid chromatography (HPLC) for determining polyphenols in Egyptian carob pods. Objective: To establish a sensitive and specific liquid chromatography–electrospray ionization (ESI)-tandem mass spectrometry (MSn) methodology for the identification of the major constituents in Egyptian carob extract. Materials and Methods: HPLC with diode array detector and ESI-mass spectrometry (MS) was developed for the identification and quantification of phenolic acids, flavonoid glycosides, and aglycones in the methanolic extract of Egyptian C. siliqua. The MS and MSn data together with HPLC retention times of phenolic components allowed structural characterization of these compounds. Peak integration of ions in the MS scans was used in the quantification technique. Results: A total of 36 compounds were tentatively identified. Twenty-six compounds were identified in the negative mode corresponding to 85.4% of plant dry weight, while ten compounds were identified in the positive mode representing 16.1% of plant dry weight, with a prevalence of flavonoids (75.4% of plant dry weight) predominantly represented by two methylapigenin-O-pentoside isomers (20.9 and 13.7% of plant dry weight). Conclusion: The identification of the various compounds present in carob pods opens a new door to an increased understanding of the different health benefits brought about by the consumption of carob and its products.

  18. Identification and Quantification of the Major Constituents in Egyptian Carob Extract by Liquid Chromatography–Electrospray Ionization-Tandem Mass Spectrometry

    Science.gov (United States)

    Owis, Asmaa Ibrahim; El-Naggar, El-Motaz Bellah

    2016-01-01

    Background: Carob - Ceratonia siliqua L., commonly known as St John's-bread or locust bean, family Fabaceae - is one of the most useful native Mediterranean trees. There are no data about the chromatography methods performed by high performance liquid chromatography (HPLC) for determining polyphenols in Egyptian carob pods. Objective: To establish a sensitive and specific liquid chromatography–electrospray ionization (ESI)-tandem mass spectrometry (MSn) methodology for the identification of the major constituents in Egyptian carob extract. Materials and Methods: HPLC with diode array detector and ESI-mass spectrometry (MS) was developed for the identification and quantification of phenolic acids, flavonoid glycosides, and aglycones in the methanolic extract of Egyptian C. siliqua. The MS and MSn data together with HPLC retention times of phenolic components allowed structural characterization of these compounds. Peak integration of ions in the MS scans was used in the quantification technique. Results: A total of 36 compounds were tentatively identified. Twenty-six compounds were identified in the negative mode corresponding to 85.4% of plant dry weight, while ten compounds were identified in the positive mode representing 16.1% of plant dry weight, with a prevalence of flavonoids (75.4% of plant dry weight) predominantly represented by two methylapigenin-O-pentoside isomers (20.9 and 13.7% of plant dry weight). Conclusion: The identification of the various compounds present in carob pods opens a new door to an increased understanding of the different health benefits brought about by the consumption of carob and its products. SUMMARY This research provides a good example of the rapid identification of major constituents in complex systems such as herbs using a sensitive, accurate and specific method coupling HPLC with DAD and MS, which facilitates the clarification of the phytochemical composition of herbal medicine for better understanding of their nature and

  19. Housing Accessibility Methodology Targeting Older People

    DEFF Research Database (Denmark)

    Helle, Tina

    … of valid and reliable assessment instruments targeting housing accessibility, and in-depth analysis of factors potentially impacting on reliability in complex assessment situations, is remarkably absent. Moreover, the knowledge base informing the housing standards appears to be vague. We may therefore … research, practice and policy in a global context for the benefit of the health and well-being of older people with functional limitations. Moreover, the results provide new knowledge and invite reflections on central concepts and methodology relevant to psychometrics and research on person-environment fit.

  20. Differing antidepressant maintenance methodologies.

    Science.gov (United States)

    Safer, Daniel J

    2017-10-01

    The principal evidence that antidepressant medication (ADM) is an effective maintenance treatment for adults with major depressive disorder (MDD) is from placebo substitution trials. These trials enter responders from ADM efficacy trials into randomized, double-blind placebo-controlled (RDBPC) effectiveness trials to measure the rate of MDD relapse over time. However, other randomized maintenance trial methodologies merit consideration and comparison. A systematic review of ADM randomized maintenance trials included research reports from multiple databases. Relapse rate was the main effectiveness outcome assessed. Five ADM randomized maintenance methodologies for MDD responders are described and compared for outcome. These effectiveness trials include: placebo substitution, ADM/placebo extension, ADM extension, ADM vs. psychotherapy, and treatment as usual. The placebo substitution trials for those abruptly switched to placebo resulted in unusually high (46%) rates of relapse over 6-12 months, twice the continuing ADM rate. These trials were characterized by selective screening, high attrition, an anxious anticipation of a switch to placebo, and a risk of drug withdrawal symptoms. Selectively screened ADM efficacy responders who entered into 4-12 month extension trials experienced relapse rates averaging ~10% with a low attrition rate. Non-industry sponsored randomized trials of adults with multiple prior MDD episodes who were treated with ADM maintenance for 1-2 years experienced relapse rates averaging 40%. Placebo substitution trial methodology represents only one approach to assessing ADM maintenance. Antidepressant maintenance research for adults with MDD should be evaluated for industry sponsorship, attrition, the impact of the switch to placebo, and major relapse differences in MDD subpopulations. Copyright © 2017. Published by Elsevier Inc.

  1. Analog Ensemble Methodology: Expansion and Optimization for Renewable Energy Applications

    Science.gov (United States)

    Harding, L.; Cervone, G.; Delle Monache, L.

    2015-12-01

    Renewable energy is fundamental for sustaining and developing society. Solar and wind energy are promising sources because of their decreased environmental impact relative to conventional energy sources, improved efficiency, and increased use. A key challenge with renewable energy production is the generation of accurate renewable energy forecasts at varying spatial and temporal scales to assist utility companies in effective energy management. Specifically, this research applies the Analog Ensemble (AnEn) methodology to short-term (0-48 hour) wind speed forecasting for power generation and short-term (0-72 hour) solar power measured (PM) output predictions. AnEn uses a set of past observations corresponding to the best analogs of a deterministic numerical weather prediction model to generate a probability distribution of future atmospheric states: an ensemble of analogs. Currently the AnEn methodology weights predictors equally and only handles 1D (time). We determine an optimal distribution of predictor weights based upon parameter characteristics, investigate spatial variations in the application of the methodology, and develop a theory expanding the methodology into 2D. The AnEn methodology improves short-term prediction accuracy, decreases computational costs, and provides uncertainty quantification, allowing utility companies to manage over- or under-generation for renewable energy sources.
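
    A toy sketch of the AnEn search (the operational method also normalises each predictor by its spread and matches over a short time window, both omitted here): the k past forecasts closest to the new forecast under per-predictor weights are found, and their verifying observations form the ensemble. All data and weights below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

n_hist = 1000
hist_forecasts = rng.uniform(0, 15, size=(n_hist, 3))      # e.g., wind speed plus two other predictors
truth = hist_forecasts[:, 0] + rng.normal(0, 1.0, n_hist)  # verifying observations

weights = np.array([1.0, 0.5, 0.5])   # per-predictor weights (the work above optimises these)

def analog_ensemble(new_forecast, k=20):
    # Weighted Euclidean distance between the new forecast and every past forecast.
    d = np.sqrt((((hist_forecasts - new_forecast) * weights) ** 2).sum(axis=1))
    best = np.argsort(d)[:k]          # indices of the k closest analogs
    return truth[best]                # their observations form the ensemble

ens = analog_ensemble(np.array([8.0, 4.0, 2.0]))
print(f"forecast median {np.median(ens):.2f}, 90% band "
      f"[{np.percentile(ens, 5):.2f}, {np.percentile(ens, 95):.2f}]")
```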

  2. MAXIMIZING THE BENEFITS OF ERP SYSTEMS

    Directory of Open Access Journals (Sweden)

    Paulo André da Conceição Menezes

    2010-04-01

    Full Text Available The ERP (Enterprise Resource Planning) systems have been consolidated in companies with different sizes and sectors, allowing their real benefits to be definitively evaluated. In this study, several interactions have been studied in different phases, such as the strategic priorities and strategic planning defined as ERP Strategy; business processes review and the ERP selection in the pre-implementation phase, the project management and ERP adaptation in the implementation phase, as well as the ERP revision and integration efforts in the post-implementation phase. Through rigorous use of case study methodology, this research led to developing and to testing a framework for maximizing the benefits of the ERP systems, and seeks to contribute for the generation of ERP initiatives to optimize their performance.

  4. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria

  5. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Editi

  6. Internalism as Methodology

    Directory of Open Access Journals (Sweden)

    Terje Lohndal

    2009-10-01

    Full Text Available This paper scrutinizes the recent proposal made by Lassiter (2008) that the dichotomy between Chomskyan internalism and Dummett-type externalism is misguided and should be overcome by an approach that incorporates sociolinguistic concepts such as speakers' dispositions to defer. We argue that Lassiter's arguments are flawed and based on a serious misunderstanding of the internalist approach to the study of natural language, failing to appreciate its methodological nature, and we conclude that Lassiter's sociolinguistic approach is just another instance of externalist attempts with little hope of scientific achievement.

  7. The New Methodology

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In the past few years there's been a rapidly growing interest in “lightweight” methodologies. Alternatively characterized as an antidote to bureaucracy or a license to hack, they've stirred up interest all over the software landscape. In this essay I explore the reasons for lightweight methods, focusing not so much on their weight but on their adaptive nature and their people-first orientation. I also give a summary and references to the processes in this school and consider the factors that should influence your choice of whether to go down this newly trodden path.

  8. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
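
    Of the methods listed, the cheapest screening option, OAT (one at a time), is easy to sketch: each parameter is perturbed around a baseline while the others are held fixed, and the output change is recorded before investing in Sobol' or MCMC analyses. The toy model, parameter names and ranges below are assumptions for illustration only.

```python
import numpy as np

# Hypothetical one-at-a-time (OAT) screening for an environmental-style model:
# perturb each parameter by +/-10% from a baseline and record the output change.
def model(params):
    k_decay, porosity, recharge = params        # toy groundwater-flavoured model
    return recharge * np.exp(-k_decay / max(porosity, 1e-9))

baseline = np.array([0.2, 0.3, 100.0])
names = ["k_decay", "porosity", "recharge"]
y0 = model(baseline)

for i, name in enumerate(names):
    for rel in (-0.1, +0.1):
        p = baseline.copy()
        p[i] *= 1 + rel
        print(f"{name:9s} {rel:+.0%}: output change {model(p) - y0:+.2f}")
```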

  9. Forest Carbon Leakage Quantification Methods and Their Suitability for Assessing Leakage in REDD

    Directory of Open Access Journals (Sweden)

    Sabine Henders

    2012-01-01

    Full Text Available This paper assesses quantification methods for carbon leakage from forestry activities for their suitability in leakage accounting in a future Reducing Emissions from Deforestation and Forest Degradation (REDD mechanism. To that end, we first conducted a literature review to identify specific pre-requisites for leakage assessment in REDD. We then analyzed a total of 34 quantification methods for leakage emissions from the Clean Development Mechanism (CDM, the Verified Carbon Standard (VCS, the Climate Action Reserve (CAR, the CarbonFix Standard (CFS, and from scientific literature sources. We screened these methods for the leakage aspects they address in terms of leakage type, tools used for quantification and the geographical scale covered. Results show that leakage methods can be grouped into nine main methodological approaches, six of which could fulfill the recommended REDD leakage requirements if approaches for primary and secondary leakage are combined. The majority of methods assessed, address either primary or secondary leakage; the former mostly on a local or regional and the latter on national scale. The VCS is found to be the only carbon accounting standard at present to fulfill all leakage quantification requisites in REDD. However, a lack of accounting methods was identified for international leakage, which was addressed by only two methods, both from scientific literature.

  10. Simple and accurate quantification of quantum dots via single-particle counting.

    Science.gov (United States)

    Zhang, Chun-yang; Johnson, Lawrence W

    2008-03-26

    Quantification of quantum dots (QDs) is essential to the quality control of QD synthesis, development of QD-based LEDs and lasers, functionalizing of QDs with biomolecules, and engineering of QDs for biological applications. However, simple and accurate quantification of QD concentration in a variety of buffer solutions and in complex mixtures still remains a critical technological challenge. Here, we introduce a new methodology for quantification of QDs via single-particle counting, which is conceptually different from established UV-vis absorption and fluorescence spectrum techniques where large amounts of purified QDs are needed and specific absorption coefficient or quantum yield values are necessary for measurements. We demonstrate that single-particle counting allows us to nondiscriminately quantify different kinds of QDs by their distinct fluorescence burst counts in a variety of buffer solutions regardless of their composition, structure, and surface modifications, and without the necessity of absorption coefficient and quantum yield values. This single-particle counting can also unambiguously quantify individual QDs in a complex mixture, which is practically impossible for both UV-vis absorption and fluorescence spectrum measurements. Importantly, the application of this single-particle counting is not just limited to QDs but also can be extended to fluorescent microspheres, quantum dot-based microbeads, and fluorescent nano rods, some of which currently lack efficient quantification methods.

  11. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    Bachelor thesis "Scrum methodology in banking environment" is focused on one of agile methodologies called Scrum and description of the methodology used in banking environment. Its main goal is to introduce the Scrum methodology and outline a real project placed in a bank focused on software development through a case study, address problems of the project, propose solutions of the addressed problems and identify anomalies of Scrum in software development constrained by the banking environmen...

  12. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.
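
    A one-dimensional toy universe makes the framework concrete: reality is manufactured as a quadratic law, "experiments" are noisy samples of it, the deliberately imperfect model is linear with uncertain coefficients, and the assessment asks whether the model's claimed 95% prediction intervals actually cover the manufactured truth at new inputs. This sketch is an assumption-laden illustration, not the paper's particle-transport universe or its GP/Bayesian MARS machinery.

```python
import numpy as np

rng = np.random.default_rng(11)

def reality(x):
    return 1.0 + 0.5 * x + 0.08 * x**2             # the manufactured truth

x_cal = np.linspace(0, 5, 20)
y_cal = reality(x_cal) + rng.normal(0, 0.1, x_cal.size)   # "experiments"

# Calibrate the imperfect linear model by ordinary least squares.
A = np.column_stack([np.ones_like(x_cal), x_cal])
theta, res, *_ = np.linalg.lstsq(A, y_cal, rcond=None)
sigma2 = res[0] / (x_cal.size - 2)
cov = sigma2 * np.linalg.inv(A.T @ A)               # parameter covariance

# Predict "new experiments" at extrapolated inputs and check interval coverage.
x_new = np.linspace(6, 10, 50)
A_new = np.column_stack([np.ones_like(x_new), x_new])
mean = A_new @ theta
sd = np.sqrt(np.sum(A_new @ cov * A_new, axis=1) + sigma2)
covered = np.abs(reality(x_new) - mean) < 1.96 * sd
print(f"claimed 95% coverage, actual {covered.mean():.0%}")   # far below 95%
```

    The deliberately unmodelled quadratic term makes the linear model's intervals badly overconfident under extrapolation, which is exactly the kind of failure the framework is designed to expose.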

  13. Uncertainty quantification for large-scale ocean circulation predictions.

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik

    2010-09-01

    Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in the presence of limited data that have discontinuous character. Our approach is two-fold. First we detect the discontinuity location with a Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve location in presence of arbitrarily distributed input parameter values. Furthermore, we developed a spectral approach that relies on Polynomial Chaos (PC) expansions on each side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification and propagation. The methodology is tested on synthetic examples of discontinuous data with adjustable sharpness and structure.
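
    A one-dimensional sketch of the two-step logic (the paper works in two uncertain parameters with Bayesian discontinuity inference; here a simple profile search over candidate split points stands in): first locate the split that lets piecewise polynomial surrogates fit the sparse data best, then fit one surrogate per side. The data-generating jump below is an invented example.

```python
import numpy as np

rng = np.random.default_rng(13)

x = np.sort(rng.uniform(0, 1, 40))
y = np.where(x < 0.62, 1 + 2 * x, 5 - x) + rng.normal(0, 0.05, x.size)  # jump at 0.62

def split_error(s):
    err = 0.0
    for m in (x < s, x >= s):
        if m.sum() < 4:                       # need enough points for a quadratic
            return np.inf
        coef = np.polyfit(x[m], y[m], 2)
        err += np.sum((np.polyval(coef, x[m]) - y[m]) ** 2)
    return err

candidates = np.linspace(0.2, 0.8, 61)
s_best = candidates[np.argmin([split_error(s) for s in candidates])]
print(f"estimated discontinuity location: {s_best:.2f}")   # near 0.62

# Step (2): one polynomial surrogate per side, usable for cheap UQ sampling.
surrogates = {side: np.polyfit(x[m], y[m], 2)
              for side, m in (("left", x < s_best), ("right", x >= s_best))}
```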

  14. Electromagnetomechanical elastodynamic model for Lamb wave damage quantification in composites

    Science.gov (United States)

    Borkowski, Luke; Chattopadhyay, Aditi

    2014-03-01

    Physics-based wave propagation computational models play a key role in structural health monitoring (SHM) and the development of improved damage quantification methodologies. Guided waves (GWs), such as Lamb waves, provide the capability to monitor large plate-like aerospace structures with limited actuators and sensors and are sensitive to small scale damage; however due to the complex nature of GWs, accurate and efficient computation tools are necessary to investigate the mechanisms responsible for dispersion, coupling, and interaction with damage. In this paper, the local interaction simulation approach (LISA) coupled with the sharp interface model (SIM) solution methodology is used to solve the fully coupled electro-magneto-mechanical elastodynamic equations for the piezoelectric and piezomagnetic actuation and sensing of GWs in fiber reinforced composite material systems. The final framework provides the full three-dimensional displacement as well as electrical and magnetic potential fields for arbitrary plate and transducer geometries and excitation waveform and frequency. The model is validated experimentally and proven computationally efficient for a laminated composite plate. Studies are performed with surface bonded piezoelectric and embedded piezomagnetic sensors to gain insight into the physics of experimental techniques used for SHM. The symmetric collocation of piezoelectric actuators is modeled to demonstrate mode suppression in laminated composites for the purpose of damage detection. The effect of delamination and damage (i.e., matrix cracking) on the GW propagation is demonstrated and quantified. The developed model provides a valuable tool for the improvement of SHM techniques due to its proven accuracy and computational efficiency.

  15. Cytokine mRNA quantification by real-time PCR.

    Science.gov (United States)

    Stordeur, Patrick; Poulin, Lionel F; Craciun, Ligia; Zhou, Ling; Schandené, Liliane; de Lavareille, Aurore; Goriely, Stanislas; Goldman, Michel

    2002-01-01

    Real-time PCR represents a methodology that accurately quantifies nucleic acids. This has been made possible by the use of fluorogenic probes, which come in two forms, namely hydrolysis probes (also called TaqMan probes) and hybridisation probes. We applied this methodology to cytokine mRNA quantification, which led us to develop a protocol that provides an easy way to rapidly develop and perform real-time PCR on a LightCycler instrument. It was made possible by the use of freely available software that permits the choice of both the hydrolysis probe and the primers. We first demonstrated that the reproducibility of the method using hydrolysis probes compares favourably with that obtained with hybridisation probes. We then applied the technique to determine the kinetics of IL-1ra, IL-1beta, IL-5, IL-13, TNF-alpha and IFN-gamma induction upon stimulation of human peripheral blood mononuclear cells (PBMC) by phytohaemagglutinin (PHA). Finally, the method was also used successfully to demonstrate that IFN-alpha induces IL-10 mRNA accumulation in human monocytes.
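
    The abstract does not spell out the quantification arithmetic, but relative quantification in real-time PCR is commonly computed with the 2^(-ΔΔCt) method; the sketch below shows that calculation under this assumption, with made-up Ct values and gene roles.

    ```python
    # Hedged sketch of the standard 2^(-delta-delta-Ct) relative-quantification
    # arithmetic; the paper does not state which calculation it used, so the
    # numbers below are purely illustrative.
    def relative_expression(ct_target, ct_reference, ct_target_cal, ct_reference_cal):
        """Fold change of a target gene vs. a calibrator sample,
        normalized to a reference (housekeeping) gene."""
        delta_ct_sample = ct_target - ct_reference
        delta_ct_calibrator = ct_target_cal - ct_reference_cal
        return 2.0 ** -(delta_ct_sample - delta_ct_calibrator)

    # Example: IL-10 mRNA in stimulated vs. unstimulated monocytes (made-up Cts).
    print(relative_expression(24.1, 18.0, 28.3, 18.2))   # ~16-fold induction
    ```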

  16. 78 FR 66653 - Patient Protection and Affordable Care Act; HHS Notice of Benefit and Payment Parameters for 2014...

    Science.gov (United States)

    2013-11-06

    ..., and our proposal of a specific methodology for the 2014 benefit year. As in the case of the other plan... 2014 benefit year. As in the case of the other plan variations, we plan to review the methodology for... regarding determining employer size for purposes of participation in the Small Business Health Option...

  17. Developing benefits management

    DEFF Research Database (Denmark)

    Laursen, Markus

    An old quote goes "Rome wasn't built in a day", which is similar to the practices comprehended by benefits management (BM) in today's organizations; they mature as organizations improve practices (Ward & Daniel, 2012). The implication is that many organizations are still not realizing, or are unsure if they do realize, the value of their IT investments (Bradley, 2010; Hunter & Westerman, 2009; Legris & Collerette, 2006), although the relationship between IT and business performance has been known for many years (Melville et al., 2004; Kohli & Grover, 2008). A starting point for any organization is thus…

  18. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    Science.gov (United States)

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware is introduced that enables the application of this methodology to selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied to peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product-purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities.
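
    As a hedged illustration of the underlying calibration step, the sketch below trains a partial least squares model on synthetic "spectra" built from hypothetical pure-component absorbance profiles and then predicts the composition of a new co-eluting mixture; it assumes scikit-learn and is not the authors' Matlab® implementation.

    ```python
    # PLS calibration on synthetic spectra, then prediction for a new mixture.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    wavelengths = np.linspace(240, 300, 61)

    # Pure-component spectra for three hypothetical co-eluting proteins.
    pure = np.stack([np.exp(-((wavelengths - mu) / 12.0) ** 2)
                     for mu in (255, 270, 285)])

    # Calibration set: random concentration mixtures plus detector noise.
    conc = rng.uniform(0, 2, (50, 3))                 # g/L, per protein
    spectra = conc @ pure + rng.normal(0, 0.01, (50, wavelengths.size))

    pls = PLSRegression(n_components=3)
    pls.fit(spectra, conc)

    # "Inline" prediction for a new co-eluting mixture.
    mix = np.array([[0.5, 1.2, 0.3]]) @ pure
    print(pls.predict(mix))                           # ~ [[0.5, 1.2, 0.3]]
    ```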

  19. Requirements and benefits of flow forecasting for improving hydropower generation

    NARCIS (Netherlands)

    Dong, Xiaohua; Vrijling, J.K.; Dohmen-Janssen, Catarine M.; Ruigh, E.; Booij, Martijn J.; Stalenberg, B.; Hulscher, Suzanne J.M.H.; van Gelder, P.H.A.J.M.; Verlaan, M.; Zijderveld, A.; Waarts, P.

    2005-01-01

    This paper presents a methodology to identify the required lead time and accuracy of flow forecasting for improving the hydropower generation of a reservoir, by simulating the benefits (in terms of electricity generated) obtained from the forecasting with varying lead times and accuracies.

  20. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  1. Separation and quantification of microalgal carbohydrates.

    Science.gov (United States)

    Templeton, David W; Quinn, Matthew; Van Wychen, Stefanie; Hyman, Deborah; Laurens, Lieve M L

    2012-12-28

    Structural carbohydrates can constitute a large fraction of the dry weight of algal biomass, and thus accurate identification and quantification are important for summative mass closure. Two limitations to the accurate characterization of microalgal carbohydrates are the lack of a robust analytical procedure to hydrolyze polymeric carbohydrates to their respective monomers and the subsequent identification and quantification of those monosaccharides. We address the second limitation, chromatographic separation of monosaccharides, here by identifying optimum conditions for the resolution of a synthetic mixture of 13 microalgae-specific monosaccharides, comprising 8 neutral sugars, 2 amino sugars, 2 uronic acids and 1 alditol (myo-inositol as an internal standard). The synthetic 13-carbohydrate mix showed incomplete resolution across 11 traditional high performance liquid chromatography (HPLC) methods, but showed improved resolution and accurate quantification using anion exchange chromatography (HPAEC) as well as alditol acetate derivatization followed by gas chromatography (for the neutral and amino sugars only). We demonstrate the application of monosaccharide quantification using optimized chromatography conditions after sulfuric acid analytical hydrolysis for three model algae strains, and compare the quantification and complexity of monosaccharides in analytical hydrolysates relative to a typical terrestrial feedstock, sugarcane bagasse.

  2. Health benefits of cocoa.

    Science.gov (United States)

    Latif, Rabia

    2013-11-01

    In modern society, cocoa is eaten as a confectionery, contrary to its medicinal use in the past. Over the last decade, however, there has been a revival of interest in cocoa's beneficial health effects, and progress has been made at the molecular level. This review discusses recent progress on the potential health benefits of cocoa and/or its derivatives, with a focus on areas that have received little attention so far, such as the role of cocoa in immune regulation, inflammation, neuroprotection, oxidative stress, obesity, and diabetes control. Thanks to advances in analytical technologies, cocoa's metabolic pathways have now been properly mapped, providing essential information on its roles. Cocoa helps in weight loss by improving mitochondrial biogenesis. It increases muscle glucose uptake by inserting glucose transporter 4 into the skeletal muscle membrane. Because of its antioxidant properties, cocoa offers neuron protection and enhances cognition and positive mood. It lowers immunoglobulin E release in allergic responses. It can affect the immune response and bacterial growth at the intestinal level. It reduces inflammation by inhibiting nuclear factor-κB. In view of the pleiotropic health benefits of cocoa, it may have the potential to be used for the prevention or treatment of allergies, cancers, oxidative injuries, inflammatory conditions, anxiety, hyperglycemia, and insulin resistance.

  3. A Framework for Identifying and Understanding Enterprise Systems Benefits

    DEFF Research Database (Denmark)

    Schubert, Petra; Williams, Susan P.

    2011-01-01

    Purpose – Identifying the benefits arising from implementations of enterprise systems and realizing business value remains a significant challenge for both research and industry. This paper aims to consolidate previous work. It presents a framework for investigating enterprise systems benefits and business change, which addresses the identified limitations of previous research and provides a more detailed analysis of benefits and their contextual variation. Design/methodology/approach – Drawing on data gathered from 31 real-world organizations (case studies) of differing size, maturity, and industry sector, the study adopts an iterative content analysis to empirically derive a comprehensive benefits framework. Findings – The content analysis provides a detailed classification of expectations and benefits, which is described in a four-level framework. The four levels (areas) are further subdivided…

  4. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is placed on the development of rapid data-collection techniques; the difference between consumers and trained panels is becoming blurred, and the role of sensory methodologists is to prepare evaluation procedures by which a lay panel (consumers) can achieve results identical to those of a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanisms are easily understandable and whose results are easily interpretable. This paper deals with the mapping of marginal methods used in the sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  5. Albert Einstein's Methodology

    CERN Document Server

    Weinstein, Galina

    2012-01-01

    This paper discusses Einstein's methodology. 1. Einstein characterized his work as a theory of principle and reasoned that, beyond kinematics, the 1905 heuristic relativity principle could offer new connections between non-kinematical concepts. 2. Einstein's creativity and inventiveness and his process of thinking; invention or discovery. 3. Einstein considered his best friend Michele Besso a sounding board and his classmate from the Polytechnic, Marcel Grossmann, his active partner. Yet Einstein wrote to Arnold Sommerfeld that Grossmann would never claim to be considered a co-discoverer of the Einstein-Grossmann theory; he only helped in guiding Einstein through the mathematical literature and contributed nothing of substance to the results of the theory. Hence, Einstein considered neither Besso nor Grossmann a co-discoverer of the relativity theory, which he invented.

  6. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  7. Social Security's special minimum benefit.

    Science.gov (United States)

    Olsen, K A; Hoffmeyer, D

    Social Security's special minimum primary insurance amount (PIA) provision was enacted in 1972 to increase the adequacy of benefits for regular long-term, low-earning covered workers and their dependents or survivors. At the time, Social Security also had a regular minimum benefit provision for persons with low lifetime average earnings and their families. Concerns were rising that the low lifetime average earnings of many regular minimum beneficiaries resulted from sporadic attachment to the covered workforce rather than from low wages. The special minimum benefit was seen as a way to reward regular, low-earning workers without providing the windfalls that would have resulted from raising the regular minimum benefit to a much higher level. The regular minimum benefit was subsequently eliminated for workers reaching age 62, becoming disabled, or dying after 1981. Under current law, the special minimum benefit will phase out over time, although it is not clear from the legislative history that this was Congress's explicit intent. The phaseout results from two factors: (1) special minimum benefits are paid only if they are higher than benefits payable under the regular PIA formula, and (2) the value of the regular PIA formula, which is indexed to wages before benefit eligibility, has increased faster than that of the special minimum PIA, which is indexed to inflation. Under the Social Security Trustees' 2000 intermediate assumptions, the special minimum benefit will cease to be payable to retired workers attaining eligibility in 2013 and later. Their benefits will always be larger under the regular benefit formula. As policymakers consider Social Security solvency initiatives--particularly proposals that would reduce benefits or introduce investment risk--interest may increase in restoring some type of special minimum benefit as a targeted protection for long-term low earners. Two of the three reform proposals offered by the President's Commission to Strengthen

  8. Quantification of radiation-induced lung damage with CT scans - The possible benefit for radiogenomics

    Energy Technology Data Exchange (ETDEWEB)

    De Ruysscher, Dirk [Radiation Oncology, Univ. Hospitals Leuven/KU Leuven, Leuven (Belgium); Dept. of Radiation Oncology (Maastro clinic), Maastricht Univ. Medical Center, Maastricht (Netherlands)], e-mail: dirk.deruysscher@uzleuven.be; Sharifi, Hoda [Dept. of Radiation Oncology (Maastro clinic), Maastricht Univ. Medical Center, Maastricht (Netherlands); Defraene, Gilles [Radiation Oncology, Univ. Hospitals Leuven/KU Leuven, Leuven (Belgium)] [and others

    2013-10-15

    Background: Radiation-induced lung damage (RILD) is an important problem. Although physical parameters such as the mean lung dose are used in clinical practice, they are not suited for individualised radiotherapy. Objective, quantitative measurements of RILD on a continuous instead of an ordinal, semi-quantitative, semi-subjective scale are needed. Methods: Hounsfield unit (HU) changes before versus three months post-radiotherapy were correlated per voxel with the radiotherapy dose in 95 lung cancer patients. Deformable registration was used to register pre- and post-CT scans, and the density increase was quantified for various dose bins. The dose-response curve for increased HU was quantified using the slope of a linear regression (HU/Gy). The end-point for the toxicity analysis was dyspnoea ≥ grade 2. Results: Radiation dose was linearly correlated with the change in HU (mean R² = 0.74 ± 0.28). No differences in HU/Gy were observed between groups treated with stereotactic radiotherapy, conventional radiotherapy alone, or sequential or concurrent chemo-radiotherapy. In the whole patient group, 33/95 (34.7%) had dyspnoea ≥ G2. Of the 48 patients with a HU/Gy below the median, 16 (33.3%) developed dyspnoea ≥ G2, while in the 47 patients with a HU/Gy above the median, 17 (36.1%) had dyspnoea ≥ G2 (not significant). Individual patients showed a nearly 21-fold difference in radiosensitivity, with HU/Gy ranging from 0 to 10 HU/Gy. Conclusions: HU changes objectively identify the whole range of individual radiosensitivity on a continuous, quantitative scale. CT density changes may allow more robust and accurate radiogenomics studies.
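
    The dose-response slope described above is a plain linear regression of CT density change against local dose. A sketch with synthetic voxel data (all numbers invented, standing in for registered pre/post scans):

    ```python
    # Per-patient HU/Gy slope: bin voxels by dose, regress density change on dose.
    import numpy as np

    rng = np.random.default_rng(3)
    dose_gy = rng.uniform(0, 70, 10000)              # voxel dose (Gy)
    true_slope = 1.8                                  # hypothetical HU per Gy
    delta_hu = true_slope * dose_gy + rng.normal(0, 25, dose_gy.size)

    # Bin voxels by dose and regress mean density change on dose-bin centers.
    bins = np.arange(0, 75, 5)
    centers = 0.5 * (bins[:-1] + bins[1:])
    mean_hu = [delta_hu[(dose_gy >= lo) & (dose_gy < hi)].mean()
               for lo, hi in zip(bins[:-1], bins[1:])]
    slope, intercept = np.polyfit(centers, mean_hu, 1)
    print(f"dose-response: {slope:.2f} HU/Gy")
    ```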

  9. CONSIDERATIONS REGARDING THE QUANTIFICATION OF THE BENEFITS OF A CLEAN AND HEALTHY ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    PAUL-BOGDAN ZAMFIR

    2012-09-01

    Practice and economic theory reveal relationships of dependence between the degree of reduction of pollutant residues, on the one hand, and the cost, as well as the total positive effects to be obtained from controlling and reducing pollution, on the other hand. Thus, from an ecological point of view, an action may be defined as economically efficient not only where it ensures achievement of the proposed objectives at minimum cost, but also where it ensures at least the preservation of the quality of the natural environment. For the environmental-quality protection program drawn up by enterprises, and included in their development strategy, to be operational, it must include a series of indicators such as: the permissible level of pollution of the environment with different substances, acceptable levels of contamination of the enterprise's products, the volume of expenditure involved in taking measures for the conservation and protection of the environment, the modality of including in the production cost the expenses related to protecting the natural environment, etc.

  10. Enlisting Ecosystem Benefits: Quantification and Valuation of Ecosystem Services to Inform Installation Management

    Science.gov (United States)

    2015-05-27

    …stewardship, sustaining multiple uses of lands (e.g., resource conservation, wildlife protection, recreation, and energy development), evaluating… Restoration carried out by clear-cutting loblolly pine plantations yields a sharp loss of carbon across the installation in these two scenarios, ~5 years from the baseline. The LEF scenario shows a large, positive impact on carbon sequestration and greater…

  11. Uncertainty quantification methodologies development for stress corrosion cracking of canister welds

    Energy Technology Data Exchange (ETDEWEB)

    Dingreville, Remi Philippe Michel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bryan, Charles R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-30

    This letter report presents a probabilistic performance assessment model to evaluate the probability of canister failure (through-wall penetration) by SCC. The model first assesses whether environmental conditions for SCC – the presence of an aqueous film – are present at canister weld locations (where tensile stresses are likely to occur) on the canister surface. Geometry-specific storage system thermal models and weather data sets representative of U.S. spent nuclear fuel (SNF) storage sites are implemented to evaluate location-specific canister surface temperature and relative humidity (RH). As the canister cools and aqueous conditions become possible, the occurrence of corrosion is evaluated. Corrosion is modeled as a two-step process: first, pitting is initiated, and the extent and depth of pitting is a function of the chloride surface load and the environmental conditions (temperature and RH). Second, as corrosion penetration increases, the pit eventually transitions to a SCC crack, with crack initiation becoming more likely with increasing pit depth. Once pits convert to cracks, a crack growth model is implemented. The SCC growth model includes rate dependencies on both temperature and crack tip stress intensity factor, and crack growth only occurs in time steps when aqueous conditions are predicted. The model suggests that SCC is likely to occur over potential SNF interim storage intervals; however, this result is based on many modeling assumptions. Sensitivity analyses provide information on the model assumptions and parameter values that have the greatest impact on predicted storage canister performance, and provide guidance for further research to reduce uncertainties.
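
    The two-step corrosion structure (pit initiation and growth during aqueous periods, then a pit-to-crack transition and crack growth) can be caricatured with a Monte Carlo loop. Every rate, threshold, and distribution below is invented for illustration and does not reflect the report's calibrated values:

    ```python
    # Deliberately simplified Monte Carlo sketch of the two-step SCC structure.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 10000                                   # sampled weld locations
    wall_mm = 15.0                              # canister wall thickness (invented)

    pit_depth = np.zeros(n)
    crack_depth = np.zeros(n)
    failed = np.zeros(n, dtype=bool)

    for year in range(100):
        aqueous = rng.random(n) < 0.3           # chance of a brine film this year
        # Step 1: pitting during aqueous periods (invented mm/yr rate).
        pit_depth += np.where(aqueous, rng.exponential(0.1, n), 0.0)
        # Step 2: pits deep enough transition to cracks, which grow faster.
        cracked = pit_depth > 0.5
        crack_depth += np.where(cracked & aqueous, rng.exponential(1.5, n), 0.0)
        failed |= crack_depth > wall_mm

    print(f"P(through-wall by 100 yr) ~ {failed.mean():.3f}")
    ```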

  12. A continuous Bayesian network for earth dams' risk assessment: methodology and quantification

    NARCIS (Netherlands)

    Morales-Napoles, O.; Delgado-Hernadez-D.J.; De-Leon-Escobedo, D.; Arteaga-Arcos, J.C.

    2013-01-01

    Dams’ safety is highly important for authorities around the world. The impacts of a dam failure can be enormous. Models for investigating dam safety are required for helping decision-makers to mitigate the possible adverse consequences of flooding. A model for earth dam safety must specify clearly p

  13. Quantification and Management of Manifest Occlusal Caries Lesions in Adults: A Methodological and a Clinical Study

    DEFF Research Database (Denmark)

    Bakhshandeh, Azam

    2010-01-01

    …resin sealants on medium-deep or deep dentinal lesions; and only a few studies examine the effect of resin sealing of lesions in adult patients. The progression of dentinal caries lesions is evaluated radiographically, based on scores of lesion depth by paired comparison of X-rays, or by subtractions… teeth with primary occlusal lesions. Randomization was performed in cases of more than one lesion in the same patient, so that the final material consisted of 60 resin-sealed and 12 restored lesions. After 2-3 years, there was a drop-out of 15%; 2 patients did not show up for the control and 9… tooth was color-dyed with Caries Detector®, and digitally photographed. The maximum width and the deepest extent of the lesions, the enamel-dentin junction and the enamel-cement junction were marked, and their relative dentinal depth and width were measured in the 2 histological images before and after dyeing…

  14. Application of the RPN methodology for quantification of the operability of the quadruple-tank process

    Directory of Open Access Journals (Sweden)

    J.O. Trierweiler

    2002-04-01

    The RPN indicates how potentially difficult it is for a given system to achieve the desired performance robustly. It reflects both the attainable performance of a system and its degree of directionality. Two new indices, the RPN ratio and the RPN difference, are introduced to quantify how realizable a given desired performance can be. The predictions made by the RPN are verified by closed-loop simulations. These indices are applied to quantify the IO-controllability of the quadruple-tank process.

  16. New approach for the quantification of processed animal proteins in feed using light microscopy.

    Science.gov (United States)

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  17. Considerations for quantification of lipids in nerve tissue using MALDI mass spectrometric imaging

    Science.gov (United States)

    Landgraf, Rachelle R.; Garrett, Timothy J.; Prieto Conaway, Maria C.; Calcutt, Nigel A.; Stacpoole, Peter W.; Yost, Richard A.

    2013-01-01

    MALDI mass spectrometric imaging is a technique that provides the ability to identify and characterize endogenous and exogenous compounds spatially within tissue with relatively little sample preparation. While it is a proven methodology for qualitative analysis, little has been reported for its utility in quantitative measurements. In the current work, inherent challenges in MALDI quantification are addressed. Signal response is monitored over successive analyses of a single tissue section to minimize error due to variability in the laser, matrix application, and sample inhomogeneity. Methods for the application of an internal standard to tissue sections are evaluated and used to quantify endogenous lipids in nerve tissue. A precision of 5% or less standard error was achieved, illustrating that MALDI imaging offers a reliable means of in situ quantification for microgram-sized samples and requires minimal sample preparation. PMID:21953974

  18. University Benefits Survey. Part 1 (All Benefits Excluding Pensions).

    Science.gov (United States)

    University of Western Ontario, London.

    Results of a 1983 survey of benefits, excluding pensions, for 17 Ontario, Canada, universities are presented. Information is provided on the following areas: whether the university self-administers insurance plans, communication of benefits, proposed changes in benefits, provision of life and dismemberment insurance, maternity leave policy,…

  19. Consistent quantification of climate impacts due to biogenic carbon storage across a range of bio-product systems

    Energy Technology Data Exchange (ETDEWEB)

    Guest, Geoffrey, E-mail: geoffrey.guest@ntnu.no; Bright, Ryan M., E-mail: ryan.m.bright@ntnu.no; Cherubini, Francesco, E-mail: francesco.cherubini@ntnu.no; Strømman, Anders H., E-mail: anders.hammer.stromman@ntnu.no

    2013-11-15

    Temporary and permanent carbon storage from biogenic sources is seen as a way to mitigate climate change. The aim of this work is to illustrate the need to harmonize the quantification of such mitigation across all possible storage pools in the bio- and anthroposphere. We investigate nine alternative storage cases and a wide array of bio-resource pools: from annual crops, short rotation woody crops, and medium rotation temperate forests, to long rotation boreal forests. For each feedstock type and biogenic carbon storage pool, we quantify the carbon cycle climate impact due to the skewed time distribution between emission and sequestration fluxes in the bio- and anthroposphere. Additional consideration of the climate impact from albedo changes in forests is also illustrated for the boreal forest case. When characterizing climate impact with global warming potentials (GWP), we find a large variance in results, which is attributed to different combinations of biomass storage and feedstock systems. The storage of biogenic carbon in any storage pool does not always confer climate benefits: even when biogenic carbon is stored long-term in durable product pools, the climate outcome may still be undesirable when the carbon is sourced from slow-growing biomass feedstock. For example, when biogenic carbon from Norway spruce from Norway is stored in furniture with a mean lifetime of 43 years, a climate change impact of 0.08 kg CO2eq per kg CO2 stored (100 year time horizon (TH)) would result. It was also found that when biogenic carbon is stored in a pool with negligible leakage to the atmosphere, the resulting GWP factor is not necessarily −1 kg CO2eq per kg CO2 stored. As an example, when biogenic CO2 from Norway spruce biomass is stored in geological reservoirs with no leakage, we estimate a GWP of −0.56 kg CO2eq per kg CO2 stored (100 year TH) when albedo effects are also included. The large variance in GWPs across the range of…

  20. Atomic bomb health benefits.

    Science.gov (United States)

    Luckey, T D

    2008-01-01

    Media reports of deaths and devastation produced by atomic bombs convinced people around the world that all ionizing radiation is harmful. This concentrated attention on fear of minuscule doses of radiation. Soon the linear no threshold (LNT) paradigm was converted into laws. Scientifically valid information about the health benefits of low dose irradiation was ignored. Here are studies which show increased health in Japanese survivors of atomic bombs. Parameters include decreased mutation, leukemia and solid tissue cancer mortality rates, and increased average lifespan. Each study exhibits a threshold that repudiates the LNT dogma. The average threshold for acute exposures to atomic bombs is about 100 cSv. Conclusions from these studies of atomic bomb survivors are: (1) one burst of low dose irradiation elicits a lifetime of improved health; (2) improved health from low dose irradiation negates the LNT paradigm; (3) effective triage should include radiation hormesis for survivor treatment.

  1. NASA Benefits Earth

    Science.gov (United States)

    Robinson, Julie A.

    2009-01-01

    This slide presentation reviews several ways in which NASA research has benefited Earth and made life on Earth better. These innovations include: solar panels, recycled pavement, the thermometer pill, invisible braces for straightening teeth, LASIK, aerodynamic helmets and tires for bicycles, cataract detection, technology used to remove anthrax spores from mail handling facilities, studies of atomic oxygen erosion of materials that have informed the restoration of artwork, macroencapsulation (a potential mechanism to deliver anticancer drugs to specific sites), and research on a salmonella vaccine. With research on the International Space Station just beginning, there will be opportunities for entrepreneurs and other government agencies to access space for their research and development, while NASA continues its own research on human health and technology development.

  3. Making benefit transfers work

    DEFF Research Database (Denmark)

    Bateman, I. J.; Brouwer, R.; Ferrini, S.

    2011-01-01

    We develop and test guidance principles for benefits transfers. These argue that when transferring across relatively similar sites, simple mean value transfers are to be preferred, but that when sites are relatively dissimilar, value function transfers will yield lower errors. The paper also provides guidance on the appropriate specification of transferable value functions, arguing that these should be developed from theoretical rather than ad-hoc statistical principles. These principles are tested via a common format valuation study of water quality improvements across five countries. Results support our various hypotheses, providing a set of principles for future transfer studies. The application also considers new ways of incorporating distance decay, substitution and framing effects within transfers and presents a novel water quality ladder.

  4. NASA Technology Benefits Orthotics

    Science.gov (United States)

    Myers, Neill; Shadoan, Michael

    1998-01-01

    Engineers at NASA's Marshall Space Flight Center (MSFC) in Huntsville, Alabama, have designed a knee brace to aid in the rehabilitation of medical patients. The device, called the Selectively Lockable Knee Brace, was designed for knee injury and stroke patients but may potentially serve in many more patient applications. Individuals with sports-related injuries, spinal cord injuries, and birth defects such as spina bifida may also benefit from the device. The Selectively Lockable Knee Brace is designed to provide secure support to the patient when weight is applied to the leg; however, when the leg is not supporting weight, the device allows free motion of the knee joint. Braces currently on the market either lock the knee in a rigid, straight or bent position or, by manually pulling a pin, allow continuous free joint motion.

  5. China Benefits from FDI

    Institute of Scientific and Technical Information of China (English)

    华民

    2007-01-01

    While China's opening up policy has brought about rapid economic growth, it has also resulted in a certain loss of welfare. Many scholars have debated this issue extensively from different perspectives. An article entitled "An Open Economy Calls for New Development Theories" by Zhang Youwen (published in the Sept. 2006 issue of China Economist) proposed a "new approach to opening up" – a reflection of the views held by some Chinese scholars. Disagreeing with these views, the author of this article believes that China should give more consideration to her resource endowment and economic growth stages and evaluate scientifically the benefits of "opening up".

  6. State of the art in benefit-risk analysis: medicines.

    Science.gov (United States)

    Luteijn, J M; White, B C; Gunnlaugsdóttir, H; Holm, F; Kalogeras, N; Leino, O; Magnússon, S H; Odekerken, G; Pohjola, M V; Tijhuis, M J; Tuomisto, J T; Ueland, Ø; McCarron, P A; Verhagen, H

    2012-01-01

    Benefit-risk assessment in medicine has been a valuable tool in the regulation of medicines since the 1960s. Benefit-risk assessment takes place in multiple stages during a medicine's life-cycle and can be conducted in a variety of ways, using methods ranging from qualitative to quantitative. Each benefit-risk assessment method is subject to its own specific strengths and limitations. Despite its widespread and long-time use, benefit-risk assessment in medicine remains subject to debate, suffers from a number of limitations, and is still under development. This state-of-the-art review paper discusses the various aspects of, and approaches to, benefit-risk assessment in medicine along a chronological pathway. The review covers all types of benefit-risk assessment a medicinal product undergoes during its lifecycle, from Phase I clinical trials to post-marketing surveillance and health technology assessment for inclusion in public formularies. The benefit-risk profile of a drug is dynamic and differs for different indications and patient groups. We conclude that benefit-risk analysis in medicine is a developed practice subject to continuous improvement and modernisation; improvements not only in methodology but also in cooperation between organizations can strengthen benefit-risk assessment. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. The Methodological Imperatives of Feminist Ethnography

    Directory of Open Access Journals (Sweden)

    Richelle D. Schrock

    2013-12-01

    Feminist ethnography does not have a single, coherent definition and is caught between struggles over the definition and goals of feminism and the multiple practices known collectively as ethnography. Towards the end of the 1980s, debates emerged that problematized feminist ethnography as a productive methodology and these debates still haunt feminist ethnographers today. In this article, I provide a concise historiography of feminist ethnography that summarizes both its promises and its vulnerabilities. I address the three major challenges I argue feminist ethnographers currently face, which include responding productively to feminist critiques of representing "others," accounting for feminisms' commitment to social change while grappling with poststructuralist critiques of knowledge production, and confronting the historical and ongoing lack of recognition for significant contributions by feminist ethnographers. Despite these challenges, I argue that feminist ethnography is a productive methodology and I conclude by delineating its methodological imperatives. These imperatives include producing knowledge about women's lives in specific cultural contexts, recognizing the potential detriments and benefits of representation, exploring women's experiences of oppression along with the agency they exercise in their own lives, and feeling an ethical responsibility towards the communities in which the researchers work. I argue that this set of imperatives enables feminist ethnographers to successfully navigate the challenges they face.

  8. A NEW METHODOLOGY ON STRATEGIC PLANNING

    Directory of Open Access Journals (Sweden)

    Hakan Bütüner

    2014-07-01

    A systematic method of strategic planning should be easily understood and straightforward, based on fundamentals, and universally applicable to any type of business. Accordingly, this methodology was created for the purpose of assembling the disconnected and disorderly ideas, processes, and techniques written on strategy and business development under the same roof, in order to develop a systematic methodology that is easily understandable and applicable. While many sources exhort managers to "think strategically" or prescribe "strategic leadership" to helicopter out of tactical day-to-day management, only a few address how to make this happen. Where strategic analysis tools are explained, this is most frequently done conceptually rather than practically, with little guidance on how to utilize the tools for strategic planning. Moreover, surprising as it may sound, there is an exact approach, a systematic way of thinking, on this issue; our intention is to bring a new perspective to the reader and, more significantly, to provide a different benefit through the application of this systematic methodology. Systematic strategic planning (SSP) consists of a framework of phases through which each project passes, a pattern of procedures for straightforward planning, and the fundamentals involved in any strategic planning project.

  9. Medicare Prescription Drug Benefit Manual

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Part D Prescription Drug Benefit Manual (PDBM) is a user guide to the Part D Prescription Drug Program. It includes information on general provisions, benefits,...

  10. Robust and Efficient Uncertainty Quantification and Validation of RFIC Isolation

    Directory of Open Access Journals (Sweden)

    A. Di Bucchianico

    2014-04-01

    Modern communication and identification products impose demanding constraints on the reliability of components. As a result, statistical constraints increasingly enter the optimization formulations of electronic products. Yield constraints often require efficient sampling techniques to obtain uncertainty quantification also at the tails of the distributions. These sampling techniques should outperform standard Monte Carlo techniques, since the latter are normally not efficient enough to deal with tail probabilities. One such technique, Importance Sampling, has successfully been applied to optimize Static Random Access Memories (SRAMs) while guaranteeing very small failure probabilities, even going beyond 6-sigma variations of the parameters involved. Apart from this, emerging uncertainty quantification techniques offer expansions of the solution that serve as a response surface facility for statistics and optimization. To efficiently derive the coefficients in the expansions, one either has to solve a large number of problems or one huge combined problem. Here parameterized Model Order Reduction (MOR) techniques can be used to reduce the workload. To also reduce the number of parameters, we identify those that affect the variance in only a minor way. These parameters can simply be set to a fixed value. The remaining parameters can be viewed as dominant. Preservation of the variation also allows statements to be made about the approximation accuracy obtained by the parameter-reduced problem. This is illustrated on an RLC circuit. Additionally, the MOR technique used should not affect the variance significantly. Finally, we consider a methodology for reliable RFIC isolation using floor-plan modeling and isolation grounding. Simulations show good agreement with measurements.
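
    As a hedged illustration of why Importance Sampling outperforms plain Monte Carlo for tail probabilities, the sketch below estimates P(X > 5) for a standard normal variable by shifting the sampling density into the tail and reweighting; it is a generic textbook example, not the authors' SRAM flow.

    ```python
    # Importance sampling for a tail probability P(X > 5), X ~ N(0, 1).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    n, thresh, shift = 100000, 5.0, 5.0

    # Plain Monte Carlo: essentially never sees the event.
    x_mc = rng.normal(0.0, 1.0, n)
    print("MC estimate:", (x_mc > thresh).mean())

    # Importance sampling from N(shift, 1), reweighted by the likelihood ratio.
    x_is = rng.normal(shift, 1.0, n)
    w = stats.norm.pdf(x_is, 0, 1) / stats.norm.pdf(x_is, shift, 1)
    print("IS estimate:", np.mean((x_is > thresh) * w))
    print("exact:      ", stats.norm.sf(thresh))
    ```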

  11. Quantification of blood flow and topology in developing vascular networks.

    Directory of Open Access Journals (Sweden)

    Astrid Kloosterman

    Since fluid dynamics plays a critical role in vascular remodeling, quantification of the hemodynamics is crucial to gain more insight into this complex process. Better understanding of vascular development can improve prediction of the process and may eventually even be used to influence the vascular structure. In this study, a methodology to quantify the hemodynamics and network structure of developing vascular networks is described. The hemodynamic parameters and topology are derived from detailed local blood flow velocities obtained by in vivo micro-PIV measurements. The use of such detailed flow measurements is shown to be essential, as blood vessels with a similar diameter can have a large variation in flow rate. Measurements were performed in the yolk sacs of seven chicken embryos at two developmental stages between HH 13+ and 17+. A large range of flow velocities (1 µm/s to 1 mm/s) was measured in blood vessels with diameters in the range of 25-500 µm. The quality of the data sets was investigated by verifying the flow balances in the branching points. This showed that the quality of the data sets of the seven embryos is comparable for all stages observed, and that the data are suitable for further analysis with known accuracy. When comparing two subsequently characterized networks of the same embryo, vascular remodeling was observed in all seven networks. However, the character of remodeling differed among the seven embryos and could be non-intuitive, which confirms the necessity of quantification. To illustrate the potential of the data, we present a preliminary quantitative study of key network topology parameters and compare these with theoretical design rules.
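
    The flow-balance verification mentioned above amounts to checking volumetric conservation at each bifurcation, with flow rates formed from the measured mean velocities and vessel diameters. A sketch with invented numbers:

    ```python
    # Flow-balance check at a vascular branching point: Q_in vs. sum of Q_out.
    import math

    def flow_rate(mean_velocity_um_s, diameter_um):
        """Volumetric flow in um^3/s from mean velocity and vessel diameter."""
        return mean_velocity_um_s * math.pi * (diameter_um / 2.0) ** 2

    q_parent = flow_rate(800.0, 100.0)
    q_daughters = flow_rate(780.0, 80.0) + flow_rate(530.0, 75.0)
    imbalance = abs(q_parent - q_daughters) / q_parent
    print(f"relative flow imbalance at branch: {imbalance:.1%}")
    ```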

  12. A multi-model assessment of the co-benefits of climate mitigation for global air quality

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Shilpa; Klimont, Zbigniew; Leitao, Joana; Riahi, Keywan; van Dingenen, Rita; Reis, Lara Aleluia; Calvin, Katherine; Dentener, Frank; Drouet, Laurent; Fujimori, Shinichiro; Harmsen, Mathijs; Luderer, Gunnar; Heyes, Chris; Strefler, Jessica; Tavoni, Massimo; van Vuuren, Detlef P.

    2016-12-01

    …sector and region level. A second methodological advancement is the quantification of the co-benefits in terms of the associated atmospheric concentrations of fine particulate matter (PM2.5) and the consequent mortality-related outcomes across different models. This is made possible by the use of a state-of-the-art simplified atmospheric model that allows, for the first time, a computationally feasible multi-model evaluation of such outcomes.

  13. Regional issue identification and assessment: study methodology. First annual report

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    The overall assessment methodologies and models utilized for the first project under the Regional Issue Identification and Assessment (RIIA) program are described. Detailed descriptions are given of the methodologies used by lead laboratories for the quantification of the impacts of an energy scenario on one or more media (e.g., air, water, land, human and ecology), and by all laboratories to assess the regional impacts on all media. The research and assessments reflected in this document were performed by the following national laboratories: Argonne National Laboratory; Brookhaven National Laboratory; Lawrence Berkeley Laboratory; Los Alamos Scientific Laboratory; Oak Ridge National Laboratory; and Pacific Northwest Laboratory. This report contains five chapters. Chapter 1 briefly describes the overall study methodology and introduces the technical participants. Chapter 2 is a summary of the energy policy scenario selected for the RIIA I study and Chapter 3 describes how this scenario was translated into a county-level siting pattern of energy development. The fourth chapter is a detailed description of the individual methodologies used to quantify the environmental and socioeconomic impacts of the scenario while Chapter 5 describes how these impacts were translated into comprehensive regional assessments for each Federal Region.

  14. Benefit / Cost priorities : achieving commensurability

    NARCIS (Netherlands)

    Wedley, W.C.; Choo, E.U.; Wijnmalen, D.J.D.

    2003-01-01

    Traditional Benefit/Cost analysis requires benefits and costs to be expressed in a common currency, usually dollars. More recently, benefits and costs have been expressed and compared as relative priorities. This process has been criticized because there is no guarantee that the two sources of prior

  15. Suitability of Modern Software Development Methodologies for Model Driven Development

    Directory of Open Access Journals (Sweden)

    Ruben Picek

    2009-12-01

    As an answer to today's growing challenges in the software industry, a wide spectrum of new software development approaches has emerged. One prominent direction is the currently most promising software development paradigm, Model Driven Development (MDD). Despite considerable skepticism and a number of problems, the MDD paradigm is being used and improved to realize its many inherent potential benefits. A methodological approach to software development requires some kind of development process. Modern methodologies can be classified into two main categories: formal (heavyweight) and agile (lightweight). But when it comes to MDD and a development process for MDD, currently known methodologies are very poor or, better said, they offer no explanation of the MDD process. As a result of this research, the author examines the possibilities of using existing modern software methodologies in the context of the MDD paradigm.

  16. Engineering radioecology: Methodological considerations

    Energy Technology Data Exchange (ETDEWEB)

    Nechaev, A.F.; Projaev, V.V. [St. Petersburg State Inst. of Technology (Russian Federation); Sobolev, I.A.; Dmitriev, S.A. [United Ecologo-Technological and Research Center on Radioactive Waste Management and Environmental Remediation, Moscow (Russian Federation)

    1995-12-31

    The term "radioecology" has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important, but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to recast the problem of environmental restoration from the scientific sphere into particularly practical terms. Already the first steps clearly showed the imperfection of existing technologies and of managerial and regulatory schemes; a lack of qualified specialists, relevant methods and techniques; and uncertainties in the methodology of decision-making, etc. Thus, building up (or perhaps structuring) a special scientific and technological basis, which the authors call "engineering radioecology", seems to be an important task. In this paper they endeavor to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology.

  17. Cancer cytogenetics: methodology revisited.

    Science.gov (United States)

    Wan, Thomas S K

    2014-11-01

    The Philadelphia chromosome was the first genetic abnormality discovered in cancer (in 1960), and it was found to be consistently associated with CML. The description of the Philadelphia chromosome ushered in a new era in the field of cancer cytogenetics. Accumulating genetic data have been shown to be intimately associated with the diagnosis and prognosis of neoplasms; thus, karyotyping is now considered a mandatory investigation for all newly diagnosed leukemias. The development of FISH in the 1980s overcame many of the drawbacks of assessing the genetic alterations in cancer cells by karyotyping. Karyotyping of cancer cells remains the gold standard since it provides a global analysis of the abnormalities in the entire genome of a single cell. However, subsequent methodological advances in molecular cytogenetics based on the principle of FISH that were initiated in the early 1990s have greatly enhanced the efficiency and accuracy of karyotype analysis by marrying conventional cytogenetics with molecular technologies. In this review, the development, current utilization, and technical pitfalls of both the conventional and molecular cytogenetics approaches used for cancer diagnosis over the past five decades will be discussed.

  18. Scientific methodology applied.

    Science.gov (United States)

    Lussier, A

    1975-04-01

    The subject of this symposium is naproxen, a new drug that resulted from an investigation to find a superior anti-inflammatory agent. It was synthesized by Harrison et al. in 1970 at the Syntex Institute of Organic Chemistry and Biological Sciences. How can we chart the evolution of this or any other drug? Three steps are necessary: first, chemical studies (synthesis, analysis); second, animal pharmacology; third, human pharmacology. The last step can additionally be divided into four phases: metabolism and toxicology of the drug in normal volunteers; dose titration and initial clinical trials with sick subjects (pharmacometry); confirmatory clinical trials when the drug is accepted on the market; and re-evaluation (familiarization trials). To discover the truth about naproxen, we must all participate actively with a critical mind, following the principles of scientific methodology. We shall find that the papers to be presented today all deal with the third step in the evaluation process--clinical pharmacology. It is quite evident that the final and most decisive test must be aimed at the most valuable target: the human being. The end product of this day's work for each of us should be the formation of an opinion based on solid scientific proofs. And let us hope that we will all enjoy fulfilling the symposium in its entire etymological meaning this evening. In vino veritas.

  19. Risk-benefit analysis and public policy: a bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Clark, E.M.; Van Horn, A.J.

    1976-11-01

    Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these have been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.

  20. Qualitative Methodology in Unfamiliar Cultures

    DEFF Research Database (Denmark)

    Svensson, Christian Franklin

    2014-01-01

    …on a qualitative methodology, conscious reflection on research design and objectivity is important when doing fieldwork. This case study discusses such reflections. Emphasis throughout is given to applied qualitative methodology and its contributions to the social sciences, in particular having to do with relational, emotional, and ethical issues associated with interviewing and personal observation. Although the empirical setting of this case is Southeast Asia, the various discussions and the interrelatedness of methodology, theory, and empirical reflections will prove applicable to field studies throughout…

  1. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Paulson, Patrick R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Purohit, Sumit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rodriguez, Luke R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report outlines techniques for extending benchmark generation products so that they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  2. Research Review of Post-Evaluation for Comprehensive Benefits of Forestry Ecological Programs

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The paper summarized the meaning of post-evaluation for comprehensive benefits of forestry ecological programs, discussed and reviewed its development process in terms of content, indicators and methodologies, and finally presented its development trend from the perspectives of theoretical research, methodological research and application research.

  3. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of the main characteristics of these paradigms and methodologies such as where they may be used for best results. Due to their popularity, power flow based MW-mile and short run marginal cost pricing methodologies will be covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies for pricing transmission services in Brazil. (author) 25 refs., 2 tabs.
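
    The flavor of the power-flow-based MW-mile method can be conveyed with a short Python sketch: each transaction is charged a share of each line's embedded cost in proportion to the MW-mile usage (flow times line length) it imposes. The flows, lengths and costs below are invented for illustration, not data from the Brazilian examples.

        # MW-mile pricing sketch (illustrative data only).
        # flows[t][l] = MW flow that transaction t induces on line l (from a power flow).
        flows = {"T1": {"L1": 40.0, "L2": 10.0},
                 "T2": {"L1": 20.0, "L2": 50.0}}
        length_miles = {"L1": 120.0, "L2": 80.0}
        annual_cost = {"L1": 2.4e6, "L2": 1.0e6}   # embedded line cost, $/year

        def mw_mile_charges(flows, length, cost):
            # Usage of each line by each transaction, in MW-miles.
            usage = {t: {l: mw * length[l] for l, mw in per.items()}
                     for t, per in flows.items()}
            total = {l: sum(usage[t][l] for t in usage) for l in length}
            # Allocate each line's embedded cost in proportion to MW-mile usage.
            return {t: sum(cost[l] * usage[t][l] / total[l] for l in length)
                    for t in usage}

        print(mw_mile_charges(flows, length_miles, annual_cost))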

  4. Mentoring practices benefiting pediatric nurses.

    Science.gov (United States)

    Weese, Meghan M; Jakubik, Louise D; Eliades, Aris B; Huth, Jennifer J

    2015-01-01

    Previous studies examining predictors of pediatric nurse protégé mentoring benefits demonstrated that protégé perception of quality was the single best predictor of mentoring benefits. The ability to identify the mentoring practices that predict specific benefits for individual nurses provides a better understanding of how mentoring relationships can be leveraged within health care organizations promoting mutual mentoring benefits. This descriptive correlational, non-experimental study of nurses at a northeast Ohio, Magnet® recognized, free-standing pediatric hospital advances nursing science by demonstrating how mentoring practices benefit pediatric nurse protégés.

  5. Quantification of fluorescent reporters in plant cells.

    Science.gov (United States)

    Pound, Michael; French, Andrew P; Wells, Darren M

    2015-01-01

    Fluorescent reporters are powerful tools for plant research. Many studies require accurate determination of fluorescence intensity and localization. Here, we describe protocols for the quantification of fluorescence intensity in plant cells from confocal laser scanning microscope images using semiautomated software and image analysis techniques.
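
    The core of such a protocol is a background-corrected intensity measurement over a segmented region. A minimal NumPy sketch follows, assuming the confocal image and the cell mask are already available as arrays; the synthetic values are placeholders, not part of the published protocol.

        import numpy as np

        def mean_reporter_intensity(img, mask, background=None):
            # img  : 2-D array of pixel intensities from one confocal channel.
            # mask : boolean array of the same shape selecting the region of interest.
            roi = img[mask].astype(float)
            if background is not None:
                roi = roi - background   # subtract an estimated dark/background level
            return roi.mean()

        # Synthetic example: a bright square on a dim background.
        img = np.full((64, 64), 100.0)
        img[20:40, 20:40] = 900.0
        mask = np.zeros_like(img, dtype=bool)
        mask[20:40, 20:40] = True
        print(mean_reporter_intensity(img, mask, background=100.0))   # -> 800.0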

  6. Quantification of interferon signaling in avian cells

    NARCIS (Netherlands)

    Kint, Joeri; Forlenza, Maria

    2015-01-01

    Activation of the type I interferon (IFN) response is an essential defense mechanism against invading pathogens such as viruses. This chapter describes two protocols to quantify activation of the chicken IFN response through analysis of gene expression by real-time quantitative PCR and by quantif
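
    Readouts of this kind are commonly summarized with the standard 2^(-ddCt) calculation, normalizing the interferon-stimulated gene to a reference gene and to a mock-treated control; the Ct values in this Python sketch are invented.

        # Relative expression by the standard 2^(-ddCt) method (illustrative Ct values).
        def fold_change(ct_target_treated, ct_ref_treated,
                        ct_target_control, ct_ref_control):
            d_ct_treated = ct_target_treated - ct_ref_treated   # normalize to reference gene
            d_ct_control = ct_target_control - ct_ref_control
            dd_ct = d_ct_treated - d_ct_control                 # normalize to control sample
            return 2.0 ** (-dd_ct)

        # e.g. an interferon-stimulated gene after infection vs. mock:
        print(fold_change(22.1, 18.0, 27.3, 18.2))   # fold induction, here 32x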

  7. Quantification of coating aging using impedance measurements

    NARCIS (Netherlands)

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting c

  8. Quantification of topological concepts using ideals

    Directory of Open Access Journals (Sweden)

    Robert Lowen

    2001-01-01

    We introduce certain ideals of real-valued functions as a natural generalization of filters. We show that these ideals establish a canonical framework for the quantification of topological concepts, such as closedness, adherence, and compactness, in the setting of approach spaces.

  9. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
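
    As a simplified illustration of the band-selection logic (the study itself uses correlation analysis plus stepwise multivariate regression), the Python sketch below screens synthetic reflectance spectra for the band most correlated with THC content and fits a single-band calibration; all data are simulated.

        import numpy as np

        rng = np.random.default_rng(0)
        wavelengths = np.arange(400, 1001)        # nm
        n_samples = 30
        spectra = rng.random((n_samples, wavelengths.size))
        # Synthetic "truth": THC driven by reflectance near 695 nm plus noise.
        thc = 2.0 + 5.0 * spectra[:, wavelengths == 695].ravel() \
              + rng.normal(0, 0.2, n_samples)

        # Correlate THC with reflectance at every band and pick the strongest one.
        corr = np.array([np.corrcoef(spectra[:, i], thc)[0, 1]
                         for i in range(wavelengths.size)])
        best = np.argmax(np.abs(corr))
        print("best band:", wavelengths[best], "nm, r =", round(corr[best], 3))

        # Simple single-band calibration at the selected wavelength.
        slope, intercept = np.polyfit(spectra[:, best], thc, 1)
        print("THC ~", round(slope, 2), "* R +", round(intercept, 2))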

  10. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
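
    A toy version of the screening step (step 2) is sketched below, under the simplifying assumption that a rank correlation between each sampled input and the output is an adequate first-pass sensitivity measure; the actual methodology, as implemented in PSUADE, uses more sophisticated designs and interaction estimates.

        import numpy as np
        from scipy.stats import spearmanr

        def model(x):
            # Stand-in simulation: output dominated by x0, weakly affected by x1,
            # with x2 inert (a real study would call the physics code here).
            return 3.0 * x[:, 0] ** 2 + 0.5 * x[:, 1] + 0.0 * x[:, 2]

        rng = np.random.default_rng(1)
        X = rng.uniform(0.0, 1.0, size=(500, 3))   # step 1: sample credible input ranges
        y = model(X)                               # run the "computer experiment"

        # Step 2: parameter screening via rank correlation with the output.
        scores = [abs(spearmanr(X[:, j], y).correlation) for j in range(X.shape[1])]
        ranked = sorted(enumerate(scores), key=lambda p: -p[1])
        print(ranked)   # focus the quantitative SA on the top-ranked inputs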

  11. Armentum: a hybrid direct search optimization methodology

    Science.gov (United States)

    Briones, Francisco Zorrilla

    2016-07-01

    Design of experiments (DOE) offers a great deal of benefits to any manufacturing organization, such as characterization of variables, and sets the path for the optimization of their levels (settings) through response surface methodology, leading to process capability improvement, efficiency increase and cost reduction. Unfortunately, the use of these methodologies is very limited due to various situations. Some of these situations involve the investment of production time, materials, personnel and equipment; most organizations are not willing to invest these resources, or are not able to because of production demands, besides the fact that they will produce non-conformant product (scrap) during the process of experimentation. Other methodologies, in the form of algorithms, may be used to optimize a process. Known as direct search methods, these algorithms search for an optimum of an unknown function through the search for the best combination of the levels of the variables considered in the analysis. These methods have a very different application strategy: they search for the best combination of parameters during the normal production run, calculating the change in the input variables and evaluating the results in small steps until an optimum is reached. These algorithms are very sensitive to internal noise (variation of the input variables), among other disadvantages. In this paper a comparison is made between classical experimental design and one of these direct search methods, developed by Nelder and Mead (1965) and known as the Nelder-Mead simplex (NMS), trying to overcome the disadvantages and maximize the advantages of both approaches through a proposed combination of the two methodologies.
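
    For reference, the Nelder-Mead simplex is available off the shelf in SciPy; the minimal example below minimizes a mildly noisy stand-in for a process response, the kind of sequential, derivative-free search contrasted here with DOE. The objective function and its noise level are invented.

        import numpy as np
        from scipy.optimize import minimize

        def process_response(x):
            # Stand-in for the measured process output, with a little internal
            # noise (the disturbance the paper notes the simplex is sensitive to).
            noise = np.random.normal(0.0, 0.01)
            return (x[0] - 1.2) ** 2 + (x[1] + 0.7) ** 2 + noise

        result = minimize(process_response, x0=[0.0, 0.0], method="Nelder-Mead",
                          options={"xatol": 1e-3, "fatol": 1e-3})
        print(result.x)   # settings near the optimum (1.2, -0.7)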

  12. Cardiovascular benefits of exercise

    Directory of Open Access Journals (Sweden)

    Agarwal SK

    2012-06-01

    Shashi K Agarwal, Medical Director, Agarwal Health Center, NJ, USA. Abstract: Regular physical activity during leisure time has been shown to be associated with better health outcomes. The American Heart Association, the Centers for Disease Control and Prevention and the American College of Sports Medicine all recommend regular physical activity of moderate intensity for the prevention and complementary treatment of several diseases. The therapeutic role of exercise in maintaining good health and treating diseases is not new. The benefits of physical activity date back to Susruta, a 600 BC physician in India, who prescribed exercise to patients. Hippocrates (460–377 BC) wrote “in order to remain healthy, the entire day should be devoted exclusively to ways and means of increasing one's strength and staying healthy, and the best way to do so is through physical exercise.” Plato (427–347 BC) referred to medicine as a sister art to physical exercise, while the noted ancient Greek physician Galen (129–217 AD) penned several essays on aerobic fitness and strengthening muscles. This article briefly reviews the beneficial effects of physical activity on cardiovascular diseases. Keywords: exercise, cardiovascular disease, lifestyle changes, physical activity, good health

  13. HEALTH BENEFITS OF BARLEY

    Directory of Open Access Journals (Sweden)

    Akula Annapurna

    2013-09-01

    The prevalence of lifestyle diseases is increasing day by day, and the younger generation mostly lacks awareness of healthy nutritional supplements. One such important cereal grain, mostly neglected by youngsters, is barley. It is a good old grain with many health benefits, such as weight reduction, decreasing blood pressure, blood cholesterol and blood glucose in type 2 diabetes, and preventing colon cancer. It is an easily available and cheap grain. It contains both soluble and insoluble fiber, protein, vitamins B and E, the minerals selenium, magnesium, iron and copper, flavonoids and anthocyanins. Barley's soluble fiber, beta-glucan, binds to bile acids in the intestines, thereby decreasing plasma cholesterol levels; absorbed soluble fiber also decreases cholesterol synthesis by the liver and cleanses blood vessels. Insoluble fiber provides bulk in the intestines, and thereby satiety and decreased appetite; it promotes intestinal movements, relieving constipation, cleansing the colon of harmful bacteria and reducing the incidence of colon cancer. Barley is a good source of niacin, reducing LDL levels and increasing HDL levels, while selenium and vitamin E provide beneficial antioxidant effects. Magnesium, a cofactor for many carbohydrate metabolism enzymes, together with the high fiber content, contributes to its blood-glucose-reducing effect in type 2 diabetes. Barley also has good diuretic activity and is useful in urinary tract infections. Barley contains gluten and is therefore contraindicated in celiac disease.

  14. Research and Development Methodology for Practical Use of Accident Tolerant Fuel in Light Water Reactors

    OpenAIRE

    Masaki Kurata

    2016-01-01

    Research and development (R&D) methodology for the practical use of accident tolerant fuel (ATF) in commercial light water reactors is discussed in the present review. The identification and quantification of the R&D-metrics and the attribute of candidate ATF-concepts, recognition of the gap between the present R&D status and the targeted practical use, prioritization of the R&D, and technology screening schemes are important for achieving a common understanding on technology screening proces...

  15. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  16. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered from future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. The Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any) in studied samples and to standardize the protocols used, in order to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may well come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind a scenario in which living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompany fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble, in their morphological properties, those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  17. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

    As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation of existing computationally driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses, such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MatLab® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  18. Quantification of glucosylceramide in plasma of Gaucher disease patients

    Directory of Open Access Journals (Sweden)

    Maria Viviane Gomes Muller

    2010-12-01

    Gaucher disease is a sphingolipidosis that leads to an accumulation of glucosylceramide. The objective of this study was to develop a methodology, based on the extraction, purification and quantification of glucosylceramide from blood plasma, for use in clinical research laboratories. A comparison of the glucosylceramide content in plasma from Gaucher disease patients, submitted to enzyme replacement therapy or otherwise, against that from normal individuals was also carried out. The glucosylceramide, separated from other glycosphingolipids by high-performance thin-layer chromatography (HPTLC), was chemically developed (CuSO4/H3PO4) and the respective band confirmed by immunostaining (human anti-glucosylceramide antibody/peroxidase-conjugated secondary antibody). Chromatogram quantification by densitometry demonstrated that the glucosylceramide content in Gaucher disease patients was seventeen times higher than that in normal individuals, and seven times higher than that in patients on enzyme replacement therapy. The results obtained indicate that the methodology established can be used in complementary diagnosis and for treatment monitoring of Gaucher disease patients.

  19. Cost-benefit analysis in occupational health: A comparison of intervention scenarios for occupational asthma and rhinitis among bakery workers

    NARCIS (Netherlands)

    Meijster, T.; Duuren-Stuurman, B. van; Heederik, D.; Houba, R.; Koningsveld, E.; Warren, N.; Tielemans, E.

    2011-01-01

    Objectives: Use of cost-benefit analysis in occupational health increases insight into the intervention strategy that maximises the cost-benefit ratio. This study presents a methodological framework identifying the most important elements of a cost-benefit analysis for occupational health settings.

  20. Development of a methodology for the application of the analysis of human reliability to individualized temporary storage facility; Desarrollo de una metodologia de aplicacion del Analisis de Fiabilidad Humana a una instalacion de Almacen Temporal Individualizado

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, P.; Dies, J.; Tapia, C.; Blas, A. de

    2014-07-01

    The paper presents a methodology developed to apply human reliability analysis (HRA) to an individualized temporary storage (ATI) facility without the need for experts during the modelling and quantification stages of the analysis. The methodology developed is based on ATHEANA and relies on the use of other methods of analysis of human action and on in-depth analysis. (Author)

  1. Workshops as a Research Methodology

    Science.gov (United States)

    Ørngreen, Rikke; Levinsen, Karin

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…

  2. A methodology for social experimentation

    DEFF Research Database (Denmark)

    Ravn, Ib

    A methodology is outlined whereby one may improve the performance of a social system to the satisfaction of its stakeholders, that is, facilitate desirable social and organizational transformations.

  3. Methodology of Law and Economics

    NARCIS (Netherlands)

    A.M. Pacces (Alessio Maria); L.T. Visscher (Louis)

    2011-01-01

    Introduction: A chapter on the methodology of law and economics, i.e. the economic analysis of law, concerns the methodology of economics. The above quote (Becker 1976, 5) shows that economics should not be defined by its subject, but by its method (also Veljanovski 2007, 19). This method

  4. Methodological Pluralism and Narrative Inquiry

    Science.gov (United States)

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  5. Choosing a Methodology: Philosophical Underpinning

    Science.gov (United States)

    Jackson, Elizabeth

    2013-01-01

    As a university lecturer, I find that a frequent question raised by Masters students concerns the methodology chosen for research and the rationale required in dissertations. This paper unpicks some of the philosophical coherence that can inform choices to be made regarding methodology and a well-thought out rationale that can add to the rigour of…

  6. Building ASIPs: the Mescal methodology

    CERN Document Server

    Gries, Matthias

    2006-01-01

    A number of system designers use ASIPs rather than ASICs to implement their system solutions. This book gives a comprehensive methodology for the design of these application-specific instruction processors (ASIPs). It includes demonstrations of applications of the methodologies using the Tipi research framework.

  7. Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility

    Energy Technology Data Exchange (ETDEWEB)

    Tom Elicson; Bentley Harwood; Richard Yorg; Heather Lucek; Jim Bouchard; Ray Jukkola; Duan Phan

    2011-03-01

    The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided, with particular focus on the following: • Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project, using screening values and simplified modeling if necessary. • Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in two phases, with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. • Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less on scoping fire modeling; this was the approach taken for the fire PRA. • Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario; therefore, dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges, which will be discussed in the full paper.

  8. Size-exclusion HPLC as a sensitive and calibrationless method for complex peptide mixtures quantification.

    Science.gov (United States)

    Bodin, Alice; Framboisier, Xavier; Alonso, Dominique; Marc, Ivan; Kapel, Romain

    2015-12-01

    This work describes an original methodology to quantify complex peptide mixtures by size-exclusion high-performance liquid chromatography (SE-HPLC). The methodology was first tested on simulated elutions of peptide mixtures. For this set of experiments, a good estimation of the total peptide concentration was observed (error less than 10%). Then 30 fractions obtained by ultrafiltration of hydrolysates from two different sources were titrated by Kjeldahl or BCA analysis and analysed by SE-HPLC for an experimental validation of the methodology. Very good matches between methods were obtained. The linear working range depends on the hydrolysate but is generally between 0.2 and 4 g L⁻¹ (i.e. between 10 and 200 μg). Moreover, unlike common quantification methods, the presence of organic solvents or salts in samples does not affect the accuracy of the methodology. Hence, the findings of this study show that the total concentration of a complex peptide mixture can be efficiently determined by the proposed methodology using a simple SE-HPLC analysis.

  9. Who Benefits from Volunteering? Variations in Perceived Benefits

    Science.gov (United States)

    Morrow-Howell, Nancy; Hong, Song-Iee; Tang, Fengyan

    2009-01-01

    Purpose: The purpose of this study was to document the benefits of volunteering perceived by older adults and to explain variation in these self-perceived benefits. Design and Methods: This is a quantitative study of 13 volunteer programs and 401 older adults serving in those programs. Program directors completed telephone interviews, and older…

  10. The business case: The missing link between information technology benefits and organisational strategies

    OpenAIRE

    Carl Marnewick

    2014-01-01

    Purpose: Business cases are an integral part of information technology (IT) projects, providing the linkage between the organisational strategies and the promised benefits. Most major project management standards and methodologies make reference to the business case and its intended usage. Problem investigated: The success of IT projects is measured based on the benefits they deliver; anecdotal evidence states that IT projects fail at an alarming rate. The benefits are promised in the business ca...

  11. A home-brew real-time PCR assay for reliable detection and quantification of mature miR-122.

    Science.gov (United States)

    Naderi, Mahmood; Abdul Tehrani, Hossein; Soleimani, Masoud; Shabani, Iman; Hashemi, Seyed Mahmoud

    2015-09-01

    miR-122 is a liver-specific miRNA that shows significant gene expression alterations in response to specific pathophysiological circumstances of the liver, such as drug-induced liver injury, hepatocellular carcinoma, and hepatitis B and C virus infections. Therefore, accurate and precise quantification of miR-122 is very important for clinical diagnostics. However, in vitro diagnostic assays for miR-122 detection and quantification are lacking, and the existence of an open-source assay could provide external evaluation by other researchers and the chance of promoting the assay when required. The aim of this study was to develop a Taqman real-time polymerase chain reaction assay capable of robust and reliable quantification of miR-122 in different sample types. We used stem-loop methodology to design a specific Taqman real-time polymerase chain reaction assay for miR-122. This technique enabled us to reliably and reproducibly quantify short-length oligonucleotides such as miR-122. The specificity, sensitivity, inter-assay and intra-assay variability, and the dynamic range of the assay were experimentally determined by their respective methodologies. The assay had a linear dynamic range of 3E to 4.8E miR-122 copies/reaction, and the limit of detection was determined to be between 960 and 192 copies/reaction with a 95% confidence interval. The assay gave a low coefficient of variation for the Ct values; since miR-122 is expressed at some 50,000 copies per hepatocyte, the assay suffices for reliable detection and quantification of this miRNA. Therefore, this study can be considered a starting point for standardizing miR-122 quantification.

  12. Scoping studies: advancing the methodology

    Directory of Open Access Journals (Sweden)

    O'Brien Kelly K

    2010-09-01

    Background: Scoping studies are an increasingly popular approach to reviewing health research evidence. In 2005, Arksey and O'Malley published the first methodological framework for conducting scoping studies. While this framework provides an excellent foundation for scoping study methodology, further clarifying and enhancing this framework will help support the consistency with which authors undertake and report scoping studies and may encourage researchers and clinicians to engage in this process. Discussion: We build upon our experiences conducting three scoping studies using the Arksey and O'Malley methodology to propose recommendations that clarify and enhance each stage of the framework. Recommendations include: clarifying and linking the purpose and research question (stage one); balancing feasibility with breadth and comprehensiveness of the scoping process (stage two); using an iterative team approach to selecting studies (stage three) and extracting data (stage four); incorporating a numerical summary and qualitative thematic analysis, reporting results, and considering the implications of study findings to policy, practice, or research (stage five); and incorporating consultation with stakeholders as a required knowledge translation component of scoping study methodology (stage six). Lastly, we propose additional considerations for scoping study methodology in order to support the advancement, application and relevance of scoping studies in health research. Summary: Specific recommendations to clarify and enhance this methodology are outlined for each stage of the Arksey and O'Malley framework. Continued debate and development about scoping study methodology will help to maximize the usefulness and rigor of scoping study findings within healthcare research and practice.

  13. Quantification of protein posttranslational modifications using stable isotope and mass spectrometry. II. Performance.

    Science.gov (United States)

    Luo, Quanzhou; Wypych, Jette; Jiang, Xinzhao Grace; Zhang, Xin; Luo, Shun; Jerums, Matthew; Lewis, Jeffrey; Iii, Ronald Keener; Huang, Gang; Apostol, Izydor

    2012-02-15

    In this report, we examine the performance of a mass spectrometry (MS)-based method for quantification of protein posttranslational modifications (PTMs) using stable isotope labeled internal standards. Uniform labeling of proteins and the highly similar behavior of the labeled vs. nonlabeled analyte pairs during chromatographic separation and electrospray ionization (ESI) provide the means to directly quantify a wide range of PTMs. In the companion report (Jiang et al., Anal. Biochem. 421 (2012) 506-516), we provided principles and example applications of the method. Here we show satisfactory accuracy and precision for quantifying protein modifications by the SILIS method when the analyses were performed on different types of mass spectrometers, such as ion-trap, time-of-flight (TOF), and quadrupole instruments. Additionally, the stable isotope labeled internal standard (SILIS) method demonstrated an extended linear range, with accurate quantification across at least a 4-log concentration range on three different types of mass spectrometers. We also demonstrate that lengthy chromatographic separation is no longer required to obtain quality results, offering an opportunity to significantly shorten the method run time. The results indicate the potential of this methodology for rapid and large-scale assessment of multiple quality attributes of a therapeutic protein in a single analysis.
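
    The core arithmetic of such PTM quantification is a ratio of modified to total peptide signal, with each form normalized to its stable-isotope-labeled internal standard; the peak areas in this Python sketch are invented, and the function is an illustration rather than the authors' implementation.

        # Percent modification from extracted-ion peak areas (illustrative numbers).
        def percent_modified(area_mod, area_unmod, silis_mod=1.0, silis_unmod=1.0):
            # Normalize each form to its stable-isotope-labeled internal standard,
            # which co-elutes and ionizes like the analyte.
            mod = area_mod / silis_mod
            unmod = area_unmod / silis_unmod
            return 100.0 * mod / (mod + unmod)

        print(percent_modified(area_mod=2.1e5, area_unmod=3.9e6,
                               silis_mod=1.0e6, silis_unmod=1.1e6))   # ~ %PTM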

  14. Fluorometric quantification of polyphosphate in environmental plankton samples: extraction protocols, matrix effects, and nucleic acid interference.

    Science.gov (United States)

    Martin, Patrick; Van Mooy, Benjamin A S

    2013-01-01

    Polyphosphate (polyP) is a ubiquitous biochemical with many cellular functions and comprises an important environmental phosphorus pool. However, methodological challenges have hampered routine quantification of polyP in environmental samples. We tested 15 protocols to extract inorganic polyphosphate from natural marine samples and cultured cyanobacteria for fluorometric quantification with 4',6-diamidino-2-phenylindole (DAPI) without prior purification. A combination of brief boiling and digestion with proteinase K was superior to all other protocols, including other enzymatic digestions and neutral or alkaline leaches. However, three successive extractions were required to extract all polyP. Standard addition revealed matrix effects that differed between sample types, causing polyP to be over- or underestimated by up to 50% in the samples tested here. Although previous studies judged that the presence of DNA would not complicate fluorometric quantification of polyP with DAPI, we show that RNA can cause significant interference at the wavelengths used to measure polyP. Importantly, treating samples with DNase and RNase before proteinase K digestion reduced fluorescence by up to 57%. We measured particulate polyP along a North Pacific coastal-to-open ocean transect and show that particulate polyP concentrations increased toward the open ocean. While our final method is optimized for marine particulate matter, different environmental sample types may need to be assessed for matrix effects, extraction efficiency, and nucleic acid interference.

  15. Quantification of organic acids in beer by nuclear magnetic resonance (NMR)-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, J.E.A. [CICECO-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Erny, G.L. [CESAM - Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Barros, A.S. [QOPNAA-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Esteves, V.I. [CESAM - Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Brandao, T.; Ferreira, A.A. [UNICER, Bebidas de Portugal, Leca do Balio, 4466-955 S. Mamede de Infesta (Portugal); Cabrita, E. [Department of Chemistry, New University of Lisbon, 2825-114 Caparica (Portugal); Gil, A.M., E-mail: agil@ua.pt [CICECO-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal)

    2010-08-03

    The organic acids present in beer provide important information on the product's quality and history, determining organoleptic properties and serving as useful indicators of fermentation performance. NMR spectroscopy may be used for rapid quantification of organic acids in beer, and different NMR-based methodologies are hereby compared for the six main acids found in beer (acetic, citric, lactic, malic, pyruvic and succinic). The use of partial least squares (PLS) regression enables faster quantification, compared to traditional integration methods, and the performance of PLS models built using different reference methods (capillary electrophoresis (CE), both with direct and indirect UV detection, and enzymatic assays) was investigated. The best multivariate models were obtained using CE/indirect detection and enzymatic assays as reference, and their response was compared with NMR integration, either using an internal reference or an electrical reference signal (Electronic REference To access In vivo Concentrations, ERETIC). NMR integration results generally agree with those obtained by PLS, with some overestimation for malic and pyruvic acids, probably due to peak overlap and subsequent integral errors, and an apparent relative underestimation for citric acid. Overall, these results make the PLS-NMR method an interesting choice for organic acid quantification in beer.
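
    The PLS calibration step can be sketched with scikit-learn, where X would hold binned NMR spectra and y the reference concentrations from CE or enzymatic assays; the data below are synthetic stand-ins.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(2)
        X = rng.random((40, 200))           # 40 beers x 200 spectral bins (synthetic)
        true_coef = np.zeros(200)
        true_coef[50:55] = 1.0              # pretend one acid's peaks sit in bins 50-54
        y = X @ true_coef + rng.normal(0, 0.05, 40)   # "reference" concentration

        pls = PLSRegression(n_components=5)
        pls.fit(X, y)
        y_hat = pls.predict(X).ravel()
        print(float(np.corrcoef(y, y_hat)[0, 1]))     # calibration fit quality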

  16. Simple and rapid quantification of brominated vegetable oil in commercial soft drinks by LC-MS.

    Science.gov (United States)

    Chitranshi, Priyanka; Gamboa da Costa, Gonçalo

    2016-12-15

    We report here a simple and rapid method for the quantification of brominated vegetable oil (BVO) in soft drinks based upon liquid chromatography-electrospray ionization mass spectrometry. Unlike previously reported methods, this novel method does not require hydrolysis, extraction or derivatization steps, but rather a simple "dilute and shoot" sample preparation. The quantification is conducted by mass spectrometry in selected ion recording mode and a single point standard addition procedure. The method was validated in the range of 5-25μg/mL BVO, encompassing the legal limit of 15μg/mL established by the US FDA for fruit-flavored beverages in the US market. The method was characterized by excellent intra- and inter-assay accuracy (97.3-103.4%) and very low imprecision [0.5-3.6% (RSD)]. The direct nature of the quantification, simplicity, and excellent statistical performance of this methodology constitute clear advantages in relation to previously published methods for the analysis of BVO in soft drinks.
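
    Single-point standard addition reduces to simple algebra: if A0 is the analyte signal in the unspiked sample and A1 the signal after adding a known concentration, the original concentration follows directly. The signals and the added concentration in this sketch are illustrative, not values from the validation.

        # Single-point standard addition (illustrative signals).
        def standard_addition(a0, a1, c_add):
            # a0    : signal of the sample alone
            # a1    : signal of the same sample spiked with c_add of analyte
            # c_add : concentration added, expressed in the final (spiked) solution
            return c_add * a0 / (a1 - a0)

        print(standard_addition(a0=1.8e4, a1=3.3e4, c_add=10.0))   # -> 12.0 ug/mL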

  17. Bioluminescence regenerative cycle (BRC) system for nucleic acid quantification assays

    Science.gov (United States)

    Hassibi, Arjang; Lee, Thomas H.; Davis, Ronald W.; Pourmand, Nader

    2003-07-01

    A new label-free methodology for nucleic acid quantification has been developed where the number of pyrophosphate molecules (PPi) released during polymerization of the target nucleic acid is counted and correlated to DNA copy number. The technique uses the enzymatic complex of ATP-sulfurylase and firefly luciferase to generate photons from PPi. An enzymatic unity gain positive feedback is also implemented to regenerate the photon generation process and compensate any decay in light intensity by self regulation. Due to this positive feedback, the total number of photons generated by the bioluminescence regenerative cycle (BRC) can potentially be orders of magnitude higher than typical chemiluminescent processes. A system level kinetic model that incorporates the effects of contaminations and detector noise was used to show that the photon generation process is in fact steady and also proportional to the nucleic acid quantity. Here we show that BRC is capable of detecting quantities of DNA as low as 1 amol (10⁻¹⁸ mol) in 40 μL aqueous solutions, and this enzymatic assay has a controllable dynamic range of 5 orders of magnitude. The sensitivity of this technology, due to the excess number of photons generated by the regenerative cycle, is not constrained by detector performance, but rather by possible PPi or ATP (adenosine triphosphate) contamination, or background bioluminescence of the enzymatic complex.

  18. Methodological practicalities in analytical generalization

    DEFF Research Database (Denmark)

    Halkier, Bente

    2011-01-01

    In this article, I argue that the existing literature on qualitative methodologies tends to discuss analytical generalization at a relatively abstract and general theoretical level. It is, however, not particularly straightforward to “translate” such abstract epistemological principles into more...... operative methodological strategies for producing analytical generalizations in research practices. Thus, the aim of the article is to contribute to the discussions among qualitatively working researchers about generalizing by way of exemplifying some of the methodological practicalities in analytical...... and processes in producing the three different ways of generalizing: ideal typologizing, category zooming, and positioning....

  19. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
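
    A QFD-style prioritization is, at bottom, a weighted matrix product: candidate replacement technologies are scored against weighted criteria and ranked. The weights, the 1/3/9 relationship scores and the candidate names in this Python sketch are all invented.

        import numpy as np

        criteria = ["environment", "cost", "safety", "reliability", "programmatic"]
        weights = np.array([0.30, 0.20, 0.25, 0.15, 0.10])   # customer-derived importance

        # Relationship scores (1/3/9 scale typical of QFD), one row per candidate.
        candidates = {"Solvent A": np.array([9, 3, 9, 3, 1]),
                      "Solvent B": np.array([3, 9, 3, 9, 3]),
                      "Solvent C": np.array([1, 9, 1, 3, 9])}

        ranking = sorted(((name, float(scores @ weights))
                          for name, scores in candidates.items()),
                         key=lambda p: -p[1])
        for name, score in ranking:
            print(f"{name}: {score:.2f}")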

  20. Can the CFO Trust the FX Exposure Quantification from a Stock Market Approach?

    DEFF Research Database (Denmark)

    Aabo, Tom; Brodin, Danielle

    This study examines the sensitivity of detected exchange rate exposures at the firm specific level to changes in methodological choices using a traditional two factor stock market approach for exposure quantification. We primarily focus on two methodological choices: the choice of market index...... and the choice of observation frequency. We investigate to which extent the detected exchange rate exposures for a given firm can be confirmed when the choice of market index and/or the choice of observation frequency are changed. Applying our sensitivity analysis to Scandinavian non-financial firms, we...... thirds of the number of detected exposures using weekly data and 2) there is no economic rationale that the detected exposures at the firm-specific level should change when going from the use of weekly data to the use of monthly data. In relation to a change in the choice of market index, we find...
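
    The two-factor stock market approach referred to here regresses the firm's stock return on a market return and an exchange-rate return, and an exposure is "detected" when the FX coefficient is statistically significant. A minimal statsmodels sketch with simulated returns follows; all coefficients and sample sizes are illustrative.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 260                                    # roughly a year of daily returns
        r_market = rng.normal(0.001, 0.02, n)
        r_fx = rng.normal(0.000, 0.01, n)          # e.g. a trade-weighted currency index
        r_firm = 0.9 * r_market + 0.5 * r_fx + rng.normal(0, 0.02, n)

        # Two-factor regression: firm return on market return and FX return.
        X = sm.add_constant(np.column_stack([r_market, r_fx]))
        fit = sm.OLS(r_firm, X).fit()
        beta_fx, t_fx = fit.params[2], fit.tvalues[2]
        print(f"FX exposure = {beta_fx:.2f} (t = {t_fx:.1f})")   # "detected" if |t| is large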

  1. Can the CFO Trust the FX Exposure Quantification from a Stock Market Approach?

    DEFF Research Database (Denmark)

    Aabo, Tom; Brodin, Danielle

    This study examines the sensitivity of detected exchange rate exposures at the firm-specific level to changes in methodological choices using a traditional two factor stock market approach for exposure quantification. We primarily focus on two methodological choices: the choice of market index and the choice of observation frequency. We investigate to which extent the detected exchange rate exposures for a given firm can be confirmed when the choice of market index and/or the choice of observation frequency are changed. Applying our sensitivity analysis to Scandinavian non-financial firms, we...... thirds of the number of detected exposures using weekly data and 2) there is no economic rationale that the detected exposures at the firm-specific level should change when going from the use of weekly data to the use of monthly data. In relation to a change in the choice of market index, we find...

  2. Quantification of functional dynamics of membrane proteins reconstituted in nanodiscs membranes by single turnover functional readout

    DEFF Research Database (Denmark)

    Moses, Matias Emil; Hedegård, Per; Hatzakis, Nikos

    2016-01-01

    and quantification of the activity, abundance, and lifetime of multiple states and transient intermediates in the energy landscape that are typically averaged out in nonsynchronized ensemble measurements. Studying the function of membrane proteins at the single-molecule level remains a formidable challenge......, and to date there is limited number of available functional assays. In this chapter, we describe in detail our recently developed methodology to reconstitute membrane proteins such as the integral membrane protein cytochrome P450 oxidoreductase on membrane systems such as Nanodiscs and study their functional...... dynamics by recordings at the fundamental resolution of individual catalytic turnovers using prefluorescent substrate analogues. We initially describe the methodology for reconstitution, surface immobilization, and data acquisition of individual enzyme catalytic turnovers. We then explain in detail...

  3. Effects of climate model interdependency on the uncertainty quantification of extreme rainfall projections

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia; Madsen, H.; Rosbjerg, Dan;

    The inherent uncertainty in climate models is one of the most important uncertainties in climate change impact studies. In recent years, several uncertainty quantification methods based on multi-model ensembles have been suggested. Most of these methods assume that the climate models...... are independent. This study investigates the validity of this assumption and its effects on the estimated probabilistic projections of the changes in the 95% quantile of wet days. The methodology is divided in two main parts. First, the interdependency of the ENSEMBLES RCMs is estimated using the methodology...... developed by Pennell and Reichler (2011). The results show that the projections from the ENSEMBLES RCMs cannot be assumed independent. This result is then used to estimate the uncertainty in climate model projections. A Bayesian approach has been developed using the procedure suggested by Tebaldi et al...

  4. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    Science.gov (United States)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitatively based modelling approach, as a possible analysis and simulation tool that bridges the gap between social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  5. Near-optimal RNA-Seq quantification

    OpenAIRE

    Bray, Nicolas; Pimentel, Harold; Melsted, Páll; Pachter, Lior

    2015-01-01

    We present a novel approach to RNA-Seq quantification that is near optimal in speed and accuracy. Software implementing the approach, called kallisto, can be used to analyze 30 million unaligned paired-end RNA-Seq reads in less than 5 minutes on a standard laptop computer while providing results as accurate as those of the best existing tools. This removes a major computational bottleneck in RNA-Seq analysis.

  6. Standardized Relative Quantification of Immunofluorescence Tissue Staining

    OpenAIRE

    sprotocols

    2015-01-01

    Authors: Oriol Arqués, Irene Chicote, Stephan Tenbaum, Isabel Puig & Héctor G. Palmer. Abstract: The detection of correlations between the expression levels or sub-cellular localization of different proteins and specific characteristics of human tumors, such as grade of malignancy, may give important hints of functional associations. Here we describe the method we use for relative quantification of immunofluorescence staining of tumor tissue sections, which allows us to co...

  7. Whitepaper on Uncertainty Quantification for MPACT

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-17

    The MPACT code provides the ability to perform high-fidelity deterministic calculations to obtain a wide variety of detailed results for very complex reactor core models. However MPACT currently does not have the capability to propagate the effects of input data uncertainties to provide uncertainties in the calculated results. This white paper discusses a potential method for MPACT uncertainty quantification (UQ) based on stochastic sampling.
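
    The stochastic-sampling idea is easy to sketch: perturb the uncertain inputs according to an assumed distribution, rerun the model, and summarize the spread of the outputs. In the Python sketch below, the toy k-eff-like function merely stands in for an MPACT calculation, and the input uncertainties are invented.

        import numpy as np

        def toy_model(sigma_a, sigma_f):
            # Stand-in for a deterministic transport calculation: a k-eff-like ratio.
            return 2.4 * sigma_f / (sigma_a + sigma_f)

        rng = np.random.default_rng(4)
        n = 10_000
        # Assumed input uncertainties (e.g. from a cross-section covariance library).
        sigma_a = rng.normal(1.00, 0.02, n)
        sigma_f = rng.normal(0.60, 0.03, n)

        k = toy_model(sigma_a, sigma_f)
        print(f"mean = {k.mean():.4f}, std = {k.std(ddof=1):.4f}")   # propagated uncertainty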

  8. Methodologies used to estimate tobacco-attributable mortality: a review

    Directory of Open Access Journals (Sweden)

    Pérez-Ríos Mónica

    2008-01-01

    Background: One of the most important measures for ascertaining the impact of tobacco on a population is the estimation of the mortality attributable to its use. To measure this, a number of indirect methods of quantification are available, yet there is no consensus as to which furnishes the best information. This study sought to provide a critical overview of the different methods of attribution of mortality due to tobacco consumption. Method: A search was made in the Medline database up to March 2005 in order to obtain papers that addressed the methodology employed for attributing mortality to tobacco use. Results: Of the total of 7 methods obtained, the most widely used were the prevalence methods, followed by the approach proposed by Peto et al., with the remainder being used in a minority of studies. Conclusion: Different methodologies are used to estimate tobacco-attributable mortality, but their methodological foundations are quite similar in all. Mainly, they are based on the calculation of proportional attributable fractions. All methods show limitations of one type or another, sometimes common to all methods and sometimes specific.
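
    Most of the reviewed methods rest on the population attributable fraction, PAF = p(RR - 1) / [p(RR - 1) + 1], where p is the smoking prevalence and RR the relative risk in the exposed; the Python sketch below applies it to invented figures.

        # Tobacco-attributable deaths via the population attributable fraction.
        def attributable_deaths(prevalence, relative_risk, total_deaths):
            paf = (prevalence * (relative_risk - 1.0)
                   / (prevalence * (relative_risk - 1.0) + 1.0))
            return paf, paf * total_deaths

        # Illustrative inputs: 25% smoking prevalence, RR = 10 for lung cancer mortality.
        paf, deaths = attributable_deaths(0.25, 10.0, 30_000)
        print(f"PAF = {paf:.2%}, attributable deaths = {deaths:.0f}")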

  9. Automated quantification of synapses by fluorescence microscopy.

    Science.gov (United States)

    Schätzle, Philipp; Wuttke, René; Ziegler, Urs; Sonderegger, Peter

    2012-02-15

    The quantification of synapses in neuronal cultures is essential in studies of the molecular mechanisms underlying synaptogenesis and synaptic plasticity. Conventional counting of synapses based on morphological or immunocytochemical criteria is extremely work-intensive. We developed a fully automated method which quantifies synaptic elements and complete synapses based on immunocytochemistry. Pre- and postsynaptic elements are detected by their corresponding fluorescence signals and their proximity to dendrites. Synapses are defined as the combination of a pre- and postsynaptic element within a given distance. The analysis is performed in three dimensions and all parameters required for quantification can be easily adjusted by a graphical user interface. The integrated batch processing enables the analysis of large datasets without any further user interaction and is therefore efficient and timesaving. The potential of this method was demonstrated by an extensive quantification of synapses in neuronal cultures from DIV 7 to DIV 21. The method can be applied to all datasets containing a pre- and postsynaptic labeling plus a dendritic or cell surface marker.
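
    The pre/post pairing rule at the heart of such a method can be expressed as a spatial query over detected puncta centroids: count presynaptic spots with a postsynaptic partner within the distance threshold. The coordinates and the threshold in this Python sketch are invented, and real pipelines would first segment the puncta and restrict them to dendrites.

        import numpy as np
        from scipy.spatial import cKDTree

        # Detected puncta centroids in micrometres (synthetic 3-D coordinates).
        pre = np.array([[1.0, 1.0, 0.5], [5.0, 5.0, 1.0], [9.0, 2.0, 0.0]])
        post = np.array([[1.2, 0.9, 0.6], [5.4, 5.1, 1.1], [20.0, 20.0, 5.0]])

        max_dist = 0.7   # pairing threshold in um (placeholder value)
        tree = cKDTree(post)
        dist, idx = tree.query(pre, distance_upper_bound=max_dist)
        n_synapses = int(np.sum(np.isfinite(dist)))   # unmatched queries return inf
        print("synapses:", n_synapses)                # -> 2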

  10. Automated Template Quantification for DNA Sequencing Facilities

    Science.gov (United States)

    Ivanetich, Kathryn M.; Yan, Wilson; Wunderlich, Kathleen M.; Weston, Jennifer; Walkup, Ward G.; Simeon, Christian

    2005-01-01

    The quantification of plasmid DNA by the PicoGreen dye binding assay has been automated, and the effect of quantification of user-submitted templates on DNA sequence quality in a core laboratory has been assessed. The protocol pipets, mixes and reads standards, blanks and up to 88 unknowns, generates a standard curve, and calculates template concentrations. For pUC19 replicates at five concentrations, coefficients of variation were 0.1, and percent errors were from 1% to 7% (n = 198). Standard curves with pUC19 DNA were nonlinear over the 1 to 1733 ng/μL concentration range required to assay the majority (98.7%) of user-submitted templates. Over 35,000 templates have been quantified using the protocol. For 1350 user-submitted plasmids, 87% deviated by ≥ 20% from the requested concentration (500 ng/μL). Based on data from 418 sequencing reactions, quantification of user-submitted templates was shown to significantly improve DNA sequence quality. The protocol is applicable to all types of double-stranded DNA, is unaffected by primer (1 pmol/μL), and is user modifiable. The protocol takes 30 min, saves 1 h of technical time, and costs approximately $0.20 per unknown. PMID:16461949
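
    The calculation underlying such an assay is an ordinary standard-curve fit: fluorescence readings of standards define the curve, and unknown concentrations are read off it. The values in this Python sketch are invented, and a linear fit is shown for simplicity even though the paper reports nonlinearity over its full range.

        import numpy as np

        # PicoGreen-style standard curve (invented fluorescence units).
        std_conc = np.array([0.0, 50.0, 100.0, 250.0, 500.0])      # ng/uL
        std_fluor = np.array([3.0, 210.0, 400.0, 1010.0, 1990.0])

        slope, intercept = np.polyfit(std_conc, std_fluor, 1)

        def concentration(fluorescence):
            return (fluorescence - intercept) / slope              # invert the fit

        print(round(concentration(800.0), 1), "ng/uL")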

  11. Validated method for phytohormone quantification in plants

    Directory of Open Access Journals (Sweden)

    Marilia Almeida-Trapp

    2014-08-01

    Phytohormones have long been known as important components of signalling cascades in plant development and in plant responses to various abiotic and biotic challenges. Quantification of phytohormone levels in plants is typically carried out using GC- or LC-MS/MS systems, due to their high sensitivity and specificity and the fact that not much sample preparation is needed. However, mass spectrometer-based analyses are often affected by the particular sample type (different matrices), extraction procedure, and experimental setup, i.e. the chromatographic separation system and/or mass spectrometer analyser (triple-quadrupole, ion trap, TOF, Orbitrap). For these reasons, a validated method is required in order to enable comparison of data that are generated in different laboratories, under different experimental set-ups, and in different matrices. So far, many phytohormone quantification studies were done using either QTRAP or triple-quadrupole mass spectrometers, and none of them was performed under the regime of a fully validated method. Therefore, we developed and established such a validated method for the quantification of stress-related phytohormones such as jasmonates, abscisic acid, salicylic acid and IAA in the model plant Arabidopsis thaliana and the fruit crop Citrus sinensis, using an ion trap mass spectrometer. All parameters recommended by the FDA (US Food and Drug Administration) or EMEA (European Medicines Evaluation Agency) for the validation of analytical methods were evaluated: sensitivity, selectivity, repeatability and reproducibility (accuracy and precision).

  12. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    The systematic quantification of the uncertainties affecting dynamical systems and the characterization of the uncertainty of their outcomes is critical for engineering design and analysis, where risks must be reduced as much as possible. Uncertainties stem naturally from our limitations in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... some auxiliary properties, we will apply PC on it, obtaining the STT-decomposition. This will allow the decoupling of each dimension, leading to a much cheaper construction of the PC surrogate. In the associated paper, the capabilities of the STT-decomposition are checked on commonly used test...

  13. The benefits of customer profitability analysis in the hospitality industry

    Directory of Open Access Journals (Sweden)

    Dragan Georgiev

    2017-03-01

    Full Text Available The article reveals the benefits of implementing customer profitability analysis according to the specifics of the hotel product and the state of management accounting in hotels. On this basis, the article substantiates the need for management accounting and information systems in hotels to be adapted and developed in advance, in line with the objectives and methodological tools of customer profitability analysis, while keeping their function of collecting information on operating revenues and costs by responsibility centers. A model for customer profitability analysis based on the ABC method is proposed in this connection, with an example to clarify its methodological aspects and benefits. The latter consist in providing information for the purposes of taking a variety of management decisions regarding costs, product mix, pricing, performance measurement and the implementation of various marketing initiatives.

  14. Contextual factors, methodological principles and teacher cognition

    Directory of Open Access Journals (Sweden)

    Rupert Walsh

    2014-01-01

    Full Text Available Teachers in various contexts worldwide are sometimes unfairly criticized for not putting teaching methods developed for the well-resourced classrooms of Western countries into practice. Factors such as the teachers’ “misconceptualizations” of “imported” methods, including Communicative Language Teaching (CLT), are often blamed, though the challenges imposed by “contextual demands,” such as large class sizes, are sometimes recognised. Meanwhile, there is sometimes an assumption that in the West there is a happy congruence between policy supportive of CLT or Task-Based Language Teaching, teacher education and supervision, and curriculum design on the one hand, and teachers’ cognitions and their practices on the other. Our case study of three EFL teachers at a UK adult education college is motivated by a wish to question this assumption. Findings from observational and interview data suggest the practices of two teachers were largely consistent with their methodological principles, relating to stronger and weaker forms of CLT respectively, as well as with more general educational principles, such as a concern for learners; the supportive environment seemed to help. The third teacher appeared to put “difficult” contextual factors, for example tests, ahead of methodological principles without, however, obviously benefiting. Implications highlight the important role of teacher cognition research in challenging cultural assumptions.

  15. Methodology for evaluation of intertechnology tradeoffs

    Energy Technology Data Exchange (ETDEWEB)

    Buehring, W.A.; Whitfield, R.G.; Wolsko, T.D.

    1980-10-01

    A methodological interface between the impact-assessment process and the decision maker is developed. Although the presented method is rather widely applicable, it is particularly helpful for those who are dealing with the complex problems surrounding the environmental, health, and economic impacts of alternative-energy technologies and policies. These problems are characterized by: (1) multiple conflicting objectives, (2) uncertainty, (3) variable outcomes over time, and (4) dynamic behavior. The issues are complicated further by the different preference structures of stakeholders, analysts, and decision makers. Minimum cost, benefit-cost, and decision analysis are three approaches that have been taken with respect to this type of problem. Decision analysis has been found to offer certain advantages over the others in that impact aggregation and uncertainty can be incorporated rather successfully. The recommended methodology has three basic stages: (1) problem formulation, (2) objective hierarchy selection, and (3) alternatives evaluation. The output of the formal analysis is assessment information that can serve as the basis of the informal processes that result in policy statements or recommendations.

  16. Increasing Enrollment through Benefit Segmentation.

    Science.gov (United States)

    Goodnow, Betty

    1982-01-01

    The applicability of benefit segmentation, a market research technique which groups people according to benefits expected from a program offering, was tested at the College of DuPage. Preferences and demographic characteristics were analyzed and program improvements adopted, increasing enrollment by 20 percent. (Author/SK)

  17. Rapid Benefit Indicators (RBI) webinar

    Science.gov (United States)

    RBI process for assessing restoration sites using non-monetary benefit indicators. The RBI approach uses readily-available data to estimate and quantify benefits to people around an ecological restoration site using indicators of nature’s value to people.

  18. Gauging Technology Costs and Benefits

    Science.gov (United States)

    Kaestner, Rich

    2007-01-01

    Regardless of the role technology plays in a school district, district personnel should know the costs associated with technology, understand the consequences of technology purchases, and be able to measure the benefits of technology, so they can make more informed decisions. However, determining costs and benefits of current technology or…

  20. Taxability of Educational Benefits Trusts

    Science.gov (United States)

    Temple Law Quarterly, 1976

    1976-01-01

    Corporations have found the promise of providing a college education to the children of employees--without the recognition of income to the parent-employee--to be a popular fringe benefit. The Internal Revenue Service has attacked educational benefit trusts in Revenue Ruling 75-448. Implications are discussed. (LBH)

  1. Quantification and Negation in Event Semantics

    Directory of Open Access Journals (Sweden)

    Lucas Champollion

    2010-12-01

    Full Text Available Recently, it has been claimed that event semantics does not go well together with quantification, especially if one rejects syntactic, LF-based approaches to quantifier scope. This paper shows that such fears are unfounded, by presenting a simple, variable-free framework which combines a Neo-Davidsonian event semantics with a type-shifting based account of quantifier scope. The main innovation is that the event variable is bound inside the verbal denotation, rather than at sentence level by existential closure. Quantifiers can then be interpreted in situ. The resulting framework combines the strengths of event semantics and type-shifting accounts of quantifiers and thus does not force the semanticist to posit either a default underlying word order or a syntactic LF-style level. It is therefore well suited for applications to languages where word order is free and quantifier scope is determined by surface order. As an additional benefit, the system leads to a straightforward account of negation, which has also been claimed to be problematic for event-based frameworks.

  2. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline and the baseline methodology are thus the most critical elements of any CDM project towards meeting the important criteria of CDM, which are that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)
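
    Once a baseline is established, the accounting implied above is simple arithmetic. A minimal sketch, assuming the generic identity that certified emission reductions equal baseline emissions minus project emissions minus leakage; all figures are invented.

        # Generic CDM accounting identity with made-up numbers (tCO2e/yr).
        baseline_emissions = 120_000  # emissions without the CDM project
        project_emissions = 35_000    # emissions with the project in place
        leakage = 5_000               # emissions displaced outside the boundary

        emission_reductions = baseline_emissions - project_emissions - leakage
        print(f"certified reductions: {emission_reductions} tCO2e/yr")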

  3. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.
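
    The low-level counter view described above rests on derived metrics computed from raw hardware-counter totals. The sketch below shows two such metrics; the counter names are loosely PAPI-style and the values invented, and this is not output of the OpenUH environment itself.

        # Hypothetical raw counter totals for one code region.
        counters = {
            "TOT_CYC": 4.2e9,  # total cycles
            "TOT_INS": 6.9e9,  # instructions retired
            "L2_DCM": 3.1e7,   # L2 data cache misses
            "LD_INS": 1.8e9,   # load instructions
        }

        # Derived efficiency metrics of the kind used to rank bottlenecks.
        ipc = counters["TOT_INS"] / counters["TOT_CYC"]         # instructions/cycle
        l2_miss_rate = counters["L2_DCM"] / counters["LD_INS"]  # misses per load

        print(f"IPC = {ipc:.2f}, L2 miss rate = {100 * l2_miss_rate:.2f}%")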

  4. [Costs and benefits of smoking].

    Science.gov (United States)

    Polder, J J; van Gils, P F; Kok, L; Talhout, R; Feenstra, T L

    2017-01-01

    Two recent societal cost-benefit analyses have documented the costs of smoking and the cost-effectiveness of preventing smoking. Smoking costs the Netherlands society EUR 33 billion per year. The majority of this is the monetary value of health loss; these are "soft" euros that cannot be re-spent. There is not a great deal of difference between costs and benefits when expressed in "hard" euros, which means that there is no clear business case for anti-smoking policy. The greatest benefit of discouraging smoking is improved health for the individual and increased productivity for the business sector; however, the benefits cannot be easily realised, because even in the most favourable scenario the number of smokers will decrease slowly. Excise duties seem to offer the most promising avenue for combating smoking. The benefits of anti-smoking policy therefore consist mainly of tax revenues for the government. Stringent policy is required to transform tax revenues into health gains.

  6. Nanotoxicology materials, methodologies, and assessments

    CERN Document Server

    Durán, Nelson; Alves, Oswaldo L; Zucolotto, Valtencir

    2014-01-01

    This book begins with a detailed introduction to engineered nanostructures, followed by a section on methodologies used in research on cytotoxicity and genotoxicity, and concluding with evidence for the cyto- and genotoxicity of specific nanoparticles.

  7. No crisis but methodological separatism

    DEFF Research Database (Denmark)

    Erola, Jani; Reimer, David; Räsänen, Pekka;

    2015-01-01

    This article compares methodological trends in nationally and internationally oriented sociology using data from the articles of three Nordic sociological journals: one international (Acta Sociologica), one Finnish (Sosiologia), and one Danish (Dansk Sociologi). The data consists of 943 articles ...

  8. Methodological Reflections: Inter- ethnic Research

    DEFF Research Database (Denmark)

    Singla, Rashmi

    2010-01-01

    This article reflects on the methodological and epistemological aspects of the ethical issues involved in encounters between researcher and research participants with ethnic minority background in contexts with diversity. Specific challenges involved in longitudinal research (10 - 15 years) are a...

  9. Some notes on taxonomic methodology

    NARCIS (Netherlands)

    Hammen, van der L.

    1986-01-01

    The present paper constitutes an introduction to taxonomic methodology. After an analysis of taxonomic practice, and a brief survey of kinds of attributes, the paper deals with observation, description, comparison, arrangement and classification, hypothesis construction, deduction, model, and experiment.

  10. Methodology and Foreground of Metallomics

    Institute of Scientific and Technical Information of China (English)

    He Bin; Jiang Guibin

    2005-01-01

    Metallomics is proposed as a new omics to follow genomics, proteomics and metabolomics. This paper gives an overview of the development of metallomics based on the introduction of the concept of metallomics and its methodology.

  11. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post) positivistic quantitative paradigm. The increasing...... recognition of the sociocultural embeddedness of human development, and of the importance to study individuals’ subjective experience, however, calls for adequate methodological procedures that allow for the study of processes of transformation across the life span. The wide range of established procedures...

  12. MOTOR VEHICLE SAFETY RESEARCH METHODOLOGY

    Directory of Open Access Journals (Sweden)

    A. Stepanov

    2015-07-01

    Full Text Available The issues of vehicle safety are considered. A methodological approach to analyzing and solving the problem of safety management for vehicles and overall traffic is offered. The distinctive features of the organization and management of vehicle safety are shown. It is concluded that the methodological approach to solving traffic safety problems reduces to the selection and classification of safety needs.

  13. Methodology of International Law

    OpenAIRE

    Dominicé, Christian

    2014-01-01

    I. DEFINITION Methodology seeks to define the means of acquiring scientific knowledge. There is no generally accepted definition of the methodology of international law. In this article it will be taken to comprise both its wider meaning of the methods used in the acquisition of a scientific knowledge of the international legal system and its narrower and more specialized meaning, the methods used to determine the existence of norms or rules of international law. The correlation of these two ...

  14. Agile Methodology - Past and Future

    Science.gov (United States)

    2011-05-01

    Agile Methodology – Past and Future. Warren W. Tignor, SAIC. [Garbled report-documentation-page boilerplate removed; the recoverable slide content references the Takeuchi & Nonaka rugby analogy (HBR 1986, p. 139), waterfall versus agile, the Agile Manifesto (2001), Scrum, and a Scrum graphic adapted from Schwaber (2007).]

  15. A methodology for evaluating environmental planning systems: a case study of Canada.

    Science.gov (United States)

    Ellis, Meghan; Gunton, Thomas; Rutherford, Murray

    2010-06-01

    Sustainable environmental management is contingent on having an effective environmental planning system. A new methodology for designing and evaluating environmental planning systems is described and applied to a case study evaluation of the Canadian environmental planning process. The methodology is based on eight international best practice principles for environmental planning and 45 indicators. The research illustrates the benefits of the evaluation methodology in identifying how to improve environmental planning systems to achieve desired results. The methodology is applicable to a wide variety of jurisdictions. (c) 2010. Published by Elsevier Ltd.

  16. Environmental benefits of compost use on land through LCA – a review of the current gaps

    DEFF Research Database (Denmark)

    Lazcano, Cristina; Martínez-Blanco, Julia; Christensen, Thomas Højlund

    2014-01-01

    The use of biowaste compost on land can have beneficial effects on the plant–soil system. While the environmental impacts associated with compost production have been successfully assessed in previous studies, the assessment of the benefits of compost on plant and soil has been only partially included in few published works. In the present study, we reviewed the recent progress made in the quantification of the effects associated with biowaste compost use on land by using life cycle assessment (LCA). Different research efforts are required for a full assessment of the potential benefits, apart…

  17. Optical Image Analysis Applied to Pore Network Quantification of Sandstones Under Experimental CO2 Injection

    Science.gov (United States)

    Berrezueta, E.; González, L.; Ordóñez, B.; Luquot, L.; Quintana, L.; Gallastegui, G.; Martínez, R.; Olaya, P.; Breitner, D.

    2015-12-01

    This research aims to propose a protocol for pore network quantification in sandstones applying the Optical Image Analysis (OIA) procedure, which guarantees the reproducibility and reliability of the measurements. Two geological formations of sandstone, located in Spain and potentially suitable for CO2 sequestration, were selected for this study: a) the Cretaceous Utrillas unit, at the base of the Cenozoic Duero Basin, and b) a Triassic unit at the base of the Cenozoic Guadalquivir Basin. Sandstone samples were studied before and after experimental CO2 injection using optical and scanning electron microscopy (SEM), while the quantification of petrographic changes was done with OIA. The first phase of the research consisted of a detailed mineralogical and petrographic study of the sandstones (before and after CO2 injection), for which thin sections were observed. Later, the methodological and experimental processes of the investigation focused on i) adjustment and calibration of OIA tools; ii) a data acquisition protocol based on image capture under different polarization conditions (synchronized movement of polarizers), using 7 images of the same mineral scene (6 in crossed polarizer and 1 in parallel polarizer); and iii) automated identification and segmentation of pores in 2D mineral images, generating applications via executable macros. Finally, once the procedure protocols had been established, the compiled data were interpreted through an automated approach and the qualitative petrography was carried out. The quantification of changes in the pore network through OIA (porosity increase ≈ 2.5%) corroborates the descriptions obtained by SEM and microscopic techniques, which showed an increase in porosity after CO2 treatment. Automated image identification and quantification of minerals, pores and textures together with petrographic analysis can be applied to improve pore system characterization in sedimentary rocks. This research offers numerical
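
    After segmentation, the porosity figure quoted above is a pixel ratio per image. A minimal sketch, assuming a binary pore/mineral mask has already been produced by the multi-polarization capture and segmentation steps; the random masks below are stand-ins for real segmentations.

        import numpy as np

        rng = np.random.default_rng(1)
        pre_mask = rng.random((512, 512)) < 0.100   # 1 = pore, 0 = mineral
        post_mask = rng.random((512, 512)) < 0.125  # mask after CO2 injection

        def porosity(mask):
            """Percentage of pore pixels in a segmented image."""
            return 100.0 * mask.sum() / mask.size

        print(f"before: {porosity(pre_mask):.2f}%  after: {porosity(post_mask):.2f}%  "
              f"change: {porosity(post_mask) - porosity(pre_mask):+.2f}%")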

  18. Rate of force development: physiological and methodological considerations.

    Science.gov (United States)

    Maffiuletti, Nicola A; Aagaard, Per; Blazevich, Anthony J; Folland, Jonathan; Tillin, Neale; Duchateau, Jacques

    2016-06-01

    The evaluation of rate of force development during rapid contractions has recently become quite popular for characterising explosive strength of athletes, elderly individuals and patients. The main aims of this narrative review are to describe the neuromuscular determinants of rate of force development and to discuss various methodological considerations inherent to its evaluation for research and clinical purposes. Rate of force development (1) seems to be mainly determined by the capacity to produce maximal voluntary activation in the early phase of an explosive contraction (first 50-75 ms), particularly as a result of increased motor unit discharge rate; (2) can be improved by both explosive-type and heavy-resistance strength training in different subject populations, mainly through an improvement in rapid muscle activation; (3) is quite difficult to evaluate in a valid and reliable way. Therefore, we provide evidence-based practical recommendations for rational quantification of rate of force development in both laboratory and clinical settings.
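
    Once a force-time trace has been recorded, the quantity itself is a windowed slope, e.g. the mean slope of force over the first 50 ms of contraction. A minimal sketch with a synthetic trace; the exponential rise is invented purely to illustrate the computation.

        import numpy as np

        t = np.arange(0.0, 0.3, 0.001)             # time (s), 1 kHz sampling
        force = 800.0 * (1.0 - np.exp(-t / 0.06))  # synthetic force (N)

        def rfd(t, force, t0=0.0, t1=0.050):
            """Mean rate of force development over [t0, t1] in N/s."""
            i0, i1 = np.searchsorted(t, [t0, t1])
            return (force[i1] - force[i0]) / (t[i1] - t[i0])

        print(f"RFD 0-50 ms: {rfd(t, force):.0f} N/s")
        print(f"RFD 0-75 ms: {rfd(t, force, t1=0.075):.0f} N/s")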

  19. High-Throughput Quantification of Monofluoroacetate (1080) in Milk as a Response to an Extortion Threat.

    Science.gov (United States)

    Cooney, Terry P; Varelis, Peter; Bendall, Justin G

    2016-02-01

    As a food defense measure against an extortion threat to poison infant formula with monofluoroacetate, a robust methodology for monofluoroacetate analysis in fluid milk and powdered dairy products was developed and optimized. Critical challenges posed by this situation required that the analytical methodology provide (i) high specificity, (ii) high throughput capable of analyzing thousands of samples of fluid milk per day, and (iii) trace-level detection of 1 ng/g or lower to achieve the maximum residue limit. Solid-phase extraction-purified acetone extracts of fluid milk were derivatized with aniline, and after ultrahigh-performance liquid chromatography using a Kinetex-C18 column packed with 1.3-μm shell particles, the resulting N-phenyl 2-fluoroacetamide could be determined by liquid chromatography-tandem mass spectrometry in a highly specific manner and with a limit of quantification of 0.5 ng/ml. By using 4-(4-chlorophenoxy)aniline as a derivatizing agent, the method could be extended to powdered dairy products with the same limit of quantification. Between January and July 2015, some 136,000 fluid milk samples were tested using this method. This analytical testing of fluid milk formed one element in a larger program of work by multiple agencies to ensure that consumers could continue to have confidence in the safety of New Zealand milk and dairy products.

  20. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    Science.gov (United States)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then, Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.
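
    The variance-based selection step can be sketched as follows: estimate first-order Sobol indices with a pick-freeze estimator and flag the single most influential variable for refinement, in the one-at-a-time spirit of the sequential methodology. The system function below is invented and is not the NASA-LUQC model.

        import numpy as np

        def system(x):
            # Invented stand-in for the system-level performance model.
            return x[:, 0] + 0.5 * x[:, 1] ** 2 + 2.0 * x[:, 0] * x[:, 2] + 0.1 * x[:, 3]

        def first_order_sobol(fun, n_vars, n=100_000, seed=0):
            """Pick-freeze estimates of first-order Sobol indices, U(0,1) inputs."""
            rng = np.random.default_rng(seed)
            a, b = rng.random((n, n_vars)), rng.random((n, n_vars))
            ya = fun(a)
            s = np.empty(n_vars)
            for i in range(n_vars):
                ab = b.copy()
                ab[:, i] = a[:, i]  # freeze variable i at the 'a' sample
                s[i] = np.cov(ya, fun(ab))[0, 1] / ya.var()
            return s

        s = first_order_sobol(system, 4)
        print("first-order indices:", np.round(s, 3))
        print("refine variable:", int(np.argmax(s)))  # one variable at a time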

  1. Ecosystem Service Potentials, Flows and Demands – Concepts for Spatial Localisation, Indication and Quantification

    Directory of Open Access Journals (Sweden)

    Benjamin Burkhard

    2014-06-01

    Full Text Available The high variety of ecosystem service categorisation systems, assessment frameworks, indicators, quantification methods and spatial localisation approaches allows scientists and decision makers to harness experience, data, methods and tools. On the other hand, this variety of concepts and disagreements among scientists hamper an integration of ecosystem services into contemporary environmental management and decision making. In this article, the current state of the art of ecosystem service science regarding spatial localisation, indication and quantification of multiple ecosystem service supply and demand is reviewed and discussed. Concepts and tables for regulating, provisioning and cultural ecosystem service definitions, distinguishing between ecosystem service potential supply (stocks, flows (real supply and demands as well as related indicators for quantification are provided. Furthermore, spatial concepts of service providing units, benefitting areas, spatial relations, rivalry, spatial and temporal scales are elaborated. Finally, matrices linking CORINE land cover types to ecosystem service potentials, flows, demands and budget estimates are provided. The matrices show that ecosystem service potentials of landscapes differ from flows, especially for provisioning ecosystem services.
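
    The matrix concept is easy to make concrete: rows index land cover types, columns index ecosystem services, and a budget is the difference between potential supply and demand scores. A toy sketch with invented 0-5 scores, not the paper's actual CORINE matrix.

        import numpy as np

        land_cover = ["broad-leaved forest", "arable land", "urban fabric"]
        services = ["food", "timber", "recreation"]

        potential = np.array([[1, 5, 5],  # potential supply scores (0-5)
                              [5, 0, 1],
                              [0, 0, 1]])
        demand = np.array([[1, 1, 2],     # demand scores (0-5)
                           [2, 0, 1],
                           [5, 2, 4]])

        budget = potential - demand       # >0 surplus, <0 undersupply
        for lc, row in zip(land_cover, budget):
            print(lc, dict(zip(services, row.tolist())))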

  2. Detection and quantification of proteins in clinical samples using high resolution mass spectrometry.

    Science.gov (United States)

    Gallien, Sebastien; Domon, Bruno

    2015-06-15

    Quantitative proteomics has benefited from the recent development of mass spectrometers capable of high-resolution and accurate-mass (HR/AM) measurements. While targeted experiments are routinely performed on triple quadrupole instruments in selected reaction monitoring (SRM; often referred as multiple reaction monitoring, MRM) mode, the quadrupole-orbitrap mass spectrometers allow quantification in MS/MS mode, also known as parallel reaction monitoring (PRM). This technique is characterized by higher selectivity and better confidence in the assignment of the precursor and fragment ions, and thus translates into an improved analytical performance. More fundamentally, PRM introduces a change of the overall paradigm of targeted experiments, by the decoupling of the acquisition and data processing. They rely on two distinct steps, with a simplified acquisition method in conjunction with a flexible, iterative, post-acquisition data processing. This account describes in detail the different steps of a PRM experiment, which include the design of the acquisition method, the confirmation of the identity of the analytes founded upon a full MS/MS fragmentation pattern, and the quantification based on the extraction of specific fragment ions (selected post-acquisition) using tight mass tolerance. The different types of PRM experiments, defined as large-scale screening or precise targeted quantification using calibrated internal standards, together with the considerations on the selection of experimental parameters are discussed.
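
    The post-acquisition extraction step described above amounts to summing fragment-ion intensity inside a tight ppm window around a target m/z, which is where the added selectivity comes from. The spectrum, target ion and tolerance below are invented.

        import numpy as np

        def fragment_intensity(mz, intensity, target_mz, tol_ppm=10.0):
            """Sum centroided peak intensity within +/- tol_ppm of a fragment ion."""
            tol = target_mz * tol_ppm * 1e-6
            return intensity[np.abs(mz - target_mz) <= tol].sum()

        # Illustrative centroided MS/MS spectrum (m/z, intensity).
        mz = np.array([425.2101, 538.2946, 538.3050, 651.3782])
        inten = np.array([1.2e5, 8.7e5, 3.0e3, 4.4e5])

        # A 10 ppm window around 538.2930 keeps the 538.2946 peak (~3 ppm away)
        # and rejects the 538.3050 interference (~22 ppm away).
        print(fragment_intensity(mz, inten, 538.2930))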

  3. Quantification of hepatitis B surface antigen: a new concept for the management of chronic hepatitis B.

    Science.gov (United States)

    Moucari, Rami; Marcellin, Patrick

    2011-01-01

    HBsAg is a very important clinical test that might not only indicate active hepatitis B virus (HBV) infection but might also be used to predict clinical and treatment outcome. Clearance of HBsAg in patients with chronic HBV infection is associated with a much better clinical outcome, although surveillance for early detection of hepatocellular carcinoma (HCC) should continue. HBV DNA quantification is currently used for selecting candidates for therapy, monitoring response to therapy and detecting the emergence of drug resistance. Assays for HBsAg quantification are less expensive than HBV DNA and fully automated with a high throughput capacity. HBsAg titering may be a useful tool to manage patients with chronic HBV, to more clearly define which patients may, and more importantly, may not, benefit from treatment. Baseline and on-treatment HBsAg quantification may help to refine future treatment algorithms for both immune-modulator therapy and nucleos(t)ide analogues. Both HBV markers provide complementary information on the status of HBV infection. However, the relevance of serum HBsAg levels and its use as a reliable replacement for both covalently closed circular DNA and HBV DNA remain unclear.

  4. Multivariate analysis for quantification of Plutonium (IV) in nitric acid based on absorption spectra

    Energy Technology Data Exchange (ETDEWEB)

    Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.; Lumetta, Gregg J.; Bryan, Samuel A.

    2017-07-20

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium (IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single variate analysis (i.e. Beer’s Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of nitric acid concentration.
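
    The abstract does not name the specific chemometric algorithm, so the sketch below uses partial least squares (PLS) regression, a common choice, purely to illustrate multivariate calibration on synthetic acid-shifted spectra; every spectral parameter is invented.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        wavelengths = np.linspace(400, 900, 200)
        n = 60
        pu = rng.uniform(0.5, 5.0, n)    # Pu(IV) concentration (arbitrary units)
        acid = rng.uniform(1.0, 8.0, n)  # HNO3 concentration (M)

        def spectrum(c_pu, c_acid):
            # Invented band whose position shifts with acid strength.
            center = 470.0 + 3.0 * c_acid
            band = np.exp(-((wavelengths - center) / 15.0) ** 2)
            return c_pu * band + 0.01 * rng.standard_normal(wavelengths.size)

        X = np.array([spectrum(p, a) for p, a in zip(pu, acid)])

        # PLS learns the Pu response across acid strengths, where a single
        # fixed-wavelength Beer's-law calibration would fail.
        pls = PLSRegression(n_components=4).fit(X, pu)
        print("R^2 on training spectra:", round(pls.score(X, pu), 3))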

  5. Quantification of polynuclear aromatic hydrocarbons in transformer oils by enzyme immunoassay.

    Science.gov (United States)

    Kim, I S; Ritchie, L; Setford, S; Allen, M; Wilson, G; Heywood, R; Pahlavanpour, B; Saini, S

    2001-01-01

    Many polynuclear aromatic hydrocarbons (PAHs) are either known or suspected carcinogens and are a common constituent of mineral oils. Due to the large number of possible PAH structures, standard quantification methods fail since they either lack specificity or are too complex, requiring individual fractionation, identification, and quantification. A rapid, low-cost, novel analytical screening method, incorporating a silica-based solid-phase extraction (SPE) method linked to co-solvent dilution and quantification of total and carcinogenic PAH levels by immunoassay, is reported here. The method yielded high extraction efficiencies and minimal matrix effects. This novel approach yielded total and carcinogenic PAH levels 5.7-fold and 126-fold lower, respectively, than those recorded by the industry-recognised BS2000 Pt. 346 (IP346) method, which estimates the polyaromatic carbon (PAC) content of oils by gravimetry. The method is expected to be of benefit where an indication of PAH levels in oils is important for purchasing, management or disposal purposes, and also for risk assessment and for appropriate labelling of oils in line with current legislation.

  6. Improving perfusion quantification in arterial spin labeling for delayed arrival times by using optimized acquisition schemes

    Energy Technology Data Exchange (ETDEWEB)

    Kramme, Johanna [Fraunhofer MEVIS-Institute for Medical Image Computing, Bremen (Germany); Univ. Bremen (Germany). Faculty of Physics and Electronics; Gregori, Johannes [mediri GmbH, Heidelberg (Germany); Diehl, Volker [Fraunhofer MEVIS-Institute for Medical Image Computing, Bremen (Germany); ZEMODI (Zentrum fuer moderne Diagnostik), Bremen (Germany); Madai, Vince I.; Sobesky, Jan [Charite-Universitaetsmedizin Berlin (Germany). Center for Stroke Research Berlin (CSB); Charite-Universitaetsmedizin Berlin (Germany). Dept. of Neurology; Samson-Himmelstjerna, Frederico C. von [Fraunhofer MEVIS-Institute for Medical Image Computing, Bremen (Germany); Charite-Universitaetsmedizin Berlin (Germany). Center for Stroke Research Berlin (CSB); Charite-Universitaetsmedizin Berlin (Germany). Dept. of Neurology; Lentschig, Markus [ZEMODI (Zentrum fuer moderne Diagnostik), Bremen (Germany); Guenther, Matthias [Fraunhofer MEVIS-Institute for Medical Image Computing, Bremen (Germany); Univ. Bremen (Germany). Faculty of Physics and Electronics; mediri GmbH, Heidelberg (Germany)

    2015-07-01

    The improvement in Arterial Spin Labeling (ASL) perfusion quantification, especially for delayed bolus arrival times (BAT), achieved with an acquisition redistribution scheme that mitigates the T1 decay of the label in multi-TI ASL measurements is investigated. A multi-inflow-time (TI) 3D-GRASE sequence is presented which adapts the distribution of acquisitions accordingly while keeping the scan time constant. The MR sequence increases the number of averages at long TIs and decreases their number at short TIs, thus compensating for the T1 decay of the label. The improvement in perfusion quantification is evaluated in simulations as well as in vivo in healthy volunteers and patients with prolonged BATs due to age or steno-occlusive disease. The improvement in perfusion quantification depends on BAT. At healthy BATs the differences are small, but they become larger for the longer BATs typically found in certain diseases. The relative error of perfusion is improved by up to 30% at BATs > 1500 ms in comparison with the standard acquisition scheme. This adapted acquisition scheme improves the perfusion measurement in comparison to standard multi-TI ASL implementations. It provides relevant benefit in clinical conditions that cause prolonged BATs and is therefore of high clinical relevance for neuroimaging of steno-occlusive diseases.
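
    The redistribution idea can be sketched as a weighting problem: spend more of a fixed scan budget at long TIs, where the label has decayed by roughly exp(-TI/T1). The allocation rule and all numbers below are a simple illustration of that idea, not the paper's exact scheme.

        import numpy as np

        tis = np.array([300, 600, 900, 1200, 1500, 1800, 2100, 2400])  # ms
        t1_blood = 1650.0      # assumed blood T1 at 3 T (ms)
        total_averages = 80    # fixed scan budget

        # Weight averages to counter the exp(-TI/T1) decay of the label.
        weights = np.exp(tis / t1_blood)
        averages = np.round(total_averages * weights / weights.sum()).astype(int)
        print(dict(zip(tis.tolist(), averages.tolist())))  # rounding is approximate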

  7. Design Science Methodology Applied to a Chemical Surveillance Tool

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.; Henry, Michael J.

    2017-05-11

    Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency; a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.

  8. Mixed methods, mixed methodology health services research in practice.

    Science.gov (United States)

    Johnstone, P Lynne

    2004-02-01

    Mixed methods, mixed methodology research is a little documented but increasingly accepted approach employed to investigate organizational phenomena. The author presents a synthesis of the literature that informed the decision to adopt a mixed methods, mixed methodology, dominantly naturalistic approach to health services research in which she explored the process and organizational consequences of new artifact adoption in surgery. She describes how a collective case study involving five Australian hospitals yielded quantitative and qualitative data that were analyzed using inductive and/or deductive reasoning. She goes beyond the theoretical rationale for employing a mixed methods, mixed methodology approach to present a summative conceptual model of the research process and to describe the structural aspects of the dissertation in which the research was reported, which should benefit researchers contemplating the value of such an approach.

  9. PEM Fuel Cells Redesign Using Biomimetic and TRIZ Design Methodologies

    Science.gov (United States)

    Fung, Keith Kin Kei

    Two formal design methodologies, biomimetic design and the Theory of Inventive Problem Solving (TRIZ), were applied to the redesign of a Proton Exchange Membrane (PEM) fuel cell. Proof-of-concept prototyping was performed on two of the concepts for water management. The liquid water collection with strategically placed wicks concept demonstrated potential benefits for a fuel cell. Conversely, the periodic flow direction reversal concept might cause a reduction in water removal from a fuel cell; the causes of this reduction remain unclear. In addition, three of the concepts generated with biomimetic design were studied further and shown to stimulate more creative ideas in the thermal and water management of fuel cells. The biomimetic design and TRIZ methodologies were successfully applied to fuel cells and provided different perspectives on the redesign of fuel cells. The methodologies should continue to be used to improve fuel cells.

  10. Do conditional benefits reduce equilibrium unemployment?

    NARCIS (Netherlands)

    van der Ploeg, F.

    2006-01-01

    Although unconditional unemployment benefits destroy jobs in competitive and noncompetitive labor markets, conditional benefits can spur job growth in noncompetitive labor markets. Unconditional benefits reduce the penalty of shirking and misconduct, while conditional benefits increase this penalty.

  11. Methane fugitive emissions quantification using the novel 'plume camera' (spatial correlation) method

    Science.gov (United States)

    Crosson, E.; Rella, C.

    2012-12-01

    Fugitive emissions of methane into the atmosphere are a major concern facing the natural gas production industry. Given that the global warming potential of methane is many times greater than that of carbon dioxide, the importance of quantifying methane emissions becomes clear. The rapidly increasing reliance on shale gas (or other unconventional sources) is only intensifying the interest in fugitive methane releases. Natural gas (which is predominantly methane) is an attractive energy source, as it emits 40% less carbon dioxide per Joule of energy generated than coal. However, if just a small percentage of the natural gas consumed is lost due to fugitive emissions during production, processing, or transport, this global warming benefit is lost (Howarth et al. 2012). It is therefore imperative, as production of natural gas increases, that the fugitive emissions of methane are quantified accurately. Traditional direct measurement techniques often involve physical access of the leak itself to quantify the emissions rate, and are generally require painstaking effort to first find the leak and then quantify the emissions rate. With over half a million natural gas producing wells in the U.S. (U.S. Energy Information Administration), not including the associated processing, storage, and transport facilities, and with each facility having hundreds or even thousands of fittings that can potentially leak, the need is clear to develop methodologies that can provide a rapid and accurate assessment of the total emissions rate on a per-well head basis. In this paper we present a novel method for emissions quantification which uses a 'plume camera' with three 'pixels' to quantify emissions using direct measurements of methane concentration in the downwind plume. By analyzing the spatial correlation between the pixels, the spatial extent of the instantaneous plume can be inferred. This information, when combined with the wind speed through the measurement plane, provides a direct
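
    Downwind concentrations translate to an emission rate through a mass-balance sum, Q = sum(c·u·dA) over the measurement plane. The sketch below shows only that final arithmetic for three 'pixels'; the spatial-correlation inference of plume extent is not reproduced, and every number is invented.

        import numpy as np

        background = 1.9e-6                      # ambient CH4 mole fraction
        pixels = np.array([2.4e-6, 3.1e-6, 2.7e-6])
        enhancement = pixels - background        # in-plume enhancement per pixel
        pixel_area = 4.0                         # m^2, from inferred plume extent
        wind_speed = 3.2                         # m/s through the plane
        air_molar_density = 41.6                 # mol/m^3 at ~1 atm, 20 C

        flux_mol_s = (enhancement * air_molar_density * wind_speed * pixel_area).sum()
        print(f"{flux_mol_s * 16.04:.3f} g CH4/s")  # 16.04 g/mol CH4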

  12. A Methodology To Incorporate The Safety Culture Into Probabilistic Safety Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sunghyun; Kim, Namyeong; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2015-10-15

    In order to incorporate organizational factors into PSA, a methodology needs to be developed. This study introduces a methodology that uses the AHP to weigh organizational factors and the SLIM to rate those factors. Safety issues related to nuclear safety culture have occurred increasingly often, so a quantification tool has to be developed in order to include organizational factors in Probabilistic Safety Assessments. In this study, the state of the art of organizational evaluation methodologies has been surveyed. The study covers organizational factors, the maintenance process, maintenance process analysis models, and a quantitative methodology using the Analytic Hierarchy Process (AHP) and the Success Likelihood Index Methodology (SLIM). The purpose of this study is to develop a methodology to incorporate safety culture into PSA in order to obtain a more objective risk estimate than before. The organizational factors considered in nuclear safety culture might affect the potential risk of human error and hardware failure. The safety culture impact index used to monitor plant safety culture can be assessed by applying the developed methodology to a nuclear power plant.
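
    A minimal sketch of the AHP-weighted SLIM calculation: an AHP-style weight vector and task ratings give a success likelihood index (SLI), and a log-linear calibration against two anchor tasks of known error probability converts the SLI to a human error probability (HEP). Weights, ratings and anchors are invented, not values from the study.

        import numpy as np

        weights = np.array([0.40, 0.35, 0.25])  # AHP-derived factor weights
        ratings = np.array([0.7, 0.5, 0.8])     # factor ratings for the task (0-1)

        sli = weights @ ratings                 # success likelihood index

        # Calibrate log10(HEP) = a*SLI + b from two anchor tasks of known HEP.
        sli_anchors = np.array([0.2, 0.9])
        log_hep_anchors = np.log10([1e-1, 1e-4])
        a, b = np.polyfit(sli_anchors, log_hep_anchors, 1)

        print(f"SLI = {sli:.2f}, HEP = {10 ** (a * sli + b):.2e}")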

  13. Optimization of Design for SMR via Data Assimilation and Uncertainty Quantification

    Science.gov (United States)

    Heo, Jaeseok

    This thesis presents work on reducing the uncertainty in thermal-hydraulic transient predictions for nuclear power plants (NPP), with a focus on SMRs characterized by the integral PWR design. One objective of the study was to determine the economic benefit of conducting transient experiments on an SMR NPP. To accomplish this, a thermal-hydraulic simulator is used to complete data assimilation for input parameters to the simulator using experimental data generated by the plant. Since no such experimental data exist, they were generated using an altered simulator, referred to as the virtual NPP, facilitating the investigation of the benefits of conducting various experiments and of sensor deployment. The mathematical approach used to complete this analysis depends upon whether the system responses, i.e. sensor signals, and the system attributes, e.g. DNBR, are or are not linearly dependent upon the parameters. A linearity test showed that there exist highly nonlinear as well as mildly nonlinear responses, hence both deterministic and probabilistic methods were used to complete data assimilation and uncertainty quantification. For the mildly nonlinear transient, the Bayesian approach was used to obtain the parameters' posterior distributions, assuming Gaussian distributions for the input parameters and responses. To obtain the posterior distribution, given measurements of the observables and prior distributions of the parameters, one solves an inverse problem calibrating the parameter values to achieve better agreement between measured and predicted sensor response values. For the highly nonlinear transient, the Markov Chain Monte Carlo method was utilized, based upon Bayes' theorem, to estimate the posterior distributions of the parameters. This thesis also discusses the optimization methodology used to design the plant's experiments so as to reduce posterior system attribute uncertainties. The optimization problem decision variables include the selection of
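
    The calibration step for a mildly nonlinear case can be sketched with a random-walk Metropolis sampler: given noisy 'virtual plant' data, Bayes' theorem yields the posterior of a simulator parameter. The exponential-cooldown simulator, noise level and data below are all invented stand-ins for the thermal-hydraulic code.

        import numpy as np

        def simulator(theta, t):
            # Invented stand-in for the thermal-hydraulic simulator.
            return 300.0 * np.exp(-theta * t)

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 5.0, 30)
        data = simulator(0.5, t) + rng.normal(0.0, 5.0, t.size)  # 'virtual NPP'

        def log_post(theta):
            if theta <= 0.0:  # flat prior on theta > 0
                return -np.inf
            resid = data - simulator(theta, t)
            return -0.5 * np.sum((resid / 5.0) ** 2)

        theta, lp, samples = 1.0, log_post(1.0), []
        for _ in range(20_000):
            prop = theta + 0.05 * rng.standard_normal()
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept
                theta, lp = prop, lp_prop
            samples.append(theta)

        post = np.array(samples[5_000:])
        print(f"posterior mean {post.mean():.3f} +/- {post.std():.3f}")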

  14. Culture, and a Metrics Methodology for Biological Countermeasure Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Mary J.

    2007-03-15

    Outcome Metrics Methodology defines a way to evaluate outcome metrics associated with scenario analyses related to biological countermeasures. Previous work developed a schema to allow evaluation of common elements of impacts across a wide range of potential threats and scenarios. Classes of metrics were identified that could be used by decision makers to differentiate the common bases among disparate scenarios. Typical impact metrics used in risk calculations include the anticipated number of deaths, casualties, and the direct economic costs should a given event occur. There are less obvious metrics that are often as important and require more intensive initial work to be incorporated. This study defines a methodology for quantifying, evaluating, and ranking metrics other than direct health and economic impacts. As has been observed with the consequences of Hurricane Katrina, impacts to the culture of specific sectors of society are less obvious on an immediate basis but equally important over the ensuing and long term. Culture is used as the example class of metrics within which:
    • requirements for a methodology are explored;
    • likely methodologies are examined;
    • underlying assumptions for the respective methodologies are discussed;
    • the basis for recommending a specific methodology is demonstrated.
    Culture, as a class of metrics, is shown to consist of political, sociological, and psychological elements that are highly valued by decision makers. In addition, cultural practices, dimensions, and kinds of knowledge offer complementary sets of information that contribute to the context within which experts can provide input. The quantification and evaluation of socio-political, socio-economic, and socio-technical impacts depend predominantly on subjective, expert judgment. Epidemiological data are limited, resulting in samples with statistical limits. Dose response assessments and curves depend on the quality of data and its relevance to human modes of exposure.

  15. 76 FR 71431 - Civil Penalty Calculation Methodology

    Science.gov (United States)

    2011-11-17

    ... TRANSPORTATION Federal Motor Carrier Safety Administration Civil Penalty Calculation Methodology AGENCY: Federal... its civil penalty methodology. Part of this evaluation includes a forthcoming explanation of the... methodology for calculation of certain civil penalties. To induce compliance with federal regulations,...

  16. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Propagation of parameter uncertainties through large computer models can be very resource intensive. Frameworks and tools for uncertainty quantification are...

  17. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  18. Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...

  19. 78 FR 76574 - Burial Benefits

    Science.gov (United States)

    2013-12-18

    ... purchasing power of VA's burial benefits. See S. Rep. No. 111-71, at 28-29. In response to this concern... index the allowances paid under that section to inflation and "preserve the purchasing power of the...

  20. Medicare Benefits and Your Eyes

    Science.gov (United States)

    Medicare Benefits & Your Eyes. Eye health is important! As you age, your risk ... that you need. Ask about eye exams! Routine eye exams: Medicare does not generally cover the costs ...

  1. Do recommender systems benefit users?

    CERN Document Server

    Yeung, Chi Ho

    2015-01-01

    Recommender systems are present in many web applications to guide our choices. They increase sales and benefit sellers, but whether they benefit customers by providing relevant products is questionable. Here we introduce a model to examine the benefit of recommender systems for users, and find that recommendations from the system can be equivalent to random draws if one relies too strongly on the system. Nevertheless, with sufficient information about user preferences, recommendations become accurate, and an abrupt transition to this accurate regime is observed for some algorithms. On the other hand, we find that a high accuracy evaluated by common accuracy metrics does not necessarily correspond to a high real accuracy or a benefit for users, which serves as an alarm for operators and researchers of recommender systems. We tested our model with a real dataset and observed similar behaviors. Finally, a recommendation approach with improved accuracy is suggested. These results imply that recommender systems ...

  2. Q methodology in health economics.

    Science.gov (United States)

    Baker, Rachel; Thompson, Carl; Mannion, Russell

    2006-01-01

    The recognition that health economists need to understand the meaning of data if they are to adequately understand research findings which challenge conventional economic theory has led to the growth of qualitative modes of enquiry in health economics. The use of qualitative methods of exploration and description alongside quantitative techniques gives rise to a number of epistemological, ontological and methodological challenges: difficulties in accounting for subjectivity in choices, the need for rigour and transparency in method, and problems of disciplinary acceptability to health economists. Q methodology is introduced as a means of overcoming some of these challenges. We argue that Q offers a means of exploring subjectivity, beliefs and values while retaining the transparency, rigour and mathematical underpinnings of quantitative techniques. The various stages of Q methodological enquiry are outlined alongside potential areas of application in health economics, before discussing the strengths and limitations of the approach. We conclude that Q methodology is a useful addition to economists' methodological armoury and one that merits further consideration and evaluation in the study of health services.

  3. Employees' motivation and employees' benefits

    OpenAIRE

    Nedzelská, Eva

    2014-01-01

    The subject of this bachelor thesis is the analysis of methods for stimulating and motivating employees. The theoretical part of the thesis deals with the concept of motivation, concepts related to motivation, and selected existing theories of motivation. It also deals with employee benefits: their function, classification, and the benefits most frequently offered to employees. The practical part of the thesis, based mainly on written and online questionnaires, concentrates on the motivation of employees at Nedcon Boh...

  4. Employee benefits or wage increase?

    Directory of Open Access Journals (Sweden)

    Jiří Duda

    2011-01-01

    Full Text Available The paper draws on a survey conducted during the years 2007–2009, focused on employee satisfaction with the provision of employee benefits. The research included 21 companies: 7 from the engineering sector, 7 from the food industry, 3 from the budgetary sphere, 3 from the services sector, and one company operating in the pharmaceutical industry. The questionnaire survey consisted of 14 questions, including 5 identification questions. The paper presents results for the questions dealing with employees' awareness of employee benefits and with the choice between a wage increase and an increase in the value of benefits provided. Employees are informed about all options for providing employee benefits; only in 3 cases did employees state dissatisfaction with the information. This answer was related to the responses to the second monitored question: employees of these companies preferred pay increases to benefit increases. Neither the gender of the respondents nor the sector of operation influenced the preference for increases in wages or in benefits. The exception was employees of companies operating in the financial sector, who preferred employee benefits to a wage increase. It was found that employees of companies who participated in the research in 2009 preferred wage increases to an extension of employee benefits, although the value of the net wage increase is lower than the monetary value of the benefits increase. The paper is part of the research plan MSM 6215648904 The Czech economy in the process of integration and globalization, and the development of agricultural sector and the sector of services under the new conditions of the integrated European market.

  5. Would banning atrazine benefit farmers?

    OpenAIRE

    Ackerman, Frank; Whited, Melissa; Knight, Patrick

    2014-01-01

    Atrazine, an herbicide used on most of the US corn (maize) crop, is the subject of ongoing controversy, with increasing documentation of its potentially harmful health and environmental impacts. Supporters of atrazine often claim that it is of great value to farmers; most recently, Syngenta, the producer of atrazine, sponsored an “Atrazine Benefits Team” (ABT) of researchers who released a set of five papers in 2011, reporting huge economic benefits from atrazine use in US agriculture. A crit...

  6. Strategy study of quantification harmonization of SUV in PET/CT images; Estudo da estrategia de harmonizacao da quantificacao do SUV em imagens de PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-07-01

    and quantitative assessments in different scopes. We concluded that the harmonization strategy for SUV quantification presented in this paper was effective in reducing the variability of the quantification of small structures. However, for the comparison of SUV quantification between different scanners and institutions, it is essential that, in addition to the harmonization of quantification, a standardized patient preparation methodology is maintained, in order to minimize the SUV variability due to biological factors. (author)

  7. Education and training benefiting a career as entrepreneur

    DEFF Research Database (Denmark)

    Cheraghi, Maryam; Schøtt, Thomas

    2015-01-01

    ’s entrepreneurial career and widen or narrow due both to environmental forces that reconfigure the gap across career phases and to the gendering of competencies and benefits from education and training. Methodology – A representative sample of 110,689 adults around the world was surveyed in the Global...... are reduced slightly over time as women gain greater benefit from training than men. Implications for research – The cumulative effects of early gender gaps in education and training call for research on gendered learning, and recurrent gender effects across career phases call for research on gendering....... The finding that women gain greater benefit than men from training is informative for policies that foster gender equality and empower women pursuing careers. Originality/value – Conceptualising the entrepreneurial career as a sequence of several stages enables the assessment of gender gaps owing to initial...

  8. Quantification of competitive value of documents

    Directory of Open Access Journals (Sweden)

    Pavel Šimek

    2009-01-01

    The majority of Internet users use the global network to search for information using full-text search engines such as Google, Yahoo!, or Seznam. Web presentation operators try, with the help of different optimization techniques, to reach the top places in the results of full-text search engines. Herein lies the great importance of Search Engine Optimization and Search Engine Marketing, because typical users only try links on the first few pages of full-text search results for given keywords, and in catalogs they primarily use links placed hierarchically higher in each category. The key to success is the application of optimization methods that deal with keywords, the structure and quality of content, domain names, individual pages, and the quantity and reliability of backlinks. The process is demanding, long-lasting and without a guaranteed outcome. A website operator without advanced analytical tools cannot identify the contribution of the individual documents of which the entire website consists. If web presentation operators want an overview of their documents and of the website globally, it is appropriate to quantify these positions in a specific way, depending on specific keywords. This is the purpose of the quantification of the competitive value of documents, which in turn yields the global competitive value of a website. Quantification of competitive values is performed against a specific full-text search engine, and different engines can, and often do, give different results. According to published reports by the ClickZ agency and Market Share, Google is the search engine most widely used by English-speaking users, with a market share of more than 80%. The overall procedure of quantifying competitive values is general; however, the initial step, the analysis of keywords, depends on the choice of the full-text search engine.
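
    As an illustration of this kind of scoring, the sketch below computes a hypothetical competitive value as an inverse-rank sum over tracked keywords. The abstract does not publish an exact formula, so the weighting scheme, keyword weights and rank values here are assumptions for illustration only.

    ```python
    # Hypothetical sketch of a competitive-value score for web documents.
    # The inverse-rank weighting is an illustrative assumption, not the
    # formula used in the paper.

    def document_value(ranks, keyword_weights):
        """ranks: {keyword: SERP position of the document (1 = top)}."""
        return sum(w / ranks[kw] for kw, w in keyword_weights.items() if kw in ranks)

    def site_value(documents, keyword_weights):
        """Global competitive value of a site: sum over its documents."""
        return sum(document_value(r, keyword_weights) for r in documents)

    keyword_weights = {"employee benefits": 1.0, "seo": 0.5}  # assumed weights
    documents = [
        {"employee benefits": 3, "seo": 12},  # ranks of document A
        {"employee benefits": 45},            # ranks of document B
    ]
    print(f"site competitive value: {site_value(documents, keyword_weights):.3f}")
    ```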

  9. Food biotechnology: benefits and concerns.

    Science.gov (United States)

    Falk, Michael C; Chassy, Bruce M; Harlander, Susan K; Hoban, Thomas J; McGloughlin, Martina N; Akhlaghi, Amin R

    2002-06-01

    Recent advances in agricultural biotechnology have highlighted the need for experimental evidence and sound scientific judgment to assess the benefits and risks to society. Nutrition scientists and other animal biologists need a balanced understanding of the issues to participate in this assessment. To date most modifications to crop plants have benefited producers. Crops have been engineered to decrease pesticide and herbicide usage, protect against stressors, enhance yields and extend shelf life. Beyond the environmental benefits of decreased pesticide and herbicide application, consumers stand to benefit by development of food crops with increased nutritional value, medicinal properties, enhanced taste and esthetic appeal. There remains concern that these benefits come with a cost to the environment or increased risk to the consumer. Most U.S. consumers are not aware of the extent that genetically modified foods have entered the marketplace. Consumer awareness of biotechnology seems to have increased over the last decade, yet most consumers remain confused over the science. Concern over the impact on the safety of the food supply remains low in the United States, but is substantially elevated in Europe. Before a genetically engineered crop is introduced into commerce it must pass regulatory scrutiny by as many as four different federal regulatory bodies to ensure a safe food supply and minimize the risk to the environment. Key areas for more research are evaluation of the nutritional benefits of new crops, further investigation of the environmental impact, and development of better techniques to identify and track genetically engineered products.

  10. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  11. QUANTIFICATION OF TISSUE PROPERTIES IN SMALL VOLUMES

    Energy Technology Data Exchange (ETDEWEB)

    J. MOURANT; ET AL

    2000-12-01

    The quantification of tissue properties by optical measurements will facilitate the development of noninvasive methods of cancer diagnosis and detection. Optical measurements are sensitive to tissue structure which is known to change during tumorigenesis. The goals of the work presented in this paper were to verify that the primary scatterers of light in cells are structures much smaller than the nucleus and then to develop an optical technique that can quantify parameters of structures the same size as the scattering features in cells. Polarized, elastic back-scattering was found to be able to quantify changes in scattering properties for turbid media consisting of scatterers of the size found in tissue.

  12. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...... optimized according to the noise level in each voxel. The comparison is carried out using artificial data as well as data from healthy volunteers. It is shown that GPD is comparable to SVD with a variable optimized threshold when determining the maximum of the IRF, which is directly related to the perfusion...
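
    The abstract compares GPD against SVD deconvolution with an optimized threshold; a minimal numerical sketch of that SVD baseline is shown below. The arterial input function, residue function, noise level and the 0.2 threshold are synthetic assumptions standing in for measured DSC-MRI curves.

    ```python
    import numpy as np

    # Minimal sketch of threshold-SVD deconvolution, the baseline the abstract
    # compares GPD against. The tissue curve is c = A @ irf, with A the
    # convolution matrix built from the arterial input function (AIF).
    dt = 1.0
    t = np.arange(40) * dt
    aif = t * np.exp(-t / 3.0)            # assumed arterial input function
    irf_true = np.exp(-t / 8.0)           # assumed smooth residue function
    A = dt * np.array([[aif[i - j] if i >= j else 0.0
                        for j in range(len(t))] for i in range(len(t))])
    rng = np.random.default_rng(0)
    c = A @ irf_true + rng.normal(0, 1e-3, len(t))   # noisy tissue curve

    U, s, Vt = np.linalg.svd(A)
    keep = s > 0.2 * s[0]                 # discard small singular values
    irf_est = Vt.T[:, keep] @ ((U[:, keep].T @ c) / s[keep])
    print("perfusion-related maximum of the IRF:", irf_est.max())
    ```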

  13. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.

  14. Quantification of thermal damage in skin tissue

    Institute of Scientific and Technical Information of China (English)

    Xu Feng; Wen Ting; Lu Tianjian; Seffen Keith

    2008-01-01

    Skin thermal damage or skin burns are the most commonly encountered type of trauma in civilian and military communities. Besides, advances in laser, microwave and similar technologies have led to recent developments of thermal treatments for disease and damage involving skin tissue, where the objective is to induce thermal damage precisely within targeted tissue structures but without affecting the surrounding, healthy tissue. Further, extended pain sensation induced by thermal damage has also brought great problem for burn patients. Thus, it is of great importance to quantify the thermal damage in skin tissue. In this paper, the available models and experimental methods for quantification of thermal damage in skin tissue are discussed.
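
    The most widely used of these models is the Arrhenius damage integral introduced by Henriques, Omega = integral of A * exp(-Ea / (R * T(t))) dt, with Omega = 1 conventionally read as irreversible damage. The sketch below evaluates it numerically; the frequency factor and activation energy are Henriques' classical skin values, while the temperature history is an assumed example.

    ```python
    import numpy as np

    # Numerical sketch of the Arrhenius damage integral for skin,
    # Omega = sum over time of A * exp(-Ea / (R * T)) * dt.
    A = 3.1e98      # 1/s, Henriques' frequency factor for skin
    Ea = 6.27e5     # J/mol, Henriques' activation energy
    R = 8.314       # J/(mol K)

    dt = 0.1                                  # s
    t = np.arange(0, 10, dt)
    T = 273.15 + 45 + 10 * np.exp(-t / 3.0)   # assumed temperature history (K)

    omega = np.sum(A * np.exp(-Ea / (R * T)) * dt)
    print(f"Omega = {omega:.3f}  (Omega >= 1 is read as irreversible damage)")
    ```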

  15. Tutorial examples for uncertainty quantification methods.

    Energy Technology Data Exchange (ETDEWEB)

    De Bord, Sarah [Univ. of California, Davis, CA (United States)

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
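
    In the spirit of the window heat-transfer tutorial mentioned above, a minimal Monte Carlo propagation example is sketched below; the model Q = U * A * dT and all distributions are illustrative assumptions, not the report's actual inputs.

    ```python
    import numpy as np

    # Monte Carlo UQ sketch: propagate uncertain inputs through Q = U * A * dT.
    rng = np.random.default_rng(42)
    n = 100_000
    U = rng.normal(2.8, 0.2, n)       # W/(m^2 K), uncertain U-value
    A = 1.5                           # m^2, fixed window area
    dT = rng.uniform(15.0, 25.0, n)   # K, uncertain temperature difference

    Q = U * A * dT                    # heat loss through the window
    print(f"mean: {Q.mean():.1f} W, std: {Q.std():.1f} W")
    print(f"95% interval: [{np.percentile(Q, 2.5):.1f}, "
          f"{np.percentile(Q, 97.5):.1f}] W")
    ```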

  16. Methodology for ranking restoration options

    DEFF Research Database (Denmark)

    Jensen, Per Hedemann

    1999-01-01

    The work described in this report has been performed as part of the RESTRAT Project FI4P-CT95-0021a (PL 950128), co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; and formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated

  17. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  18. Reflections on Design Methodology Research

    DEFF Research Database (Denmark)

    2011-01-01

    We shall reflect on the results of Design Methodology research and their impact on design practice. In the past 50 years the number of researchers in the field has expanded enormously – as has the number of publications. During the same period design practice and its products have changed...... and produced are also now far more complex and distributed, putting designers under ever increasing pressure. We shall address the question: Are the results of Design Methodology research appropriate and are they delivering the expected results in design practice? In our attempt to answer this question we...... shall draw on our extensive experience of design research and design teaching, and on the recent book The Future of Design Methodology, edited by Professor Herbert Birkhofer. We shall also refer to a model that links the Results, Practices, Methods, and Sciences of designing. Some initial conclusions...

  19. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies’ findings, we argue

  20. Acoustic emission methodology and application

    CERN Document Server

    Nazarchuk, Zinoviy; Serhiyenko, Oleh

    2017-01-01

    This monograph analyses in detail the physical aspects of elastic wave radiation during the deformation or fracture of materials. It presents the methodological bases for the practical use of acoustic emission devices, and describes the results of theoretical and experimental research on evaluating the crack growth resistance of materials and selecting the useful AE signals. The efficiency of this methodology is shown through the diagnostics of industrial objects of various purposes. The authors present results of experimental research obtained with the help of new methods and facilities.

  1. Methodological Guidelines for Advertising Research

    DEFF Research Database (Denmark)

    Rossiter, John R.; Percy, Larry

    2017-01-01

    In this article, highly experienced advertising academics and advertising research consultants John R. Rossiter and Larry Percy present and discuss what they believe to be the seven most important methodological guidelines that need to be implemented to improve the practice of advertising research....... Their focus is on methodology, defined as first choosing a suitable theoretical framework to guide the research study and then identifying the advertising responses that need to be studied. Measurement of those responses is covered elsewhere in this special issue in the article by Bergkvist and Langner. Most...

  2. Methodological pluralism and narrative inquiry

    Science.gov (United States)

    Michie, Michael

    2013-09-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on `what meaning is being made' rather than `what is happening here' (quadrant 2 rather than quadrant 1). It is suggested that in using the integral theory model, a qualitative research project focuses primarily on one quadrant and is enhanced by approaches suggested in the other quadrants.

  3. New methodologies for patients rehabilitation.

    Science.gov (United States)

    Fardoun, H M; Mashat, A S; Lange, B

    2015-01-01

    The present editorial is part of the focus theme of Methods of Information in Medicine titled "New Methodologies for Patients Rehabilitation", with a specific focus on technologies and human factors related to the use of Information and Communication Technologies (ICT) for improving patient rehabilitation. The focus theme explores different dimensions of empowerment methodologies for disabled people in terms of rehabilitation and health care, and explores the extent to which ICT is a useful tool in this process. The focus theme lists a set of research papers that present different ways of using ICT to develop advanced systems that help disabled people in their rehabilitation process.

  4. Observational methodology in sport sciences

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2013-11-01

    This paper reviews the conceptual framework, the key literature and the methods (observation tools such as category systems and field formats, coding software, etc.) that should be followed when conducting research from the perspective of observational methodology. The observational designs used by the authors’ research group over the last twenty years are discussed, and the procedures for analysing data and assessing their quality are described. Mention is also made of the latest methodological trends in this field, such as the use of mixed methods.

  5. Towards the development of a global probabilistic tsunami risk assessment methodology

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2017-04-01

    The assessment of tsunami risk is on many levels still ambiguous and under discussion. Over the last two decades, various methodologies and models have been developed to quantify tsunami risk, mostly on a local or regional level, with either a deterministic or a probabilistic background. Probabilistic modelling faces significant difficulties, as the underlying tsunami hazard modelling demands an immense amount of computational time; this limits the assessment substantially, often restricting it to institutes with supercomputing access or forcing modellers to reduce the modelling resolution quantitatively or qualitatively. Furthermore, data on the vulnerability of infrastructure and buildings is empirically limited to a few disasters in recent years, so a reliable quantification of socio-economic vulnerability remains questionable. Nonetheless, significant improvements have been made recently, both methodologically and computationally. This study introduces a methodological framework for a globally uniform probabilistic tsunami risk assessment. Here, the power of recently developed hardware for desktop-based parallel computing plays a crucial role in the calculation of numerical tsunami wave propagation, while large-scale parametric models and paleo-seismological data enhance the return-period assessment of tsunami-genic megathrust earthquake events. Adaptation of empirical tsunami vulnerability functions, in conjunction with methodologies from flood modelling, supports a more reliable vulnerability quantification. In addition, methodologies for exposure modelling in coastal areas are introduced, focusing on the diversity of coastal exposure landscapes and data availability. Overall, this study presents a first overview of how a global tsunami risk modelling framework may be accomplished, covering methodological, computational and data-driven aspects.

  6. Benefits and challenges of qualitative methodologies in cross-cultural psychology studies

    NARCIS (Netherlands)

    de Quadros Rigoni, R.

    2016-01-01

    Qualitative research has been considered increasingly valuable for cross-cultural psychology studies, but its contributions and challenges to the field remain under discussed. This chapter does that by analysing a qualitative study which compares interpretive beliefs and behaviour of street-level

  7. Computational benefits using artificial intelligent methodologies for the solution of an environmental design problem: saltwater intrusion.

    Science.gov (United States)

    Papadopoulou, Maria P; Nikolos, Ioannis K; Karatzas, George P

    2010-01-01

    Artificial Neural Networks (ANNs) comprise a powerful tool to approximate the complicated behavior and response of physical systems allowing considerable reduction in computation time during time-consuming optimization runs. In this work, a Radial Basis Function Artificial Neural Network (RBFN) is combined with a Differential Evolution (DE) algorithm to solve a water resources management problem, using an optimization procedure. The objective of the optimization scheme is to cover the daily water demand on the coastal aquifer east of the city of Heraklion, Crete, without reducing the subsurface water quality due to seawater intrusion. The RBFN is utilized as an on-line surrogate model to approximate the behavior of the aquifer and to replace some of the costly evaluations of an accurate numerical simulation model which solves the subsurface water flow differential equations. The RBFN is used as a local approximation model in such a way as to maintain the robustness of the DE algorithm. The results of this procedure are compared to the corresponding results obtained by using the Simplex method and by using the DE procedure without the surrogate model. As it is demonstrated, the use of the surrogate model accelerates the convergence of the DE optimization procedure and additionally provides a better solution at the same number of exact evaluations, compared to the original DE algorithm.
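
    The coupling described here can be sketched compactly: sample the expensive model, fit a radial-basis-function surrogate, and let DE search the surrogate. The stand-in objective function, sample sizes and bounds below are assumptions; the actual study couples DE to a subsurface flow simulator.

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator
    from scipy.optimize import differential_evolution

    def expensive_simulator(x):
        # Cheap stand-in for the costly groundwater simulation.
        return np.sum((x - 0.3) ** 2) + 0.1 * np.sum(np.sin(10 * x))

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 1, size=(60, 2))               # sampled designs
    y = np.array([expensive_simulator(x) for x in X])
    surrogate = RBFInterpolator(X, y)                 # on-line RBF surrogate

    result = differential_evolution(lambda x: float(surrogate(x[None, :])[0]),
                                    bounds=[(0, 1), (0, 1)], seed=1)
    print("surrogate optimum:", result.x)
    print("re-checked on the simulator:", expensive_simulator(result.x))
    ```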

  8. A Methodology for Assessing the Military Benefits of Science and Technology Investments

    Science.gov (United States)

    2008-09-01

    ratio (BCR) greater than 2 to 1. In the area of biology, intrinsic chemical markers (ICMs) provide a means for verifying that food is fully...1.7 million per year. Since the current wastage due to less than optimal quality is about 20%, the savings due to avoiding wastage are about $2.2...(3) focusing commercial capabilities on Army needs, such as food preparation, packaging, and preservation technologies that enabled the development

  9. METHODOLOGIES FOR QUANTIFYING POLLUTION PREVENTION BENEFITS FROM LANDFILL GAS CONTROL AND UTILIZATION

    Science.gov (United States)

    The report describes developing emission factors for controlled primary pollutants (e.g., nonmethane organic compounds) and secondary air pollutants (e.g., carbon monoxide). The report addresses the following criteria air pollutants and greenhouse gases: carbon dioxide, carbon mo...

  10. Calendar Year 2007 Program Benefits for U.S. EPA Energy Star Labeled Products: Expanded Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Marla; Homan, Gregory; Lai, Judy; Brown, Richard

    2009-09-24

    This report provides a top-level summary of national savings achieved by the Energy Star voluntary product labeling program. To best quantify and analyze savings for all products, we developed a bottom-up product-based model. Each Energy Star product type is characterized by product-specific inputs that result in a product savings estimate. Our results show that through 2007, U.S. EPA Energy Star labeled products saved 5.5 Quads of primary energy and avoided 100 MtC of emissions. Although Energy Star-labeled products encompass over forty product types, only five of those product types accounted for 65 percent of all Energy Star carbon reductions achieved to date, including (listed in order of savings magnitude) monitors, printers, residential light fixtures, televisions, and furnaces. The forecast shows that U.S. EPA's program is expected to save 12.2 Quads of primary energy and avoid 215 MtC of emissions over the period of 2008-2015.
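
    The bottom-up accounting can be illustrated in a few lines: per-product savings are unit counts times per-unit savings, summed over product types and converted to primary energy. All figures below are placeholders for illustration, not the report's inputs.

    ```python
    # Sketch of a bottom-up product-based savings model (placeholder numbers).
    products = {
        # type: (Energy Star units in stock, kWh saved per unit-year)
        "monitors": (250e6, 50.0),
        "printers": (120e6, 30.0),
        "residential light fixtures": (90e6, 60.0),
    }
    primary_kj_per_kwh = 10_660   # assumed site-to-primary conversion factor

    total_kwh = sum(units * saving for units, saving in products.values())
    quads = total_kwh * primary_kj_per_kwh / 1.055e15   # 1 Quad ~ 1.055e15 kJ
    print(f"electricity savings: {total_kwh / 1e9:.1f} TWh/yr")
    print(f"primary energy savings: {quads:.2f} Quads/yr")
    ```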

  11. The Methodological Benefits of Social Media: "Studying Up" in Brazil in the Facebook Age

    Science.gov (United States)

    Straubhaar, Rolf

    2015-01-01

    While conducting research on the organizational cultures of elite nonprofit organizations in Rio de Janeiro, the author encountered many access issues identified in the current literature: in particular, difficulty in encountering research subjects due to the transitional nature of educational nonprofits and the role of secretaries and…

  12. Methodology

    OpenAIRE

    Köppel, Johannes

    2011-01-01

    The main research question of this thesis is the following: why did Swiss banks and Swiss authorities obediently accept the dilution of banking privacy in the case of the SWIFT surveillance, when they are usually fierce advocates of banking secrecy? The author initially established three hypotheses: Hypothesis 1 assumes that Switzerland has not opposed the SWIFT program, either publicly or behind the scenes. This implies that Swiss banks and authorities have silently accepted the erosion of...

  13. Considerations for quantification of lipids in nerve tissue using matrix-assisted laser desorption/ionization mass spectrometric imaging.

    Science.gov (United States)

    Landgraf, Rachelle R; Garrett, Timothy J; Conaway, Maria C Prieto; Calcutt, Nigel A; Stacpoole, Peter W; Yost, Richard A

    2011-10-30

    Matrix-assisted laser desorption/ionization (MALDI) mass spectrometric imaging is a technique that provides the ability to identify and characterize endogenous and exogenous compounds spatially within tissue with relatively little sample preparation. While it is a proven methodology for qualitative analysis, little has been reported for its utility in quantitative measurements. In the current work, inherent challenges in MALDI quantification are addressed. Signal response is monitored over successive analyses of a single tissue section to minimize error due to variability in the laser, matrix application, and sample inhomogeneity. Methods for the application of an internal standard to tissue sections are evaluated and used to quantify endogenous lipids in nerve tissue. A precision of 5% or less standard error was achieved, illustrating that MALDI imaging offers a reliable means of in situ quantification for microgram-sized samples and requires minimal sample preparation.
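
    Numerically, the internal-standard approach the authors evaluate reduces to normalizing the analyte signal to the co-applied standard and reading concentration off a calibration line; the sketch below uses invented calibration points for illustration.

    ```python
    import numpy as np

    # Internal-standard quantification sketch (illustrative values only).
    cal_amount = np.array([0.5, 1.0, 2.0, 5.0, 10.0])      # spiked standard, pmol
    cal_ratio = np.array([0.24, 0.51, 0.98, 2.55, 5.02])   # analyte/IS intensity

    slope, intercept = np.polyfit(cal_amount, cal_ratio, 1)  # linear calibration
    sample_ratio = 1.73                                      # measured in tissue
    amount = (sample_ratio - intercept) / slope
    print(f"estimated lipid amount: {amount:.2f} pmol")
    ```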

  14. CT quantification of central airway in tracheobronchomalacia

    Energy Technology Data Exchange (ETDEWEB)

    Im, Won Hyeong; Jin, Gong Yong; Han, Young Min; Kim, Eun Young [Dept. of Radiology, Chonbuk National University Hospital, Jeonju (Korea, Republic of)

    2016-05-15

    To determine which factors help to diagnose tracheobronchomalacia (TBM) using CT quantification of the central airway. From April 2013 to July 2014, 19 patients (68.0 ± 15.0 years; 6 male, 13 female) were diagnosed with TBM on CT. As matched controls, 38 normal subjects (65.5 ± 21.5 years; 6 male, 13 female) were selected. All 57 subjects underwent CT at end-inspiration and end-expiration. Airway parameters of the trachea and both main bronchi were assessed using software (VIDA diagnostic). Airway parameters of TBM patients and normal subjects were compared using the Student t-test. In expiration, both wall perimeter and wall thickness in TBM patients were significantly smaller than in normal subjects (wall perimeter: trachea, 43.97 mm vs. 49.04 mm, p = 0.020; right main bronchus, 33.52 mm vs. 42.69 mm, p < 0.001; left main bronchus, 26.76 mm vs. 31.88 mm, p = 0.012; wall thickness: trachea, 1.89 mm vs. 2.22 mm, p = 0.017; right main bronchus, 1.64 mm vs. 1.83 mm, p = 0.021; left main bronchus, 1.61 mm vs. 1.75 mm, p = 0.016). Wall thinning and a decreased perimeter of the central airway on expiration, as measured by CT quantification, could serve as new diagnostic indicators of TBM.
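
    The group comparison reported above is a standard two-sample Student t-test; a minimal sketch is shown below, with invented wall-perimeter values standing in for the study data.

    ```python
    from scipy import stats

    # Two-sample t-test on an expiratory airway parameter (illustrative data).
    tbm_wall_perimeter = [42.1, 44.5, 43.0, 45.2, 43.9]     # mm, TBM patients
    ctl_wall_perimeter = [48.3, 49.9, 47.5, 50.2, 49.1]     # mm, controls

    t, p = stats.ttest_ind(tbm_wall_perimeter, ctl_wall_perimeter)
    print(f"t = {t:.2f}, p = {p:.4f}")  # p < 0.05 mirrors the reported finding
    ```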

  15. Analytical methodologies based on LC-MS/MS for monitoring selected emerging compounds in liquid and solid phases of the sewage sludge.

    Science.gov (United States)

    Boix, C; Ibáñez, M; Fabregat-Safont, D; Morales, E; Pastor, L; Sancho, J V; Sánchez-Ramírez, J E; Hernández, F

    2016-01-01

    In this work, two analytical methodologies based on liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) were developed for the quantification of emerging pollutants identified in sewage sludge after a previous wide-scope screening. The target list included 13 emerging contaminants (EC): thiabendazole, acesulfame, fenofibric acid, valsartan, irbesartan, salicylic acid, diclofenac, carbamazepine, 4-aminoantipyrine (4-AA), 4-acetyl aminoantipyrine (4-AAA), 4-formyl aminoantipyrine (4-FAA), venlafaxine and benzoylecgonine. The aqueous and solid phases of the sewage sludge were analyzed making use of Solid-Phase Extraction (SPE) and UltraSonic Extraction (USE) for sample treatment, respectively. The methods were validated at three concentration levels: 0.2, 2 and 20 μg L⁻¹ for the aqueous phase, and 50, 500 and 2000 μg kg⁻¹ for the solid phase of the sludge. In general, the method was satisfactorily validated, showing good recoveries (70-120%) and satisfactory precision (RSD) for the majority of the analytes in both sludge phases. The method's applicability was tested by the analysis of samples from a wider study on the degradation of emerging pollutants in sewage sludge under anaerobic digestion. The key benefits of these methodologies are: • SPE and USE are appropriate sample procedures to extract selected emerging contaminants from the aqueous phase of the sewage sludge and the solid residue. • LC-MS/MS is highly suitable for determining emerging contaminants in both sludge phases. • To our knowledge, the main metabolites of dipyrone had not been studied before in sewage sludge.

  16. Philosophy, Methodology and Action Research

    Science.gov (United States)

    Carr, Wilfred

    2006-01-01

    The aim of this paper is to examine the role of methodology in action research. It begins by showing how, as a form of inquiry concerned with the development of practice, action research is nothing other than a modern 20th century manifestation of the pre-modern tradition of practical philosophy. It then draws in Gadamer's powerful vindication of…

  17. Unattended Monitoring System Design Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-07-08

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy-based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analyses then enable a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations.

  18. A Probabilistic Ontology Development Methodology

    Science.gov (United States)

    2014-06-01


  19. Sustainable Innovation and Entrepreneurship Methodology

    DEFF Research Database (Denmark)

    Celik, Sine; Joore, Peter; Christodoulou, Panayiotis

    or regional “co-creation platform for sustainable solutions” to promote structural innovation. In this manual, the Sustainable Innovation and Entrepreneurship Methodology will be described. The organisational guidelines mainly take their point of departure in how Aalborg University (AAU) in Denmark has organised...

  20. Test reactor risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

    A methodology has been developed for the identification of accident-initiating events and the fault modeling of systems, including common-mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of the risks due to a loss of primary coolant flow in the Engineering Test Reactor.

  1. Safety at Work : Research Methodology

    NARCIS (Netherlands)

    Beurden, van K. (Karin); Boer, de J. (Johannes); Brinks, G. (Ger); Goering-Zaburnenko, T. (Tatiana); Houten, van Y. (Ynze); Teeuw, W. (Wouter)

    2012-01-01

    In this document, we provide the methodological background for the Safety at Work project. This document combines several project deliverables as defined in the overall project plan: validation techniques and methods (D5.1.1), performance indicators for safety at work (D5.1.2), personal protection equi

  3. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    The article deals with the investigation of the theoretical and methodological principles of situational analysis, and the necessity of situational analysis under modern conditions is demonstrated. The notion of “situational analysis” is defined: we conclude that situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the system's exposure to the situation now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and searching functions is demonstrated. The basic methodological elements of situational analysis are grounded. The substantiation of the principal methodological elements of systems analysis will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system; to diagnose such a situation and subject it to systematic, in-depth analysis; to identify risks and opportunities; and to make timely management decisions as required by a particular period.

  4. The Library Space Utilization Methodology.

    Science.gov (United States)

    Hall, Richard B.

    1978-01-01

    Describes the Library Space Utilization (LSU) methodology, which demonstrates that significant information about the functional requirements of a library can be measured and displayed in a quantitative and graphic form. It measures "spatial" relationships between selected functional divisions; it also determines how many people--staff and…

  5. Sharing water and benefits in transboundary river basins

    Science.gov (United States)

    Arjoon, Diane; Tilmant, Amaury; Herrmann, Markus

    2016-06-01

    The equitable sharing of benefits in transboundary river basins is necessary to solve disputes among riparian countries and to reach a consensus on basin-wide development and management activities. Benefit-sharing arrangements must be collaboratively developed to be perceived not only as efficient, but also as equitable in order to be considered acceptable to all riparian countries. The current literature mainly describes what is meant by the term benefit sharing in the context of transboundary river basins and discusses this from a conceptual point of view, but falls short of providing practical, institutional arrangements that ensure maximum economic welfare as well as collaboratively developed methods for encouraging the equitable sharing of benefits. In this study, we define an institutional arrangement that distributes welfare in a river basin by maximizing the economic benefits of water use and then sharing these benefits in an equitable manner using a method developed through stakeholder involvement. We describe a methodology in which (i) a hydrological model is used to allocate scarce water resources, in an economically efficient manner, to water users in a transboundary basin, (ii) water users are obliged to pay for water, and (iii) the total of these water charges is equitably redistributed as monetary compensation to users in an amount determined through the application of a sharing method developed by stakeholder input, thus based on a stakeholder vision of fairness, using an axiomatic approach. With the proposed benefit-sharing mechanism, the efficiency-equity trade-off still exists, but the extent of the imbalance is reduced because benefits are maximized and redistributed according to a key that has been collectively agreed upon by the participants. The whole system is overseen by a river basin authority. The methodology is applied to the Eastern Nile River basin as a case study. The described technique not only ensures economic efficiency, but may
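
    The charge-and-redistribute step of the proposed arrangement can be sketched in a few lines: users pay per unit of allocated water, and the pooled charges are returned according to the collectively agreed key. The allocations, price and sharing key below are assumptions for illustration.

    ```python
    import numpy as np

    # Sketch of efficiency-plus-equity benefit sharing (illustrative numbers).
    allocation = np.array([4.0, 2.5, 1.5])    # water allocated to users A, B, C
    price = 20.0                              # charge per unit of water
    sharing_key = np.array([0.2, 0.3, 0.5])   # agreed equity weights, sum to 1

    charges = allocation * price                # what each user pays
    compensation = sharing_key * charges.sum()  # redistribution of the pool
    print("net transfer per user:", compensation - charges)
    ```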

  6. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  7. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification is pro...

  8. Planning manual for energy resource development on Indian lands. Volume I. Benefit--cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-01

    Section II follows a brief introduction and is entitled "Benefit-Cost Analysis Framework." The analytical framework deals with two major steps involved in assessing the pros and cons of energy resource development (or any other type of development). The first is to identify and describe the overall tribal resource planning and decision process. The second is to develop a detailed methodological approach to the assessment of the benefits and costs of energy development alternatives within the context of the tribe's overall planning process. Sections III, IV, and V present the application of the benefit-cost analysis methodology to coal; oil and gas; and uranium, oil shale, and geothermal development, respectively. The methodology creates hypothetical examples that illustrate realistic development opportunities for the majority of tribes that have significant reserves of one or more of the resources that may be economic to develop.

  9. A method for the analysis of the benefits and costs for aeronautical research and technology

    Science.gov (United States)

    Williams, L. J.; Hoy, H. H.; Anderson, J. L.

    1978-01-01

    A relatively simple, consistent, and reasonable methodology for performing cost-benefit analyses which can be used to guide, justify, and explain investments in aeronautical research and technology is presented. The elements of this methodology (labeled ABC-ART for the Analysis of the Benefits and Costs of Aeronautical Research and Technology) include estimation of aircraft markets; manufacturer costs and return on investment versus aircraft price; airline costs and return on investment versus aircraft price and passenger yield; and potential system benefits--fuel savings, cost savings, and noise reduction. The application of this methodology is explained using the introduction of an advanced turboprop powered transport aircraft in the medium range market in 1978 as an example.
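
    A benefit-cost roll-up of the kind ABC-ART performs can be sketched as below: fleet-wide fuel savings, valued over the aircraft's service period, are set against the research investment. Fleet size, savings rates, prices and costs are placeholder assumptions, not figures from the paper.

    ```python
    # Sketch of a benefit-cost roll-up for an aeronautical technology.
    fleet = 400            # aircraft introduced into the medium-range market
    fuel_saved = 1.2e6     # gallons saved per aircraft-year (assumed)
    fuel_price = 0.45      # $/gallon, late-1970s order of magnitude (assumed)
    years = 15             # service period considered
    rt_cost = 0.5e9        # $ research and technology investment (assumed)

    benefit = fleet * fuel_saved * fuel_price * years
    print(f"undiscounted benefit: ${benefit / 1e9:.1f}B, "
          f"cost: ${rt_cost / 1e9:.1f}B, B/C ratio: {benefit / rt_cost:.1f}")
    ```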

  10. Critical appraisal of the assessment of benefits and risks for foods, 'BRAFO Consensus Working Group'.

    Science.gov (United States)

    Boobis, Alan; Chiodini, Alessandro; Hoekstra, Jeljer; Lagiou, Pagona; Przyrembel, Hildegard; Schlatter, Josef; Schütte, Katrin; Verhagen, Hans; Watzl, Bernhard

    2013-05-01

    BRAFO, Benefit-Risk Analysis for Foods, was a European Commission project funded within Framework Six as a Specific Support Action and coordinated by ILSI Europe. BRAFO developed a tiered methodology for assessing the benefits and risks of foods and food components, utilising a quantitative, common scale for health assessment in higher tiers. This manuscript reports on the implications of the experience gained during the development of the project for the further improvement of benefit-risk assessment methodology. It was concluded that the methodology proposed is applicable to a range of situations and that it does help in optimising resource utilisation through early identification of those benefit-risk questions where benefit clearly outweighs risk or vice versa. However, higher tier assessments are complex and demanding of time and resources, emphasising the need for prioritisation. Areas identified as requiring further development to improve the utility of benefit-risk assessment include health weights for different populations and endpoints where they do not currently exist, extrapolation of effects from studies in animals to humans, use of in vitro data in benefit-risk assessments, and biomarkers of early effect and how these would be used in a quantitative assessment.

  11. Ethical Considerations in the Use of Sexually Explicit Visuals as an Instructional Methodology in College Sexuality Courses

    Science.gov (United States)

    Rhoades, Chuck

    2008-01-01

    This article examines ethical considerations in the use of sexually explicit materials (SEM) as an educational methodology in undergraduate human sexuality courses. While current research is lacking on the effects of this methodology on students, previous studies indicate that SEM use benefits student knowledge gain, attitude awareness, and…

  12. Cardiovascular Benefits of Dark Chocolate?

    Science.gov (United States)

    Higginbotham, Erin; Taub, Pam R

    2015-12-01

    The use of cacao for health benefits dates back at least 3000 years. Our understanding of cacao has evolved with modern science. It is now thought, based on extensive research, that the main health benefits of cacao stem from epicatechin, a flavanol found in cacao. The process of manufacturing dark chocolate retains epicatechin, whereas milk chocolate does not contain significant amounts of epicatechin. Thus, most of the current research studies are focused on dark chocolate. Both epidemiological and clinical studies suggest a beneficial effect of dark chocolate on blood pressure, lipids, and inflammation. Proposed mechanisms underlying these benefits include enhanced nitric oxide bioavailability and improved mitochondrial structure/function. Ultimately, further studies of this promising compound are needed to elucidate its potential for prevention and treatment of cardiovascular and metabolic diseases, as well as other diseases that have underlying mechanisms of mitochondrial dysfunction and nitric oxide deficiency.

  13. Health benefits of Moringa oleifera.

    Science.gov (United States)

    Abdull Razis, Ahmad Faizal; Ibrahim, Muhammad Din; Kntayya, Saie Brindha

    2014-01-01

    Phytomedicines are believed to have benefits over conventional drugs and are regaining interest in current research. Moringa oleifera is a multi-purpose herbal plant used as human food and an alternative for medicinal purposes worldwide. It has been identified by researchers as a plant with numerous health benefits, including nutritional and medicinal advantages. Moringa oleifera contains essential amino acids, carotenoids in its leaves, and components with nutraceutical properties, supporting the idea of using this plant as a nutritional supplement or constituent in food preparation. Some nutritional evaluation has been carried out on the leaves and stem. An important factor that accounts for the medicinal uses of Moringa oleifera is its very wide range of vital antioxidants, antibiotics and nutrients, including vitamins and minerals. Almost all parts of Moringa can be used as a source of nutrition, among other useful values. This mini-review elaborates on its health benefits in detail.

  14. Benefits of exercise during pregnancy.

    Science.gov (United States)

    Prather, Heidi; Spitznagle, Tracy; Hunt, Devyani

    2012-11-01

    There is a direct link between healthy mothers and healthy infants. Exercise and appropriate nutrition are important contributors to maternal physical and psychological health. The benefits and potential risks of exercise during pregnancy have gained even more attention, with a number of studies having been published after the 2002 American College of Obstetrics and Gynecologists guidelines. A review of the literature regarding the benefits of exercise during pregnancy was conducted using PubMed, Scopus, and Embase. The search revealed 219 publications, which the authors then narrowed to 125 publications. The purpose of this review is to briefly summarize the known benefits of exercise to the mother, fetus, and newborn.

  15. Health effects assessment of chemical exposures: ARIES methodology

    Energy Technology Data Exchange (ETDEWEB)

    Sierra, L; Montero, M.; Rabago, I.; Vidania, R.

    1995-07-01

    In this work, we present the ARIES update: a system designed to facilitate the assessment of human health effects produced by accidental releases of toxic chemicals. The first version of ARIES was developed in relation to Directive 82/501/EEC on major accidents in the chemical industry, so its first aim was to support the assessment of effects from the chemicals included in this directive, considering acute exposures at high concentrations. In this report, we present the current methodology, which also considers other types of exposure, such as environmental and occupational. As in earlier versions, the methodology comprises two approaches: quantitative and qualitative assessment. The quantitative assessment incorporates the mathematical algorithms needed to evaluate the effects produced by the most important routes of exposure (inhalation, ingestion, eye contact and skin absorption) in the short, medium and long term. It includes models that provide an accurate quantification of doses, effects, and so on, as well as simpler approaches for when the available information is not sufficient. The qualitative assessment, designed to complement or replace the quantitative one, is implemented in a software system developed in Clipper. It retrieves and displays relevant and important toxicological information on about 100 chemicals. This information comes from the ECDIN (Environmental Chemicals Data and Information Network) database through a collaboration with the JRC-ISPRA working group. (Author) 24 refs.

  16. Feminist Methodologies and Engineering Education Research

    Science.gov (United States)

    Beddoes, Kacey

    2013-01-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory.…

  17. A new method for top-down quantification of methane and pollutant emissions from natural gas production

    Science.gov (United States)

    Alden, C. B.; Prasad, K.; Ghosh, S.; Sweeney, C.; Karion, A.; Coburn, S.; Wright, R.; Coddington, I.; Truong, G. W.; Baumann, E.; Cossel, K.; Newbury, N.; Rieker, G. B.

    2016-12-01

    A need exists for low-cost, high-precision, and high-accuracy quantification of pollutant and greenhouse gas emissions. This need is particularly pressing as new international agreements and national- and state-level rules require monitoring, quantification and reduction of emissions. We present a novel atmospheric observing system that locates and sizes pollutant sources using long-range (0.1 to 2+ km) open-path measurement of atmospheric constituents with a dual-frequency comb spectrometer, coupled with high resolution atmospheric transport modeling and inversion methods. Planned field deployment of the spectrometer in eastern Colorado and concurrent modeling efforts will constitute the first time that a dual-frequency comb spectrometer has been used in the field for top-down constraint and quantification of emissions. The initial field deployment is designed for quantification of methane emissions at oil and gas production and storage sites. A unique strength of the coupled spectrometer and inverse modeling system is the ability to locate and size multiple point sources of methane across a large area (>12 km2), potentially encompassing 10s of wells, with one spectrometer. We present inversion modeling techniques and transport modeling developed to constrain bottom-up inventories with detailed spatial information, and to provide continuous and temporally resolved emissions estimates. Synthetic data simulations have demonstrated the ability of the system to accurately locate and size very small methane emissions (3.17E-5 kg/s) within a relatively short period of time (>18 days). A particularly important consideration for quantifying emissions is background inflow variability. We will present novel methods for constraint of background concentrations, as well as our observing system design configuration, inversion methodology, results of synthetic data testing, and preliminary data from a controlled methane release field test. We will present new system development
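
    At its core, the inversion step fits observed path-averaged enhancements to transport-model footprints of candidate sources; a minimal non-negative least-squares sketch is given below with a synthetic footprint matrix (the real system uses high-resolution transport modeling and more elaborate inversion methods).

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Sketch of emission-rate inversion: y = H q + noise, solve for q >= 0.
    rng = np.random.default_rng(7)
    H = rng.uniform(0, 1, size=(200, 3))         # footprints: 200 obs x 3 sources
    q_true = np.array([3.17e-5, 0.0, 1.0e-5])    # kg/s, assumed true rates
    y = H @ q_true + rng.normal(0, 1e-6, 200)    # observed enhancements

    q_hat, _ = nnls(H, y)                        # non-negative least squares
    print("recovered emission rates (kg/s):", q_hat)
    ```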

  18. Towards the quantification of rockfall risk assessment for urban areas

    Science.gov (United States)

    Mavrouli, Olga; Corominas, Jordi

    2010-05-01

    In many mountainous inhabited areas rockfalls are a major threat to structures and population. The quantification of the risk gives an estimate of the potential consequences and allows the analysis of different scenarios, minimizing the subjectivity and the uncertainties that derive from judgmental and qualitative approaches. The four main phases of the rockfall phenomenon have to be determined, including: a. the calculation of the frequency of the rock block volumes falling down the slope; b. the calculation of the probability of the rock blocks reaching a reference section with a certain level of kinetic energy; c. the calculation of the spatio-temporal probability of the exposed elements; and d. the calculation of the probability that an exposed element will suffer a certain degree of damage. Here, a step-by-step methodology for the quantification of risk is presented. The methodology focuses on steps (b) to (d). An example of an urban area situated at the toe of a talus cone below a rocky slope is considered. Three different rock diameters are considered with their respective frequencies (step a). For the calculation of the spatial probability of a given rock size reaching a location, a probabilistic 3D trajectory analysis is performed using the software ROTOMAP. The inputs are the topographic relief, the rockfall source and velocity, and the soil parameters (restitution and friction coefficients). The latter are evaluated by back analysis using historical events. The probability of a given rock magnitude reaching a critical section of the talus cone with a certain level of kinetic energy is evaluated. For step (c), the spatio-temporal probability of the element at risk is calculated taking into account both the trajectographic analysis of the rock blocks and the location of the elements at risk on the talus cone. For step (d), the probability of a certain degree of structural damage to the buildings is calculated, as sketched below. To this purpose
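
    Steps (a) to (d) combine multiplicatively into an annual risk estimate; the sketch below illustrates that product for three block magnitudes, with all frequencies, probabilities and vulnerabilities invented for illustration.

    ```python
    # Sketch of the rockfall risk product R = sum_M f(M) * P(reach) * P(exp) * V.
    blocks = [
        # (annual frequency f(M), P(block reaches the section), vulnerability V)
        (0.50, 0.05, 0.1),   # small block
        (0.10, 0.15, 0.4),   # medium block
        (0.02, 0.30, 0.9),   # large block
    ]
    p_exposure = 0.8   # spatio-temporal probability of the exposed element

    risk = sum(f * p_reach * p_exposure * v for f, p_reach, v in blocks)
    print(f"annual probability-weighted damage: {risk:.4f}")
    ```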

  19. 31 CFR 29.344 - Survivor benefits.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Survivor benefits. 29.344 Section 29... Benefit Payments § 29.344 Survivor benefits. (a) The general rule that Federal Benefit Payments are... months of total service at retirement (for elected survivor benefits) or death (for...

  20. Information Portal Costs and Benefits

    Directory of Open Access Journals (Sweden)

    Lorena BATAGAN

    2006-01-01

    All transformations of our society are the product of the large-scale use of Information and Communications Technologies (ICT) and the Internet. ICT are technologies that facilitate the communication, processing, and transmission of information by electronic means. It is very important to use the new technologies to their correct value, because this determines an increase in global benefits. A portal provides a consistent way to select, evaluate, prioritize and plan the right information. In this research we identify the important costs and benefits of an information portal. A portal for local administration gives citizens access to information of interest and, on the other hand, makes it easier for the employer to manage documents.