WorldWideScience

Sample records for reduction estimation tool

  1. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
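
    The record above defines risk as the probability of a successful attack times the value of the resulting loss. A minimal Python sketch (illustrative only, not the prototype's calculation engine; the scenario probabilities and loss values are hypothetical) of how a risk reduction estimate follows from that definition:

      # Risk = P(successful attack) x expected loss; risk reduction compares a
      # baseline scenario set against the same scenarios after mitigation.
      def risk(p_success: float, loss_dollars: float) -> float:
          """Expected loss (dollars/year) for one attack scenario."""
          return p_success * loss_dollars

      def risk_reduction(baseline, mitigated) -> float:
          """Fractional reduction in total risk after mitigation."""
          r0 = sum(risk(p, l) for p, l in baseline)
          r1 = sum(risk(p, l) for p, l in mitigated)
          return (r0 - r1) / r0

      # Hypothetical scenarios: (probability of success per year, loss in dollars)
      baseline  = [(0.10, 5e6), (0.02, 50e6)]
      mitigated = [(0.03, 5e6), (0.01, 50e6)]
      print(f"estimated risk reduction: {risk_reduction(baseline, mitigated):.0%}")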

  2. FIELD SCALE MODELING TO ESTIMATE PHOSPHORUS AND SEDIMENT LOAD REDUCTIONS USING A NEWLY DEVELOPED GRAPHICAL USER INTERFACE FOR SOIL AND WATER ASSESSMENT TOOL

    Directory of Open Access Journals (Sweden)

    Aaron R. Mittelstet

    2012-01-01

    Streams throughout the North Canadian River watershed in northwest Oklahoma, USA have elevated levels of nutrients and sediment. The Soil and Water Assessment Tool (SWAT) was used to identify areas that likely contributed disproportionate amounts of phosphorus (P) and sediment to Lake Overholser, the receiving reservoir at the watershed outlet. These sites were then targeted by the Oklahoma Conservation Commission (OCC) to implement conservation practices, such as conservation tillage and pasture planting, as part of a US Environmental Protection Agency Section 319(h) project. Conservation practices were implemented on 238 fields. The objective of this project was to evaluate conservation practice effectiveness on these fields using the Texas Best Management Evaluation Tool (TBET), a simplified Graphical User Interface (GUI) for SWAT developed for field-scale application. TBET was applied on each field to predict the effects of conservation practice implementation on P and sediment loads. These predictions were used to evaluate the implementation cost (per kg of pollutant) associated with these reductions. Overall, the implemented practices were predicted to reduce P loads to Lake Overholser by nine percent. The 'riparian exclusion' and 'riparian exclusion with buffer' practices provided the greatest reduction in P load, while 'conservation tillage' and 'converting wheat to bermuda grass' produced the largest reduction in sediment load. The most cost-efficient practices were 'converting wheat to bermuda grass' or 'native range' and 'riparian exclusion'. This project demonstrates the importance of conservation practice selection and evaluation prior to implementation in order to optimize cost-share funds. In addition, this information may lead to the implementation of more cost-effective practices and an improvement in the overall effectiveness of water quality programs.

  3. Parameter Estimation, Model Reduction and Quantum Filtering

    CERN Document Server

    Chase, Bradley A

    2009-01-01

    This dissertation explores the topics of parameter estimation and model reduction in the context of quantum filtering. Chapters 2 and 3 provide a review of classical and quantum probability theory, stochastic calculus and filtering. Chapter 4 studies the problem of quantum parameter estimation and introduces the quantum particle filter as a practical computational method for parameter estimation via continuous measurement. Chapter 5 applies these techniques in magnetometry and studies the estimator's uncertainty scalings in a double-pass atomic magnetometer. Chapter 6 presents an efficient feedback controller for continuous-time quantum error correction. Chapter 7 presents an exact model of symmetric processes of collective qubit systems.

  4. Identification of effective screening strategies for cardiovascular disease prevention in a developing country: using cardiovascular risk-estimation and risk-reduction tools for policy recommendations.

    Science.gov (United States)

    Selvarajah, Sharmini; Haniff, Jamaiyah; Kaur, Gurpreet; Guat Hiong, Tee; Bujang, Adam; Chee Cheong, Kee; Bots, Michiel L

    2013-02-25

    Recent increases in cardiovascular risk-factor prevalences have led to new national policy recommendations of universal screening for primary prevention of cardiovascular disease in Malaysia. This study assessed whether the current national policy recommendation of universal screening was optimal, by comparing the effectiveness and impact of various cardiovascular screening strategies. Data from a national population-based survey of 24 270 participants aged 30 to 74 years were used. Five screening strategies were modelled for the overall population and by gender: universal screening and targeted screening (four age cut-off points). Screening strategies were assessed based on the ability to detect high cardiovascular risk populations (effectiveness), incremental effectiveness, impact on cardiovascular event prevention and cost of screening. Overall, 26.7% (95% confidence limits 25.7, 27.7) were at high cardiovascular risk: men 34.7% (33.6, 35.8) and women 18.9% (17.8, 20.0). Universal screening identified all those at high risk and resulted in one high-risk individual detected for every 3.7 people screened, with an estimated cost of USD 60. However, universal screening resulted in screening an additional 7169 persons, with an incremental cost of USD 115,033 for detection of one additional high-risk individual in comparison to targeted screening of those aged ≥35 years. The cost, incremental cost and impact of detection of high-risk individuals were higher for women than for men for all screening strategies. The impact of screening women aged ≥45 years was similar to that of universal screening in men. Targeted gender- and age-specific screening strategies would ensure more optimal utilisation of scarce resources compared to the current policy recommendations of universal screening.
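
    The screening comparison above reduces to an incremental cost-effectiveness calculation. A short Python sketch using the figures quoted in the abstract; the per-person screening cost is inferred from "one high-risk individual per 3.7 people screened at an estimated cost of USD 60" and is an illustrative assumption rather than a number reported by the authors:

      # Universal screening vs. targeted screening of those aged >= 35 years.
      extra_screened = 7169        # additional persons screened (from the abstract)
      cost_per_person = 60 / 3.7   # ~USD 16 per person screened, inferred
      extra_cases_found = 1        # additional high-risk individuals detected
      incremental_cost = extra_screened * cost_per_person / extra_cases_found
      print(f"incremental cost per additional case: USD {incremental_cost:,.0f}")
      # ~USD 116,000, consistent (to rounding) with the USD 115,033 reported above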

  5. Estimating a percent reduction in load

    Science.gov (United States)

    Millard, Steven P.

    This article extends the work of Cohn et al. [1989] on estimating constituent loads to the problem of estimating a percent reduction in load. Three estimators are considered: the maximum likelihood (MLE), a "bias-corrected" maximum likelihood (BCMLE), and the minimum variance unbiased (MVUE). In terms of root-mean-square error, both the MVUE and BCMLE are superior to the MLE, and for the cases considered here there is no appreciable difference between the MVUE and the BCMLE. The BCMLE is constructed from quantities computed by most regression packages and is therefore simpler to compute than the MVUE (which involves approximating an infinite series). All three estimators are applied to a case study in which an agricultural tax in the Everglades agricultural area is tied to an observed percent reduction in phosphorus load. For typical hydrological data, very large sample sizes (of the order of 100 observations each in the baseline period and after) are required to estimate a percent reduction in load with reasonable precision.
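
    The estimators compared here rest on the usual lognormal load model, in which a naive back-transform of the log-scale mean underestimates the mean load and a correction of the form exp(s^2/2) removes most of the bias. A Python sketch of that idea applied to a percent reduction in mean load; it illustrates the bias-correction principle only, uses simulated data, and is not the paper's exact BCMLE or MVUE formulas:

      import numpy as np

      def mean_load(log_loads: np.ndarray, bias_correct: bool) -> float:
          """Estimate mean load from log-transformed loads under a lognormal model."""
          m, s2 = log_loads.mean(), log_loads.var(ddof=1)
          # exp(m) alone underestimates a lognormal mean; multiply by exp(s2 / 2)
          return float(np.exp(m + (s2 / 2 if bias_correct else 0.0)))

      rng = np.random.default_rng(1)
      baseline = rng.lognormal(mean=2.0, sigma=1.0, size=100)   # simulated loads
      after    = rng.lognormal(mean=1.6, sigma=0.5, size=100)

      for bc in (False, True):
          b, a = mean_load(np.log(baseline), bc), mean_load(np.log(after), bc)
          print(f"bias_correct={bc}: estimated percent reduction = {1 - a / b:.1%}")
      # true reduction in mean load here is 1 - exp(1.725 - 2.5) ≈ 54%; the naive
      # (uncorrected) back-transform recovers only about 1 - exp(-0.4) ≈ 33%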

  6. A Cost Estimation Tool for Charter Schools

    Science.gov (United States)

    Hayes, Cheryl D.; Keller, Eric

    2009-01-01

    To align their financing strategies and fundraising efforts with their fiscal needs, charter school leaders need to know how much funding they need and what that funding will support. This cost estimation tool offers a simple set of worksheets to help start-up charter school operators identify and estimate the range of costs and timing of…

  7. Dimension reduction based on weighted variance estimate

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In this paper, we propose a new estimate for dimension reduction, called the weighted variance estimate (WVE), which includes Sliced Average Variance Estimate (SAVE) as a special case. Bootstrap method is used to select the best estimate from the WVE and to estimate the structure dimension. And this selected best estimate usually performs better than the existing methods such as Sliced Inverse Regression (SIR), SAVE, etc. Many methods such as SIR, SAVE, etc. usually put the same weight on each observation to estimate central subspace (CS). By introducing a weight function, WVE puts different weights on different observations according to distance of observations from CS. The weight function makes WVE have very good performance in general and complicated situations, for example, the distribution of regressor deviating severely from elliptical distribution which is the base of many methods, such as SIR, etc. And compared with many existing methods, WVE is insensitive to the distribution of the regressor. The consistency of the WVE is established. Simulations to compare the performances of WVE with other existing methods confirm the advantage of WVE.

  8. Dimension reduction based on weighted variance estimate

    Institute of Scientific and Technical Information of China (English)

    ZHAO JunLong; XU XingZhong

    2009-01-01

    In this paper, we propose a new estimate for dimension reduction, called the weighted variance estimate (WVE), which includes Sliced Average Variance Estimate (SAVE) as a special case. Bootstrap method is used to select the best estimate from the WVE and to estimate the structure dimension. And this selected best estimate usually performs better than the existing methods such as Sliced Inverse Regression (SIR), SAVE, etc. Many methods such as SIR, SAVE, etc. usually put the same weight on each observation to estimate central subspace (CS). By introducing a weight function, WVE puts different weights on different observations according to distance of observations from CS. The weight function makes WVE have very good performance in general and complicated situations, for example, the distribution of regressor deviating severely from elliptical distribution which is the base of many methods, such as SIR, etc. And compared with many existing methods, WVE is insensitive to the distribution of the regressor. The consistency of the WVE is established. Simulations to compare the performances of WVE with other existing methods confirm the advantage of WVE.
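
    WVE generalizes SAVE by weighting observations according to their distance from the central subspace. As a concrete reference point, a compact numpy sketch of the unweighted special case named in the abstract, SAVE; the slicing scheme and the toy data are illustrative assumptions:

      import numpy as np

      def save_directions(X, y, n_slices=5, n_dirs=1):
          """Sliced Average Variance Estimation (SAVE), the unweighted special case of WVE."""
          n, p = X.shape
          mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
          L = np.linalg.cholesky(np.linalg.inv(cov))   # (X - mu) @ L has identity covariance
          Z = (X - mu) @ L
          order = np.argsort(y)                        # slice observations on the response
          M = np.zeros((p, p))
          for chunk in np.array_split(order, n_slices):
              D = np.eye(p) - np.cov(Z[chunk], rowvar=False)
              M += (len(chunk) / n) * D @ D
          vals, vecs = np.linalg.eigh(M)               # leading eigenvectors span the CS (Z scale)
          beta = L @ vecs[:, ::-1][:, :n_dirs]         # map directions back to the X scale
          return beta / np.linalg.norm(beta, axis=0)

      # toy model: y depends on X only through the direction (1, -1, 0, 0)
      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 4))
      y = (X[:, 0] - X[:, 1]) ** 2 + 0.1 * rng.normal(size=500)
      print(save_directions(X, y).ravel())             # roughly proportional to (1, -1, 0, 0), up to sign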

  9. Parameter estimation, model reduction and quantum filtering

    Science.gov (United States)

    Chase, Bradley A.

    This thesis explores the topics of parameter estimation and model reduction in the context of quantum filtering. The last is a mathematically rigorous formulation of continuous quantum measurement, in which a stream of auxiliary quantum systems is used to infer the state of a target quantum system. Fundamental quantum uncertainties appear as noise which corrupts the probe observations and therefore must be filtered in order to extract information about the target system. This is analogous to the classical filtering problem in which techniques of inference are used to process noisy observations of a system in order to estimate its state. Given the clear similarities between the two filtering problems, I devote the beginning of this thesis to a review of classical and quantum probability theory, stochastic calculus and filtering. This allows for a mathematically rigorous and technically adroit presentation of the quantum filtering problem and solution. Given this foundation, I next consider the related problem of quantum parameter estimation, in which one seeks to infer the strength of a parameter that drives the evolution of a probe quantum system. By embedding this problem in the state estimation problem solved by the quantum filter, I present the optimal Bayesian estimator for a parameter when given continuous measurements of the probe system to which it couples. For cases when the probe takes on a finite number of values, I review a set of sufficient conditions for asymptotic convergence of the estimator. For a continuous-valued parameter, I present a computational method called quantum particle filtering for practical estimation of the parameter. Using these methods, I then study the particular problem of atomic magnetometry and review an experimental method for potentially reducing the uncertainty in the estimate of the magnetic field beyond the standard quantum limit. The technique involves double-passing a probe laser field through the atomic system, giving

  10. Data Service Provider Cost Estimation Tool

    Science.gov (United States)

    Fontaine, Kathy; Hunolt, Greg; Booth, Arthur L.; Banks, Mel

    2011-01-01

    The Data Service Provider Cost Estimation Tool (CET) and Comparables Database (CDB) package provides to NASA's Earth Science Enterprise (ESE) the ability to estimate the full range of year-by-year lifecycle cost estimates for the implementation and operation of data service providers required by ESE to support its science and applications programs. The CET can make estimates dealing with staffing costs, supplies, facility costs, network services, hardware and maintenance, commercial off-the-shelf (COTS) software licenses, software development and sustaining engineering, and the changes in costs that result from changes in workload. Data Service Providers may be stand-alone or embedded in flight projects, field campaigns, research or applications projects, or other activities. The CET and CDB package employs a cost-estimation-by-analogy approach. It is based on a new, general data service provider reference model that provides a framework for construction of a database by describing existing data service providers that are analogs (or comparables) to planned, new ESE data service providers. The CET implements the staff effort and cost estimation algorithms that access the CDB and generates the lifecycle cost estimate for a new data services provider. This data creates a common basis for an ESE proposal evaluator for considering projected data service provider costs.

  11. Development of a simple estimation tool for LMFBR construction cost

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Kazuo; Kinoshita, Izumi [Central Research Inst. of Electric Power Industry, Komae, Tokyo (Japan). Komae Research Lab

    1999-05-01

    A simple tool for estimating the construction costs of liquid-metal-cooled fast breeder reactors (LMFBRs), 'Simple Cost', was developed in this study. Simple Cost is based on a new estimation formula that can reduce the amount of design data required to estimate construction costs. Consequently, Simple Cost can be used to estimate the construction costs of innovative LMFBR concepts for which detailed design has not been carried out. The results of test calculations show that Simple Cost provides cost estimations equivalent to those obtained with conventional methods within the range of plant power from 325 to 1500 MWe. Sensitivity analyses for typical design parameters were conducted using Simple Cost. The effects of four major parameters - reactor vessel diameter, core outlet temperature, sodium handling area and number of secondary loops - on the construction costs of LMFBRs were evaluated quantitatively. The results show that the reduction of sodium handling area is particularly effective in reducing construction costs. (author)

  12. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  13. Dimensionality reduction in Bayesian estimation algorithms

    Directory of Open Access Journals (Sweden)

    G. W. Petty

    2013-03-01

    An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument noise) component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
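
    One plausible reading of the pseudochannel construction (the paper's exact two-stage procedure is not reproduced here) is to whiten the channels against the background-plus-noise covariance and then keep the leading principal components of the whitened signal, so that the retained background covariance is diagonal with unit magnitude. A hedged numpy sketch with synthetic data:

      import numpy as np

      def make_pseudochannels(background, signal, m):
          """Sketch: (1) whiten with the background (geophysical + instrument noise)
          covariance, (2) PCA on the whitened signal, keep the leading m components."""
          cov_b = np.cov(background, rowvar=False)
          evals, evecs = np.linalg.eigh(cov_b)
          W = evecs @ np.diag(evals ** -0.5)           # N x N whitening matrix
          Zs = (signal - signal.mean(axis=0)) @ W      # whitened signal ensemble
          _, _, Vt = np.linalg.svd(Zs, full_matrices=False)
          return W @ Vt[:m].T                          # N channels -> m pseudochannels

      rng = np.random.default_rng(0)
      background = rng.normal(size=(2000, 9)) @ rng.normal(size=(9, 9))   # 9-channel noise
      signal = background + rng.normal(size=(2000, 1)) * rng.normal(size=(1, 9)) * 5.0
      P = make_pseudochannels(background, signal, m=2)
      # background noise expressed in the pseudochannels is ~unit diagonal (2 x 2 identity)
      print(np.round(np.cov(background @ P, rowvar=False), 2))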

  14. Tool Wear Estimate in Milling Operation by FEM

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Many studies show that, in the metal cutting process, tool wear rate depends on some cutting process parameters, such as temperature at the tool face, contact pressure and relative sliding velocity at the tool/chip and tool/workpiece interfaces. Finite element method (FEM) application enables the estimation of these parameters and the tool wear. A tool wear estimate program based on chip formation and heat transfer analysis is designed and compiled with Python to calculate the wear rate and volume, and update tool geometry according to the tool wear. The progressive flank and crater wears in milling operation are estimated by the program. The FEM codes ABAQUS/Explicit and Standard are employed to analyze chip formation and heat transfer process.
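
    The abstract lists the interface variables that drive the wear rate: temperature, contact pressure, and sliding velocity. A minimal Python sketch of a wear-rate law of the Usui type built on exactly those variables; the constants are placeholders rather than values from the paper, and the paper's own program integrates such rates over the FEM solution rather than at a single point:

      import math

      def usui_wear_rate(pressure_mpa: float, sliding_velocity_m_s: float,
                         temperature_k: float, a: float = 1.0e-8,
                         b: float = 5000.0) -> float:
          """Usui-type wear rate dW/dt = A * p * v * exp(-B / T).
          A and B are material/coating constants (placeholders here)."""
          return a * pressure_mpa * sliding_velocity_m_s * math.exp(-b / temperature_k)

      # hypothetical interface conditions at one point on the flank face
      rate = usui_wear_rate(pressure_mpa=800.0, sliding_velocity_m_s=2.5,
                            temperature_k=1100.0)
      dt = 0.5                       # seconds of cutting
      print(f"wear depth increment ≈ {rate * dt:.2e} (arbitrary units)")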

  15. Risk Reduction and Training using Simulation Based Tools - 12180

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Irin P. [Newport News Shipbuilding, Newport News, Virginia 23607 (United States)

    2012-07-01

    Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains, as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M and S applications include expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition

  16. RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

    Energy Technology Data Exchange (ETDEWEB)

    William W. Weiss

    2001-05-17

    Incomplete or sparse information on types of data such as geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. ''Expert'' systems developed and used in several disciplines and industries have demonstrated beneficial results. A state-of-the-art exploration ''expert'' tool, relying on a computerized database and computer maps generated by neural networks, is being developed through the use of ''fuzzy'' logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk can be reduced with the use of a properly developed and validated ''Fuzzy Expert Exploration (FEE) Tool.'' This FEE Tool can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In the 1998-1999 oil industry environment, many smaller exploration companies lacked the resources of a pool of expert exploration personnel. Downsizing, low oil prices, and scarcity of exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The FEE Tool will benefit a diverse group in the U.S., leading to a more efficient use of scarce funds and lower product prices for consumers. This second annual report contains a summary of progress to date, problems encountered, plans for the next quarter, and an assessment of the prospects for future progress. During the second year of the project, data acquisition of the Brushy Canyon Formation was completed with the compiling and analyzing of well logs, geophysical data, and production information needed to characterize production potential in the Delaware Basin. A majority of this data now resides in several online databases on our servers and is in proper form to be accessed by external

  17. Estimation of toxicity using a Java based software tool

    Science.gov (United States)

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  18. Estimation of toxicity using a Java based software tool

    Science.gov (United States)

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  19. RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

    Energy Technology Data Exchange (ETDEWEB)

    Robert S. Balch; Ron Broadhead

    2005-03-01

    Incomplete or sparse data such as geologic or formation characteristics introduce a high level of risk for oil exploration and development projects. ''Expert'' systems developed and used in several disciplines and industries have demonstrated beneficial results when working with sparse data. State-of-the-art expert exploration tools, relying on a database, and computer maps generated by neural networks and user inputs, have been developed through the use of ''fuzzy'' logic, a mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk has been reduced with the use of these properly verified and validated ''Fuzzy Expert Exploration (FEE) Tools.'' Through the course of this project, FEE Tools and supporting software were developed for two producing formations in southeast New Mexico. Tools of this type can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In today's oil industry environment, many smaller exploration companies lack the resources of a pool of expert exploration personnel. Downsizing, volatile oil prices, and scarcity of domestic exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The FEE Tools benefit a diverse group in the U.S., allowing a more efficient use of scarce funds, and potentially reducing dependence on foreign oil and providing lower product prices for consumers.

  20. RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

    Energy Technology Data Exchange (ETDEWEB)

    Robert Balch

    2004-04-08

    Incomplete or sparse information on types of data such as geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. ''Expert'' systems developed and used in several disciplines and industries have demonstrated beneficial results. A state-of-the-art exploration ''expert'' tool, relying on a computerized database and computer maps generated by neural networks, is being developed through the use of ''fuzzy'' logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk can be reduced with the use of a properly developed and validated ''Fuzzy Expert Exploration (FEE) Tool.'' This FEE Tool can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In the 1998-1999 oil industry environment, many smaller exploration companies lacked the resources of a pool of expert exploration personnel. Downsizing, low oil prices, and scarcity of exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The FEE Tool will benefit a diverse group in the U.S., leading to a more efficient use of scarce funds, and possibly decreasing dependence on foreign oil and lower product prices for consumers. This fifth annual (and tenth of 12 semi-annual reports) contains a summary of progress to date, problems encountered, plans for the next year, and an assessment of the prospects for future progress. The emphasis during the March 2003 through March 2004 period was directed toward completion of the Brushy Canyon FEE Tool and to Silurian-Devonian geology, and development of rules for the Devonian fuzzy system, and on-line software.

  1. Reduction of inequalities in health: assessing evidence-based tools

    Directory of Open Access Journals (Sweden)

    Shea Beverley

    2006-09-01

    Background: The reduction of health inequalities is a focus of many national and international health organisations. The need for pragmatic evidence-based approaches has led to the development of a number of evidence-based equity initiatives. This paper describes a new program that focuses upon evidence-based tools, which are useful for policy initiatives that reduce inequities. Methods: This paper is based on a presentation that was given at the "Regional Consultation on Policy Tools: Equity in Population Health Reports," held in Toronto, Canada in June 2002. Results: Five assessment tools were presented. 1. A database of systematic reviews on the effects of educational, legal, social, and health interventions to reduce unfair inequalities is being established through the Cochrane and Campbell Collaborations. 2. Decision aids and shared decision making can be facilitated in disadvantaged groups by 'health coaches' to help people become better decision makers, negotiators, and navigators of the health system; a pilot study in Chile has provided proof of this concept. 3. The CIET Cycle: Combining adapted cluster survey techniques with qualitative methods, CIET's population based applications support evidence-based decision making at local and national levels. The CIET map generates maps directly from survey or routine institutional data, to be used as evidence-based decision aids. Complex data can be displayed attractively, providing an important tool for studying and comparing health indicators among and between different populations. 4. The Ottawa Equity Gauge is applying the Global Equity Gauge Alliance framework to an industrialised country setting. 5. The Needs-Based Health Assessment Toolkit, established to assemble information on which clinical and health policy decisions can be based, is being expanded to ensure a focus on distribution and average health indicators. Conclusion: Evidence-based planning tools have much to offer the

  2. Standardized cost estimation for new technology (SCENT) - methodology and tool

    NARCIS (Netherlands)

    Ereev, S.Y.; Patel, M.K.

    2012-01-01

    This paper presents the development of a methodology and tool (called SCENT) to prepare preliminary economic estimates of the total production costs related to manufacturing in the process industries. The methodology uses the factorial approach – cost objects are estimated using factors and
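
    The factorial approach mentioned here typically scales delivered equipment cost by installation factors and adds contingency. A small Python sketch of that pattern with hypothetical factor values; the abstract does not list SCENT's calibrated factors, so these numbers are placeholders:

      # Sketch of a factorial (Lang-factor style) capital cost estimate.
      equipment_cost = 2.0e6   # purchased equipment cost, USD (hypothetical)

      install_factors = {      # fractions of purchased equipment cost (placeholders)
          "installation": 0.45,
          "piping": 0.30,
          "instrumentation": 0.15,
          "electrical": 0.10,
          "buildings_and_site": 0.25,
          "engineering_and_contingency": 0.40,
      }

      fixed_capital = equipment_cost * (1 + sum(install_factors.values()))
      print(f"estimated fixed capital investment: USD {fixed_capital:,.0f}")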

  3. FIESTA—An R estimation tool for FIA analysts

    Science.gov (United States)

    Tracey S. Frescino; Paul L. Patterson; Gretchen G. Moisen; Elizabeth A. Freeman

    2015-01-01

    FIESTA (Forest Inventory ESTimation for Analysis) is a user-friendly R package that was originally developed to support the production of estimates consistent with current tools available for the Forest Inventory and Analysis (FIA) National Program, such as FIDO (Forest Inventory Data Online) and EVALIDator. FIESTA provides an alternative data retrieval and reporting...

  4. RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

    Energy Technology Data Exchange (ETDEWEB)

    William W. Weiss

    2001-09-30

    Incomplete or sparse information on types of data such as geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. ''Expert'' systems developed and used in several disciplines and industries have demonstrated beneficial results. A state-of-the-art exploration ''expert'' tool, relying on a computerized database and computer maps generated by neural networks, is being developed through the use of ''fuzzy'' logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk can be reduced with the use of a properly developed and validated ''Fuzzy Expert Exploration (FEE) Tool.'' This FEE Tool can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In the 1998-1999 oil industry environment, many smaller exploration companies lacked the resources of a pool of expert exploration personnel. Downsizing, low oil prices, and scarcity of exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. As a result, today's pool of experts is much reduced. The FEE Tool will benefit a diverse group in the U.S., leading to a more efficient use of scarce funds and lower product prices for consumers. This fifth of ten semi-annual reports contains a summary of progress to date, problems encountered, plans for the next year, and an assessment of the prospects for future progress. The emphasis during the May 2001 through September 2001 was directed toward development of rules for the fuzzy system.

  5. RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

    Energy Technology Data Exchange (ETDEWEB)

    Robert Balch

    2003-04-15

    Incomplete or sparse information on types of data such as geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. ''Expert'' systems developed and used in several disciplines and industries have demonstrated beneficial results. A state-of-the-art exploration ''expert'' tool, relying on a computerized database and computer maps generated by neural networks, is being developed through the use of ''fuzzy'' logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk can be reduced with the use of a properly developed and validated ''Fuzzy Expert Exploration (FEE) Tool.'' This FEE Tool can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In the 1998-1999 oil industry environment, many smaller exploration companies lacked the resources of a pool of expert exploration personnel. Downsizing, low oil prices, and scarcity of exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The pool of experts is much reduced today. The FEE Tool will benefit a diverse group in the U.S., leading to a more efficient use of scarce funds, and possibly decreasing dependence on foreign oil and lower product prices for consumers. This fourth of five annual reports contains a summary of progress to date, problems encountered, plans for the next year, and an assessment of the prospects for future progress. The emphasis during the April 2002 through March 2003 period was directed toward Silurian-Devonian geology, development of rules for the fuzzy system, and on-line software.

  6. RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

    Energy Technology Data Exchange (ETDEWEB)

    Robert Balch

    2003-10-15

    Incomplete or sparse information on types of data such as geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. ''Expert'' systems developed and used in several disciplines and industries have demonstrated beneficial results. A state-of-the-art exploration ''expert'' tool, relying on a computerized database and computer maps generated by neural networks, is being developed through the use of ''fuzzy'' logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk can be reduced with the use of a properly developed and validated ''Fuzzy Expert Exploration (FEE) Tool.'' This FEE Tool can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In the 1998-1999 oil industry environment, many smaller exploration companies lacked the resources of a pool of expert exploration personnel. Downsizing, low oil prices, and scarcity of exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The FEE Tool will benefit a diverse group in the U.S., leading to a more efficient use of scarce funds, and possibly decreasing dependence on foreign oil and lower product prices for consumers. This ninth of ten semi-annual reports contains a summary of progress to date, problems encountered, plans for the next year, and an assessment of the prospects for future progress. The emphasis during the March 2003 through September 2003 period was directed toward Silurian-Devonian geology, development of rules for the fuzzy system, and on-line software.

  7. RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

    Energy Technology Data Exchange (ETDEWEB)

    William W. Weiss

    2000-12-31

    Incomplete or sparse information on types of data such as geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. ''Expert'' systems developed and used in several disciplines and industries, including medical diagnostics, have demonstrated beneficial results. A state-of-the-art exploration ''expert'' tool, relying on a computerized data base and computer maps generated by neural networks, is proposed for development through the use of ''fuzzy'' logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk can be reduced with the use of a properly developed and validated ''Fuzzy Expert Exploration (FEE) Tool.'' This tool will be beneficial in many regions of the US, enabling risk reduction in oil and gas prospecting and decreased prospecting and development costs. In the 1998-1999 oil industry environment, many smaller exploration companies lacked the resources of a pool of expert exploration personnel. Downsizing, low oil prices and scarcity of exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the US as reserves are depleted. The proposed expert exploration tool will benefit a diverse group in the US, leading to a more efficient use of scarce funds and lower product prices for consumers. This third of ten semi-annual reports contains an account of the progress, problems encountered, plans for the next quarter, and an assessment of the prospects for future progress.

  8. XLISP-Stat Tools for Building Generalised Estimating Equation Models

    Directory of Open Access Journals (Sweden)

    Thomas Lumley

    1996-12-01

    Full Text Available This paper describes a set of Lisp-Stat tools for building Generalised Estimating Equation models to analyse longitudinal or clustered measurements. The user interface is based on the built-in regression and generalised linear model prototypes, with the addition of object-based error functions, correlation structures and model formula tools. Residual and deletion diagnostic plots are available on the cluster and observation level and use the dynamic graphics capabilities of Lisp-Stat.

  9. Estimating Tool-Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool.

    Science.gov (United States)

    Zhao, Baoliang; Nelson, Carl A

    2016-10-01

    Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch of the surgical site, surgeons may be prone to exert larger forces and cause tissue damage. To quantify tool-tissue interaction forces, researchers have tried to attach different kinds of sensors on the surgical tools. This sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder the normal function of the tools; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method by estimating tool-tissue interaction forces using driving motors' current, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool-tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may allow a haptic surgical robot which is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool-tissue interaction forces in real time, thereby increasing surgical efficiency and safety.
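
    The sensorless idea described above maps measured motor current to joint torque and then, through the tool's transmission and geometry, to a tip force. A heavily simplified single-DOF Python sketch; the torque constant, gear ratio, efficiency, friction term, and lever arm are hypothetical placeholders, and a real implementation must also compensate cable and joint dynamics:

      def grasp_force_from_current(current_a: float,
                                   kt_nm_per_a: float = 0.05,   # motor torque constant
                                   gear_ratio: float = 20.0,
                                   efficiency: float = 0.8,
                                   friction_nm: float = 0.02,   # Coulomb friction at the motor
                                   lever_arm_m: float = 0.01) -> float:
          """Estimate tool-tissue grasp force (N) from motor current (A).
          All parameter values are illustrative placeholders, not values from the paper."""
          motor_torque = kt_nm_per_a * current_a - friction_nm
          joint_torque = max(motor_torque, 0.0) * gear_ratio * efficiency
          return joint_torque / lever_arm_m    # force at the jaw, N

      print(grasp_force_from_current(0.6))     # ≈ 16 N with these placeholder values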

  10. RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

    Energy Technology Data Exchange (ETDEWEB)

    William W. Weiss

    2000-06-30

    Incomplete or sparse information on geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. Expert systems have been developed and used in several disciplines and industries, including medical diagnostics, with favorable results. A state-of-the-art exploration ''expert'' tool, relying on a computerized data base and computer maps generated by neural networks, is proposed through the use of ''fuzzy'' logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. This project will develop an Artificial Intelligence system that will draw upon a wide variety of information to provide realistic estimates of risk. ''Fuzzy logic,'' a system of integrating large amounts of inexact, incomplete information with modern computational methods to derive usable conclusions, has been demonstrated as a cost-effective computational technology in many industrial applications. During project year 1, 90% of geologic, geophysical, production and price data were assimilated for installation into the database. Logs provided geologic data consisting of formation tops of the Brushy Canyon, Lower Brushy Canyon, and Bone Springs zones of 700 wells used to construct regional cross sections. Regional structure and isopach maps were constructed using kriging to interpolate between the measured points. One of the structure derivative maps (azimuth of curvature) visually correlates with Brushy Canyon fields on the maximum change contours. Derivatives of the regional geophysical data also visually correlate with the location of the fields. The azimuth of maximum dip approximately locates fields on the maximum change contours. In a similar manner the second derivative in the x-direction of the gravity map visually correlates with the alignment of the known fields. The visual correlations strongly suggest that neural network architectures will be

  11. The Management Standards Indicator Tool and the estimation of risk.

    Science.gov (United States)

    Bevan, A; Houdmont, J; Menear, N

    2010-10-01

    The Health & Safety Executive's (HSE) Indicator Tool offers a measure of exposure to psychosocial work conditions that may be linked to stress-related outcomes. The HSE recommends that Indicator Tool data should be used as a basis for discussions concerned with the identification of psychosocial work conditions that might warrant prioritization for intervention. However, operational constraints may render discussions difficult to convene and, when they do take place, the absence of information on harms associated with exposures can make it difficult to identify intervention priorities. To examine (i) the utility of the Indicator Tool for the identification of a manageable number of psychosocial work conditions as intervention candidates and (ii) whether administration of a measure of stress-related outcomes alongside the Indicator Tool can facilitate the identification of intervention priorities. One thousand and thirty-eight employees in the London region of Her Majesty's Prison Service completed the Indicator Tool and a measure of psychological well-being. Odds ratios were calculated to estimate the risk of impairment to well-being associated with exposure to psychosocial work conditions. The Indicator Tool identified 34 psychosocial work conditions as warranting improvement. Intervention priority was given to those working conditions that were both reported to be poor by ≥50% of respondents and associated with risk of impairment to well-being. This method allowed for the identification of four areas. Augmentation of the Indicator Tool with a measure of stress-related outcomes and the calculation of simple risk estimation statistics can assist the prioritization of intervention candidates.
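
    The augmentation described here amounts to cross-tabulating each Indicator Tool exposure against the well-being measure and computing an odds ratio with a confidence interval. A minimal Python sketch with hypothetical counts, using Woolf's approximation for the interval:

      import math

      def odds_ratio(exposed_cases: int, exposed_noncases: int,
                     unexposed_cases: int, unexposed_noncases: int):
          """Odds ratio and approximate 95% CI (Woolf's method) for a 2x2 table."""
          or_ = (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)
          se_log = math.sqrt(1 / exposed_cases + 1 / exposed_noncases +
                             1 / unexposed_cases + 1 / unexposed_noncases)
          lo, hi = (math.exp(math.log(or_) + z * se_log) for z in (-1.96, 1.96))
          return or_, lo, hi

      # Hypothetical counts: poor working condition reported (exposed) vs. not,
      # against impaired well-being (case) vs. not impaired
      print(odds_ratio(exposed_cases=120, exposed_noncases=380,
                       unexposed_cases=60, unexposed_noncases=478))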

  12. Adaptive on-line estimation and control of overlay tool bias

    Science.gov (United States)

    Martinez, Victor M.; Finn, Karen; Edgar, Thomas F.

    2003-06-01

    Modern lithographic manufacturing processes rely on various types of exposure tools, used in a mix-and-match fashion. The motivation to use older tools alongside state-of-the-art tools is lower cost and one of the tradeoffs is a degradation in overlay performance. While average prices of semiconductor products continue to fall, the cost of manufacturing equipment rises with every product generation. Lithography processing, including the cost of ownership for tools, accounts for roughly 30% of the wafer processing costs, thus the importance of mix-and-match strategies. Exponentially Weighted Moving Average (EWMA) run-by-run controllers are widely used in the semiconductor manufacturing industry. This type of controller has been implemented successfully in volume manufacturing, improving Cpk values dramatically in processes like photolithography and chemical mechanical planarization. This simple, but powerful control scheme is well suited for adding corrections to compensate for Overlay Tool Bias (OTB). We have developed an adaptive estimation technique to compensate for overlay variability due to differences in the processing tools. The OTB can be dynamically calculated for each tool, based on the most recent measurements available, and used to correct the control variables. One approach to tracking the effect of different tools is adaptive modeling and control. The basic premise of an adaptive system is to change or adapt the controller as the operating conditions of the system change. Using closed-loop data, the adaptive control algorithm estimates the controller parameters using a recursive estimation technique. Once an updated model of the system is available, model-based control becomes feasible. In the simplest scenario, the control law can be reformulated to include the current state of the tool (or its estimate) to compensate dynamically for OTB. We have performed simulation studies to predict the impact of deploying this strategy in production. The results
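
    EWMA run-by-run control of the kind described above keeps a per-tool bias estimate that is updated after each measured lot and fed forward as a correction. A minimal Python sketch of that loop; the EWMA weight and the simulated tool offsets are illustrative assumptions, and the adaptive recursive estimator described in the paper is more elaborate:

      import random

      LAM = 0.3                                        # EWMA weight on the newest observation
      bias_estimate = {"tool_A": 0.0, "tool_B": 0.0}   # per-tool overlay bias estimates (nm)
      true_offset = {"tool_A": 6.0, "tool_B": -4.0}    # hidden offsets, used only for simulation

      random.seed(0)
      for lot in range(30):
          tool = random.choice(list(bias_estimate))
          correction = -bias_estimate[tool]            # feedforward correction applied to the lot
          measured = true_offset[tool] + correction + random.gauss(0, 1.0)
          # EWMA update of this tool's bias from the post-correction measurement
          bias_estimate[tool] = LAM * (measured - correction) + (1 - LAM) * bias_estimate[tool]

      # estimates approach the hidden true offsets (about 6.0 and -4.0)
      print({t: round(b, 2) for t, b in bias_estimate.items()})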

  13. Time-to-Compromise Model for Cyber Risk Reduction Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2005-09-01

    We propose a new model for estimating the time to compromise a system component that is visible to an attacker. The model provides an estimate of the expected value of the time-to-compromise as a function of known and visible vulnerabilities, and attacker skill level. The time-to-compromise random process model is a composite of three subprocesses associated with attacker actions aimed at the exploitation of vulnerabilities. In a case study, the model was used to aid in a risk reduction estimate between a baseline Supervisory Control and Data Acquisition (SCADA) system and the baseline system enhanced through a specific set of control system security remedial actions. For our case study, the total number of system vulnerabilities was reduced by 86% but the dominant attack path was through a component where the number of vulnerabilities was reduced by only 42% and the time-to-compromise of that component was increased by only 13% to 30% depending on attacker skill level.

  14. Development of pollution reduction strategies for Mexico City: Estimating cost and ozone reduction effectiveness

    Energy Technology Data Exchange (ETDEWEB)

    Thayer, G.R.; Hardie, R.W. [Los Alamos National Lab., NM (United States); Barrera-Roldan, A. [Instituto Mexicano de Petroleo, Mexico City (Mexico)

    1993-12-31

    This paper reports on the collection and preparation of data (costs and air quality improvement) for the strategic evaluation portion of the Mexico City Air Quality Research Initiative (MARI). Reports written for the Mexico City government by various international organizations were used to identify proposed options along with estimates of cost and emission reductions. Information from appropriate options identified by SCAQMD for Southern California was also used in the analysis. A linear optimization method was used to select a group of options or a strategy to be evaluated by decision analysis. However, the reduction of ozone levels is not a linear function of the reduction of hydrocarbon and NO{sub x} emissions. Therefore, a more detailed analysis was required for ozone. An equation for a plane on an isopleth calculated with a trajectory model was obtained using two endpoints that bracket the expected total ozone precursor reductions plus the starting concentrations for hydrocarbons and NO{sub x}. The relationship between ozone levels and the hydrocarbon and NO{sub x} concentrations was assumed to lie on this plane. This relationship was used in the linear optimization program to select the options comprising a strategy.
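
    The option-selection step can be illustrated with a small linear program: choose control options to meet hydrocarbon and NOx reduction targets at minimum cost. The options, costs, reductions, and targets below are invented placeholders rather than MARI data, and the ozone-plane constraint described in the abstract is omitted:

      from scipy.optimize import linprog

      # Decision variables: fractional adoption (0..1) of three hypothetical options.
      costs = [120.0, 300.0, 80.0]      # million USD per option, fully adopted
      hc_red = [40.0, 90.0, 10.0]       # hydrocarbon reduction, kt/yr per option
      nox_red = [10.0, 60.0, 25.0]      # NOx reduction, kt/yr per option

      # Minimize cost s.t. total HC reduction >= 70 and NOx reduction >= 45.
      # linprog uses A_ub @ x <= b_ub, so the ">=" constraints are negated.
      res = linprog(c=costs,
                    A_ub=[[-r for r in hc_red], [-r for r in nox_red]],
                    b_ub=[-70.0, -45.0],
                    bounds=[(0, 1)] * 3)
      print(res.x, res.fun)             # adoption levels of each option and minimum cost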

  15. COSTMODL: An automated software development cost estimation tool

    Science.gov (United States)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
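
    The kind of parametric model COSTMODL implements can be illustrated with a basic COCOMO-style effort equation. The coefficients below are the published basic-COCOMO organic-mode values, not COSTMODL's calibrated equations, and its Ada and incremental-development models are not reproduced here:

      def cocomo_basic(kloc: float, a: float = 2.4, b: float = 1.05,
                       c: float = 2.5, d: float = 0.38):
          """Basic COCOMO (organic mode): effort in person-months, schedule in months.
          Coefficients a, b, c, d would be recalibrated to local productivity data,
          which is the kind of customization COSTMODL supports."""
          effort = a * kloc ** b
          schedule = c * effort ** d
          return effort, schedule

      effort, schedule = cocomo_basic(32)      # hypothetical 32 KLOC project
      print(f"effort ≈ {effort:.0f} person-months, schedule ≈ {schedule:.1f} months")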

  16. Estimation and reduction of harmonic currents from power converters

    DEFF Research Database (Denmark)

    Asiminoaei, Lucian

    of estimation and reduction of the harmonic currents from industrial Adjustable Speed Drives, in order to come up with an optimized solution for customers. The work is structured in two main directions. The first direction consists of analyzing the mechanism of the harmonic current generation from ASD...... and harmonic rotating frame transformations, generalized harmonic integrators. Extensive simulations are employed in Matlab/Simulink and a laboratory stand is built to test the shunt APF topology. The actual implementation is done in fundamental rotating frame, where a new selective harmonic integrator....... Although the purpose of APF study was not finding a solution directly applicable for the Heat Power Station, there was an influence in the way the research line is further approached. As the cost of the APF is still relatively high, its utilization becomes more efficient for medium- or high...

  17. Risk Reduction with a Fuzzy Expert Exploration Tool

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, William W.; Broadhead, Ron

    2002-06-27

    In the first three years of the FEE Tool Project, an immense amount of data on the Delaware Basin has been accumulated: data on geology, structure, and production, regional information such as gravity, as well as local data such as well logs. This data, organized and cataloged into several online databases, is available for the Expert System and users as needed and as appropriate in analyzing production potential.

  18. Risk Reduction with a Fuzzy Expert Exploration Tool

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, William W.; Broadhead, Ron; Sung, Andrew

    2000-10-24

    This project developed an Artificial Intelligence system that drew upon a wide variety of information to provide realistic estimates of risk. ''Fuzzy logic,'' a system for integrating large amounts of inexact, incomplete information with modern computational methods to derive usable conclusions, was demonstrated as a cost-effective computational technology in many industrial applications.

  19. Tools and techniques for estimating high intensity RF effects

    Science.gov (United States)

    Zacharias, Richard L.; Pennock, Steve T.; Poggio, Andrew J.; Ray, Scott L.

    1992-01-01

    Tools and techniques for estimating and measuring coupling and component disturbance for avionics and electronic controls are described. A finite-difference-time-domain (FD-TD) modeling code, TSAR, used to predict coupling is described. This code can quickly generate a mesh model to represent the test object. Some recent applications as well as the advantages and limitations of using such a code are described. Facilities and techniques for making low-power coupling measurements and for making direct injection test measurements of device disturbance are also described. Some scaling laws for coupling and device effects are presented. A method for extrapolating these low-power test results to high-power full-system effects is presented.

  20. Energy Saving Melting and Revert Reduction Technology (E-SMARRT): Design Support for Tooling Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Dongtao

    2011-09-23

    High pressure die casting is an intrinsically efficient net shape process and improvements in energy efficiency are strongly dependent on design and process improvements that reduce scrap rates so that more of the total consumed energy goes into acceptable, usable castings. Computer simulation has become widely used within the industry but use is not universal. Further, many key design decisions must be made before the simulation can be run and expense in terms of money and time often limits the number of decision iterations that can be explored. This work continues several years of work creating simple, very fast design tools that can assist with early-stage design decisions so that the benefits of simulation can be maximized and, more importantly, so that the chances of first-shot success are maximized. First-shot success and better-running processes contribute to less scrap and significantly better energy utilization by the process. This new technology was predicted to result in an average energy savings of 1.83 trillion BTUs/year over a 10-year period. The current (2011) estimate of annual energy savings over a ten-year period, based on commercial introduction in 2012 and a market penetration of 30% by 2015, is 1.89 trillion BTUs/year by 2022. Along with these energy savings, the reduction of scrap and improvement in yield will reduce the environmental emissions associated with the melting and pouring of the metal that is saved as a result of this technology. The average annual estimate of CO2 reduction through 2022 is 0.037 Million Metric Tons of Carbon Equivalent (MM TCE).

  1. Estimating Hardness from the USDC Tool-Bit Temperature Rise

    Science.gov (United States)

    Bar-Cohen, Yoseph; Sherrit, Stewart

    2008-01-01

    A method of real-time quantification of the hardness of a rock or similar material involves measurement of the temperature, as a function of time, of the tool bit of an ultrasonic/sonic drill corer (USDC) that is being used to drill into the material. The method is based on the idea that, other things being about equal, the rate of rise of temperature and the maximum temperature reached during drilling increase with the hardness of the drilled material. In this method, the temperature is measured by means of a thermocouple embedded in the USDC tool bit near the drilling tip. The hardness of the drilled material can then be determined through correlation of the temperature-rise-versus-time data with time-dependent temperature rises determined in finite-element simulations of, and/or experiments on, drilling at various known rates of advance or known power levels through materials of known hardness. The figure presents an example of empirical temperature-versus-time data for a particular 3.6-mm USDC bit, driven at an average power somewhat below 40 W, drilling through materials of various hardness levels. The temperature readings from within a USDC tool bit can also be used for purposes other than estimating the hardness of the drilled material. For example, they can be especially useful as feedback to control the driving power to prevent thermal damage to the drilled material, the drill bit, or both. In the case of drilling through ice, the temperature readings could be used as a guide to maintaining sufficient drive power to prevent jamming of the drill by preventing refreezing of melted ice in contact with the drill.
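
    As an illustration of the correlation idea described above, the following Python sketch matches the measured initial heating rate of the bit against a calibration table built from materials of known hardness. The calibration values, the fit window, and the synthetic temperature trace are all hypothetical placeholders, not data from the USDC work.

    ```python
    import numpy as np

    # Hypothetical calibration: initial bit heating rates (deg C/s) measured at a
    # fixed drive power while drilling reference materials of known Mohs hardness.
    CAL_RATE = np.array([0.8, 1.6, 2.9, 4.5, 6.2])      # deg C / s
    CAL_HARDNESS = np.array([2.0, 3.0, 4.5, 6.0, 7.0])  # Mohs

    def estimate_hardness(times_s, temps_c, fit_window_s=10.0):
        """Estimate hardness from the initial slope of the bit temperature curve."""
        times_s = np.asarray(times_s, dtype=float)
        temps_c = np.asarray(temps_c, dtype=float)
        mask = times_s <= times_s[0] + fit_window_s
        rate = np.polyfit(times_s[mask], temps_c[mask], 1)[0]  # heating rate, deg C/s
        # Interpolate hardness from the calibration curve (clipped to its range).
        return float(np.interp(rate, CAL_RATE, CAL_HARDNESS))

    # Example with a synthetic thermocouple trace sampled once per second.
    t = np.arange(0.0, 30.0, 1.0)
    T = 22.0 + 3.0 * t * np.exp(-t / 60.0)
    print(estimate_hardness(t, T))
    ```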

  2. HydrogeoEstimatorXL: an Excel-based tool for estimating hydraulic gradient magnitude and direction

    Science.gov (United States)

    Devlin, J. F.; Schillig, P. C.

    2017-01-01

    HydrogeoEstimatorXL is a free software tool for the interpretation of flow systems based on spatial hydrogeological field data from multi-well networks. It runs on the familiar Excel spreadsheet platform. The program accepts well location coordinates and hydraulic head data, and returns an analysis of the area flow system in two dimensions based on (1) a single best fit plane of the potentiometric surface and (2) three-point estimators, i.e., well triplets assumed to bound planar sections of the potentiometric surface. The software produces graphical outputs including histograms of hydraulic gradient magnitude and direction, groundwater velocity (based on a site average hydraulic properties), as well as mapped renditions of the estimator triangles and the velocity vectors associated with them. Within the software, a transect can be defined and the mass discharge of a groundwater contaminant crossing the transect can be estimated. This kind of analysis is helpful in gaining an overview of a site's hydrogeology, for problem definition, and as a review tool to check the reasonableness of other independent calculations.
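
    The three-point estimator mentioned above amounts to fitting a plane through the heads of a well triplet. The Python sketch below shows that calculation; it is not part of HydrogeoEstimatorXL (an Excel tool), and the coordinates and heads are invented for illustration.

    ```python
    import numpy as np

    def gradient_from_triplet(wells):
        """Hydraulic gradient from three wells given as (x, y, head) tuples.

        Fits the plane h = a*x + b*y + c through the three points; (a, b) is the
        gradient vector and groundwater flow points in the down-gradient direction.
        """
        (x1, y1, h1), (x2, y2, h2), (x3, y3, h3) = wells
        A = np.array([[x1, y1, 1.0], [x2, y2, 1.0], [x3, y3, 1.0]])
        a, b, _ = np.linalg.solve(A, np.array([h1, h2, h3]))
        magnitude = float(np.hypot(a, b))
        # Down-gradient flow azimuth, degrees clockwise from north.
        azimuth = float(np.degrees(np.arctan2(-a, -b))) % 360.0
        return magnitude, azimuth

    # Example triplet: coordinates in metres, heads in metres.
    wells = [(0.0, 0.0, 10.00), (100.0, 0.0, 9.80), (0.0, 100.0, 9.90)]
    print(gradient_from_triplet(wells))  # ~0.0022 m/m, flow toward ~63 deg (ENE)
    ```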

  3. Optical Coherence Tomography as a Tool for Ocular Dynamics Estimation

    Directory of Open Access Journals (Sweden)

    Damian Siedlecki

    2015-01-01

    Full Text Available Purpose. The aim of the study is to demonstrate that the ocular dynamics of the anterior chamber of the eye can be estimated quantitatively by means of optical coherence tomography (OCT. Methods. A commercial high speed, high resolution optical coherence tomographer was used. The sequences of tomographic images of the iridocorneal angle of three subjects were captured and each image from the sequence was processed in MATLAB environment in order to detect and identify the contours of the cornea and iris. The data on pulsatile displacements of the cornea and iris and the changes of the depth of the gap between them were retrieved from the sequences. Finally, the spectral analysis of the changes of these parameters was performed. Results. The results of the temporal and spectral analysis manifest the ocular microfluctuation that might be associated with breathing (manifested by 0.25 Hz peak in the power spectra, heart rate (1–1.5 Hz peak, and ocular hemodynamics (3.75–4.5 Hz peak. Conclusions. This paper shows that the optical coherence tomography can be used as a tool for noninvasive estimation of the ocular dynamics of the anterior segment of the eye, but its usability in diagnostics of the ocular hemodynamics needs further investigations.
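
    A minimal sketch of the spectral-analysis step described above: a displacement trace is detrended and transformed with an FFT, and the dominant peaks are read off. The sampling rate, component frequencies, and amplitudes below are synthetic stand-ins, not values from the study.

    ```python
    import numpy as np

    def displacement_spectrum(displacement_um, fs_hz):
        """One-sided amplitude spectrum of a corneal/iris displacement trace."""
        x = np.asarray(displacement_um, dtype=float)
        x = x - x.mean()
        freqs = np.fft.rfftfreq(x.size, d=1.0 / fs_hz)
        amplitude = np.abs(np.fft.rfft(x)) * 2.0 / x.size
        return freqs, amplitude

    # Synthetic 30 s trace at 25 Hz: respiration (0.25 Hz), pulse (1.2 Hz),
    # a weak hemodynamic component (4.0 Hz), plus measurement noise.
    fs = 25.0
    t = np.arange(0.0, 30.0, 1.0 / fs)
    x = (3.0 * np.sin(2 * np.pi * 0.25 * t) + 1.0 * np.sin(2 * np.pi * 1.2 * t)
         + 0.3 * np.sin(2 * np.pi * 4.0 * t) + 0.2 * np.random.randn(t.size))
    freqs, amp = displacement_spectrum(x, fs)
    print(freqs[np.argsort(amp)[-3:]])  # frequencies of the three largest peaks
    ```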

  4. Application of the loss estimation tool QLARM in Algeria

    Science.gov (United States)

    Rosset, P.; Trendafiloski, G.; Yelles, K.; Semmane, F.; Wyss, M.

    2009-04-01

    During the last six years, WAPMERR has used Quakeloss for real-time loss estimation for more than 440 earthquakes worldwide. Loss reports, posted with an average delay of 30 minutes, include a map showing the average degree of damage in settlements near the epicenter, the total number of fatalities, the total number of injured, and a detailed list of casualties and damage rates in these settlements. After the M6.7 Boumerdes earthquake in 2003, we reported 1690-3660 fatalities. The official death toll was around 2270. Since the El Asnam earthquake, seismic events in Algeria have killed about 6,000 people, injured more than 20,000 and left more than 300,000 homeless. On average, one earthquake with the potential to kill people (M>5.4) happens every three years in Algeria. In the framework of a collaborative project between WAPMERR and CRAAG, we propose to calibrate our new loss estimation tool QLARM (qlarm.ethz.ch) and estimate human losses for likely future earthquakes in Algeria. The parameters needed for this calculation are the following: (1) a ground motion relation and soil amplification factors, (2) the distribution of building stock and population into vulnerability classes of the European Macroseismic Scale (EMS-98) as given in the PAGER database, and (3) population by settlement. Considering the resolution of the available data, we construct (1) point city models for cases where only summary data for the city are available and (2) discrete city models when data regarding city districts are available. Damage and losses are calculated using (a) vulnerability models pertinent to EMS-98 vulnerability classes, previously validated against the existing ones in Algeria (Tipaza and Chlef), (b) building collapse models pertinent to Algeria as given in the World Housing Encyclopedia, and (c) casualty matrices pertinent to EMS-98 vulnerability classes assembled from HAZUS casualty rates. As a first trial, we simulated the 2003 Boumerdes earthquake to check the validity of the proposed

  5. The Lives Saved Tool (LiST) as a model for diarrhea mortality reduction

    Science.gov (United States)

    2014-01-01

    Background Diarrhea is a leading cause of morbidity and mortality among children under five years of age. The Lives Saved Tool (LiST) is a model used to calculate deaths averted or lives saved by past interventions and for the purposes of program planning when costly and time consuming impact studies are not possible. Discussion LiST models the relationship between coverage of interventions and outputs, such as stunting, diarrhea incidence and diarrhea mortality. Each intervention directly prevents a proportion of diarrhea deaths such that the effect size of the intervention is multiplied by coverage to calculate lives saved. That is, the maximum effect size could be achieved at 100% coverage, but at 50% coverage only 50% of possible deaths are prevented. Diarrhea mortality is one of the most complex causes of death to be modeled. The complexity is driven by the combination of direct prevention and treatment interventions as well as interventions that operate indirectly via the reduction in risk factors, such as stunting and wasting. Published evidence is used to quantify the effect sizes for each direct and indirect relationship. Several studies have compared measured changes in mortality to LiST estimates of mortality change looking at different sets of interventions in different countries. While comparison work has generally found good agreement between the LiST estimates and measured mortality reduction, where data availability is weak, the model is less likely to produce accurate results. LiST can be used as a component of program evaluation, but should be coupled with more complete information on inputs, processes and outputs, not just outcomes and impact. Summary LiST is an effective tool for modeling diarrhea mortality and can be a useful alternative to large and expensive mortality impact studies. Predicting the impact of interventions or comparing the impact of more than one intervention without having to wait for the results of large and expensive
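
    The coverage-times-effect-size logic described above can be illustrated in a few lines of Python. The effect sizes and coverage increases below are hypothetical, and applying interventions sequentially to the residual deaths is a simplification of the full LiST model.

    ```python
    def lives_saved(baseline_deaths, interventions):
        """Apply each intervention to the residual deaths.

        interventions: list of (effect_size, coverage_increase) pairs, where
        effect_size is the fraction of deaths prevented at 100% coverage.
        """
        residual = float(baseline_deaths)
        saved_total = 0.0
        for effect_size, coverage in interventions:
            saved = residual * effect_size * coverage
            saved_total += saved
            residual -= saved
        return saved_total, residual

    # Hypothetical example: 10,000 diarrhea deaths; one intervention scaled up by
    # 30 percentage points (effect size 0.9), another by 20 points (effect size 0.7).
    saved, remaining = lives_saved(10_000, [(0.9, 0.30), (0.7, 0.20)])
    print(round(saved), round(remaining))
    ```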

  6. New Swift UVOT data reduction tools and AGN variability studies

    Science.gov (United States)

    Gelbord, Jonathan; Edelson, Rick

    2017-08-01

    The efficient slewing and flexible scheduling of the Swift observatory have made it possible to conduct monitoring campaigns that are both intensive and prolonged, with multiple visits per day sustained over weeks and months. Recent Swift monitoring campaigns of a handful of AGN provide simultaneous optical, UV and X-ray light curves that can be used to measure variability and interband correlations on timescales from hours to months, providing new constraints for the structures within AGN and the relationships between them. However, the first of these campaigns, thrice-per-day observations of NGC 5548 through four months, revealed anomalous dropouts in the UVOT light curves (Edelson, Gelbord, et al. 2015). We identified the cause as localized regions of reduced detector sensitivity that are not corrected by standard processing. Properly interpreting the light curves required identifying and screening out the affected measurements. We are now using archival Swift data to better characterize these low sensitivity regions. Our immediate goal is to produce a more complete mapping of their locations so that affected measurements can be identified and screened before further analysis. Our longer-term goal is to build a more quantitative model of the effect in order to define a correction for measured fluxes, if possible, or at least to put limits on the impact upon any observation. We will combine data from numerous background stars in well-monitored fields in order to quantify the strength of the effect as a function of filter as well as location on the detector, and to test for other dependencies such as evolution over time or sensitivity to the count rate of the target. Our UVOT sensitivity maps and any correction tools will be provided to the community of Swift users.

  7. Carbon Dioxide Emissions Reduction Estimates: Potential Use of ...

    African Journals Online (AJOL)

    User

    with ethanol to determine CO2 emissions reduction for 1998−2007, and thereafter making emissions ... Keywords: Carbon dioxide; Transport; Biofuel; Gasoline−ethanol blends; ..... Mechanical Engineers, Part A: Journal of Power and Energy.

  8. Polynomial Estimates for c-functions on Reductive Symmetric Spaces

    DEFF Research Database (Denmark)

    van den Ban, Erik; Schlichtkrull, Henrik

    2012-01-01

    The c-functions, related to a reductive symmetric space G/H and a fixed representation τ of a maximal compact subgroup K of G, are shown to satisfy polynomial bounds in imaginary directions.

  9. Reduction of CO2 Emissions Due to Wind Energy - Methods and Issues in Estimating Operational Emission Reductions

    Energy Technology Data Exchange (ETDEWEB)

    Holttinen, Hannele; Kiviluoma, Juha; McCann, John; Clancy, Matthew; Millgan, Michael; Pineda, Ivan; Eriksen, Peter Borre; Orths, Antje; Wolfgang, Ove

    2015-10-05

    This paper presents ways of estimating the CO2 reductions of wind power using different methodologies. Estimates based on historical data have more pitfalls in methodology than estimates based on dispatch simulations. Taking into account the exchange of electricity with neighboring regions is challenging for all methods. Results for CO2 emission reductions are shown from several countries. Wind power will reduce emissions by about 0.3-0.4 tCO2/MWh when replacing mainly gas and by up to 0.7 tCO2/MWh when replacing mainly coal powered generation. The paper focuses on CO2 emissions from the power system operation phase, but long term impacts are briefly discussed.
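
    A back-of-the-envelope version of this calculation is simply the wind generation multiplied by the emission factor of the displaced generation. The sketch below uses the displacement factors quoted above and an assumed 10 TWh of annual wind generation.

    ```python
    def co2_reduction_mt(wind_generation_twh, displacement_tco2_per_mwh):
        """Annual CO2 reduction in million tonnes for a given wind generation."""
        mwh = wind_generation_twh * 1e6
        return mwh * displacement_tco2_per_mwh / 1e6

    # 10 TWh of wind displacing mostly gas (~0.4 tCO2/MWh) or mostly coal (~0.7 tCO2/MWh).
    print(co2_reduction_mt(10, 0.4), co2_reduction_mt(10, 0.7))  # 4.0 and 7.0 Mt
    ```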

  10. SPERCS-A tool for environmental emission estimation.

    Science.gov (United States)

    Reihlen, Antonia; Bahr, Tobias; Bögi, Christian; Dobe, Christopher; May, Thomas; Verdonck, Frederik; Wind, Thorsten; Zullo, Lorenzo; Tolls, Johannes

    2016-10-01

    The European Union (EU) chemicals regulation Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) requires a hazardous substance registration to identify the uses of a substance and the corresponding conditions of safe use. This requirement includes a human and an environmental safety assessment. Exposure scenarios are developed and employed for estimating emissions resulting from the uses of hazardous substances. To support the environmental assessments, the REACH guidance documents define 22 environmental release categories (ERCs) with conservative release factors (RFs) to water, air, and soil. Several industry associations target the ERCs to more specific uses and respective emission scenarios to enable more realistic emission estimations. They have developed more than 190 specific ERCs (SPERCs) as standardized descriptions of operational conditions (OCs) and risk management measures (RMMs). SPERCs reflect the current good practice and are documented in factsheets. These factsheets contain the information necessary for environmental emission modeling. Key parameters are the substance use rate, the efficiency of the risk management measures (if applicable), and the RFs. These parameters can be based on literature or measured company data or are justified by qualitative arguments. The majority of SPERCs have been implemented as realistic worst-case emission values in screening-level chemical safety assessment (CSA) tools. Three regulatory reviews in Europe have established requirements for documenting the SPERCs and for justifying the RFs. In addition, each of the reviews included recommendations for improving the SPERCs. The latest review proposed a condensed factsheet that focuses on the essentials for exposure assessment and subsequent communication in safety data sheets. It is complemented with a background document for providing details on the emission scenarios and justifications. In the EU the SPERCs will be further progressed in a

  11. Airborne LIDAR as a tool for estimating inherent optical properties

    Science.gov (United States)

    Trees, Charles; Arnone, Robert

    2012-06-01

    LIght Detection and Ranging (LIDAR) systems have been used most extensively to generate elevation maps of land, ice and coastal bathymetry. There have been space-, airborne- and land-based LIDAR systems, and they have also been used in underwater communication. What has not been investigated is the capability of LIDARs to measure ocean temperature and optical properties vertically in the water column, individually or simultaneously. The practical use of bathymetric LIDAR as a tool for the estimation of inherent optical properties remains one of the most challenging problems in the field of optical oceanography. LIDARs can retrieve data as deep as 3-4 optical depths (e.g. optical properties can be measured through the thermocline for ~70% of the world's oceans). Similar to AUVs (gliders), UAV-based LIDAR systems will increase temporal and spatial measurements by several orders of magnitude. The LIDAR Observations of Optical and Physical Properties (LOOPP) Conference was held at NURC (2011) to review past, current and future LIDAR research efforts in retrieving water column optical/physical properties. This new observational platform/sensor system is ideally suited for ground truthing hyperspectral/geostationary satellite data in coastal regions and for model data assimilation.

  12. Application of Krylov Reduction Technique for a Machine Tool Multibody Modelling

    Directory of Open Access Journals (Sweden)

    M. Sulitka

    2014-02-01

    Full Text Available Quick calculation of machine tool dynamic response represents one of the major requirements for machine tool virtual modelling and virtual machining, aiming at simulating the machining process performance, quality, and precision of a workpiece. Enhanced time effectiveness in machine tool dynamic simulations may be achieved by employing model order reduction (MOR) techniques on the full finite element (FE) models. The paper provides a case study aimed at comparing the Krylov subspace based and mode truncation techniques. The application of both reduction techniques for creating a machine tool multibody model is evaluated. The Krylov subspace reduction technique shows high quality in terms of the dynamic properties of the reduced multibody model while at the same time placing very low demands on computation time.
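
    As a generic illustration of Krylov subspace model order reduction (not the authors' implementation), the sketch below performs a one-sided Arnoldi projection that matches moments of the transfer function at s = 0; the state-space matrices are random stand-ins for a full FE model.

    ```python
    import numpy as np

    def krylov_reduce(A, b, c, r):
        """Reduce dx/dt = A x + b u, y = c x to order r by projecting onto the
        Krylov subspace span{A^-1 b, A^-2 b, ..., A^-r b} (moment matching at s = 0)."""
        n = A.shape[0]
        V = np.zeros((n, r))
        v = np.linalg.solve(A, b)
        for k in range(r):
            for j in range(k):                      # modified Gram-Schmidt
                v = v - (V[:, j] @ v) * V[:, j]
            V[:, k] = v / np.linalg.norm(v)
            v = np.linalg.solve(A, V[:, k])
        return V.T @ A @ V, V.T @ b, c @ V

    # Tiny example: reduce a random stable 50-state model to 8 states.
    rng = np.random.default_rng(0)
    A = -5.0 * np.eye(50) + rng.normal(0.0, 0.5, (50, 50))
    b, c = rng.normal(size=50), rng.normal(size=50)
    Ar, br, cr = krylov_reduce(A, b, c, 8)
    print(Ar.shape, br.shape, cr.shape)
    ```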

  13. A novel TOA estimation method with effective NLOS error reduction

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yi-heng; CUI Qi-mei; LI Yu-xiang; ZHANG Ping

    2008-01-01

    It is well known that non-line-of-sight (NLOS) error has been the major factor impeding the enhancement of accuracy for time of arrival (TOA) estimation and wireless positioning. This article proposes a novel method of TOA estimation that effectively reduces the NLOS error by 60%, compared with the traditional timing and synchronization method. By constructing orthogonal training sequences, this method converts the traditional TOA estimation to the detection of the first arrival path (FAP) in the NLOS multipath environment, and then estimates the TOA by the round-trip transmission (RTT) technology. Both theoretical analysis and numerical simulations prove that the method proposed in this article achieves better performance than the traditional methods.

  14. Discrete derivative estimation in LISA Pathfinder data reduction

    CERN Document Server

    Ferraioli, Luigi; Vitale, Stefano

    2009-01-01

    Data analysis for the LISA Technology Package (LTP) experiment to be flown aboard the LISA Pathfinder mission requires the solution of the system dynamics for the calculation of the force acting on the test masses (TMs) starting from interferometer position data. The need for a solution to this problem has prompted us to implement a discrete time domain derivative estimator suited to the LTP experiment requirements. We first report on the mathematical procedures for the definition of two methods: the first based on a parabolic fit approximation and the second based on a Taylor series expansion. These two methods are then generalized and incorporated into a more general class of five-point discrete derivative estimators. The same procedure employed for the second derivative can be applied to the estimation of the first derivative and of a data smoother, allowing the definition of a class of simple five-point estimators for both. The performances of three particular realizations of the five-point second derivative estimat...
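
    A minimal sketch of a five-point second-derivative estimator based on a local parabolic fit, in the spirit of the first method described above; the sampling interval and test signal are invented for illustration.

    ```python
    import numpy as np

    def second_derivative_5pt(y, dt):
        """Five-point second-derivative estimate via a local least-squares parabola."""
        y = np.asarray(y, dtype=float)
        offsets = np.arange(-2, 3) * dt
        d2 = np.full(y.size, np.nan)
        for i in range(2, y.size - 2):
            a, _, _ = np.polyfit(offsets, y[i - 2:i + 3], 2)  # a*t^2 + b*t + c
            d2[i] = 2.0 * a
        return d2

    # Position data x(t) = 0.5*a*t^2 should return the constant acceleration a = 3.
    dt = 0.1
    t = np.arange(0.0, 5.0, dt)
    x = 0.5 * 3.0 * t ** 2
    print(second_derivative_5pt(x, dt)[2:-2].round(6))
    ```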

  15. Estimation of the Postmortem Interval by Measuring Blood Oxidation-reduction Potential Values

    Directory of Open Access Journals (Sweden)

    Zhuqing Jiang

    2016-01-01

    Full Text Available Accurate estimation of the postmortem interval (PMI) is an important task in forensic practice. In the last half-century, the use of postmortem biochemistry has become an important ancillary method in determining the time of death. The present study was carried out to determine the correlation between blood oxidation-reduction potential (ORP) values and PMIs, and to develop a three-dimensional surface equation to estimate the PMI under various temperature conditions. A total of 48 rabbits were placed into six groups and sacrificed by air embolism. Blood was obtained from the right ventricle of each rabbit, and specimens were stored at 10°C, 15°C, 20°C, 25°C, 30°C, and 35°C. At different PMIs (once every 4 h), the blood ORP values were measured using a PB-21 electrochemical analyzer. Statistical analysis and curve fitting of the data yielded cubic polynomial regression equations and a surface equation at different temperatures. Results: The results showed that there was a strong positive correlation between the blood ORP values at different temperatures and the PMI. This study provides another example of using a three-dimensional surface equation as a tool to estimate the PMI under various temperature conditions.
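
    The cubic-regression-and-inversion idea can be sketched in a few lines of Python; the calibration points below are hypothetical placeholders rather than the rabbit ORP measurements, and a real application would use the temperature-dependent surface equation developed in the study.

    ```python
    import numpy as np

    def fit_pmi_curve(pmi_hours, orp_mv):
        """Cubic polynomial fit of ORP versus PMI at one storage temperature."""
        return np.polyfit(pmi_hours, orp_mv, 3)

    def estimate_pmi(orp_value, coeffs, search_hours=(0.0, 48.0)):
        """Invert the fitted cubic numerically to estimate PMI from an ORP reading."""
        pmi_grid = np.linspace(*search_hours, 1000)
        orp_grid = np.polyval(coeffs, pmi_grid)
        return float(pmi_grid[np.argmin(np.abs(orp_grid - orp_value))])

    # Hypothetical calibration data at a single temperature (illustration only).
    pmi = np.array([4, 8, 12, 16, 20, 24, 28, 32], dtype=float)
    orp = np.array([-30, -12, 2, 15, 25, 34, 40, 45], dtype=float)
    coeffs = fit_pmi_curve(pmi, orp)
    print(estimate_pmi(20.0, coeffs))
    ```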

  16. SBAT: A Tool for Estimating Metal Bioaccessibility in Soils

    Energy Technology Data Exchange (ETDEWEB)

    Heuscher, S.A.

    2004-04-21

    Heavy metals such as chromium and arsenic are widespread in the environment due to their usage in many industrial processes. These metals may pose significant health risks to humans, especially children, due to their mutagenic and carcinogenic properties. Typically, the health risks associated with the ingestion of soil-bound metals are estimated by assuming that the metals are completely absorbed through the human intestinal tract (100% bioavailable). This assumption potentially overestimates the risk since soils are known to strongly sequester metals thereby potentially lowering their bioavailability. Beginning in 2000, researchers at Oak Ridge National Laboratory, with funding from the Strategic Environmental Research and Development Program (SERDP), studied the effect of soil properties on the bioaccessibility of soil-bound arsenic and chromium. Representative A and upper-B horizons from seven major U.S. soil orders were obtained from the U.S. Department of Agriculture's National Resources Conservation Service and the U.S. Department of Energy's Oak Ridge Reservation. The soils were spiked with known concentrations of arsenic (As(III) and As(V)) and chromium (Cr(III) and Cr(VI)), and the bioaccessibility was measured using a physiologically based extraction test that mimics the gastric activity of children. Linear regression models were then developed to relate the bioaccessibility measurements to the soil properties (Yang et al. 2002; Stewart et al. 2003a). Important results from these publications and other studies include: (1) Cr(VI) and As(III) are more toxic and bioavailable than Cr(III) and As(V) respectively. (2) Several favorable processes can occur in soils that promote the oxidation of As(III) to As(V) and the reduction of Cr(VI) to Cr(III), thereby lowering bioaccessibility. Iron and manganese oxides are capable of oxidizing As(III) to As(V), whereas organic matter and Fe(II)-bearing minerals are capable of reducing Cr(VI) to Cr(III). (3

  17. Conceptual Study of OFDM-Coding, PAPR Reduction, Channel Estimation

    Directory of Open Access Journals (Sweden)

    S S Riya Rani

    2014-06-01

    Full Text Available At present, for high data rate transmission, Orthogonal Frequency Division Multiplexing (OFDM), which is one of the multi-carrier modulation (MCM) techniques, offers considerable spectral efficiency, multipath delay spread tolerance, immunity to frequency selective fading channels, and power efficiency. As a result, OFDM has been widely deployed in many wireless communication standards such as Digital Video Broadcasting (DVB). By using turbo codes, OFDM performance can be sufficiently improved, as seen in LTE standard systems. One of the challenging issues for an Orthogonal Frequency Division Multiplexing (OFDM) system is its high Peak-to-Average Power Ratio (PAPR). In this paper we present turbo coded OFDM systems, their channel estimation scheme, and methods for reducing PAPR in the system.

  18. MURMoT. Design and Application of Microbial Uranium Reduction Monitoring Tools

    Energy Technology Data Exchange (ETDEWEB)

    Loeffler, Frank E. [Univ. of Tennessee, Knoxville, TN (United States)

    2014-12-31

    Uranium (U) contamination in the subsurface is a major remediation challenge at many DOE sites. Traditional site remedies present enormous costs to DOE; hence, enhanced bioremediation technologies (i.e., biostimulation and bioaugmentation) combined with monitoring efforts are being considered as cost-effective corrective actions to address subsurface contamination. This research effort improved understanding of the microbial U reduction process and developed new tools for monitoring microbial activities. Application of these tools will promote science-based site management decisions that achieve contaminant detoxification, plume control, and long-term stewardship in the most efficient manner. The overarching hypothesis was that the design, validation and application of a suite of new molecular and biogeochemical tools advance process understanding, and improve environmental monitoring regimes to assess and predict in situ U immobilization. Accomplishments: This project (i) advanced nucleic acid-based approaches to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-detoxifying bacteria; (ii) developed proteomics workflows for detection of metal reduction biomarker proteins in laboratory cultures and contaminated site groundwater; (iii) developed and demonstrated the utility of U isotopic fractionation using high precision mass spectrometry to quantify U(VI) reduction for a range of reduction mechanisms and environmental conditions; and (iv) validated the new tools using field samples from U-contaminated IFRC sites, and demonstrated their prognostic and diagnostic capabilities in guiding decision making for environmental remediation and long-term site stewardship.

  19. Evaluation tools for the effectiveness of infrared countermeasures and signature reduction for ships

    NARCIS (Netherlands)

    Schoemaker, R.M.; Schleijpen, H.M.A.

    2010-01-01

    The protection of ships against infrared guided missiles is a concern for modern naval forces. The vulnerability of ships can be reduced by applying countermeasures such as infrared decoys and infrared signature reduction. This paper will present a set of simulation tools which can be used for

  20. MURMoT. Design and Application of Microbial Uranium Reduction Monitoring Tools

    Energy Technology Data Exchange (ETDEWEB)

    Loffler, Frank E. [Univ. of Tennessee, Knoxville, TN (United States); Ritalahti, Kirsti [Univ. of Tennessee, Knoxville, TN (United States); Sanford, Robert A. [Univ. of Illinois at Urbana-Champaign, IL (United States); Lundstrom, Craig C. [Univ. of Illinois at Urbana-Champaign, IL (United States); Johnson, Thomas M. [Univ. of Illinois at Urbana-Champaign, IL (United States); Kemner, Kenneth [Argonne National Lab. (ANL), Argonne, IL (United States); Boyanov, Maxim [Argonne National Lab. (ANL), Argonne, IL (United States)

    2009-07-01

    Uranium (U) contamination in the subsurface is a major remediation challenge at many DOE sites. Traditional site remedies present enormous costs to DOE; hence, enhanced bioremediation technologies (i.e., biostimulation and bioaugmentation) combined with monitoring efforts are being considered as cost-effective corrective actions to address subsurface contamination. This research effort improved understanding of the microbial U reduction process and developed new tools for monitoring microbial activities. Application of these tools will promote science-based site management decisions that achieve contaminant detoxification, plume control, and long-term stewardship in the most efficient manner. The overarching hypothesis was that the design, validation and application of a suite of new molecular and biogeochemical tools advance process understanding, and improve environmental monitoring regimes to assess and predict in situ U immobilization. Accomplishments: This project (i) advanced nucleic acid-based approaches to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-detoxifying bacteria; (ii) developed proteomics workflows for detection of metal reduction biomarker proteins in laboratory cultures and contaminated site groundwater; (iii) developed and demonstrated the utility of U isotopic fractionation using high precision mass spectrometry to quantify U(VI) reduction for a range of reduction mechanisms and environmental conditions; and (iv) validated the new tools using field samples from U-contaminated IFRC sites, and demonstrated their prognostic and diagnostic capabilities in guiding decision making for environmental remediation and long-term site stewardship.

  1. Presenting LiteRed: a tool for the Loop InTEgrals REDuction

    CERN Document Server

    Lee, R N

    2012-01-01

    The Mathematica package LiteRed is described. It performs a heuristic search for symbolic IBP reduction rules for loop integrals. It also implements several convenient tools for the search of symmetry relations, the construction of differential equations, and dimensional recurrence relations.

  2. Estimating the Reduction of Generating System CO2 Emissions Resulting from Significant Wind Energy Penetration

    Energy Technology Data Exchange (ETDEWEB)

    Holttinen, Hannele; Kiviluoma, Juha; Pineda, Ivan; McCann, John; Clancy, Matthew; Milligan, Michael

    2014-11-13

    This paper presents ways of estimating CO2 reductions of wind power using different methodologies. The paper discusses pitfalls in methodology and proposes appropriate methods to perform the calculations. Results for CO2 emission reductions are shown from several countries. This paper is an international collaboration of IEA Wind Task 25 on wind integration.

  3. Development of a customised design flood estimation tool to ...

    African Journals Online (AJOL)

    The objectives of this study were: (i) to review the methods currently used for design ... the latest design rainfall information and recognised estimation methods used in ... methods in gauged catchments with areas ranging from 100 km² to 10 000 ... of the catchment sizes and return periods, except for the 2-year return period.

  4. Cost Estimating Cases: Educational Tools for Cost Analysts

    Science.gov (United States)

    1993-09-01

  5. Development of a customised design flood estimation tool to ...

    African Journals Online (AJOL)

    2011-03-07

    Mar 7, 2011 ... design flood estimation methods available in South Africa. The objectives of this study ... catchment parameters (e.g. average catchment and main water- course slopes .... and discharge into the Orange-Vaal River drainage system. (Midgley et al. ...... DFET to provide some additional support in the decision-.

  6. A generic tool for cost estimating in aircraft design

    NARCIS (Netherlands)

    Castagne, S.; Curran, R.; Rothwell, A.; Price, M.; Benard, E.; Raghunathan, S.

    2008-01-01

    A methodology to estimate the cost implications of design decisions by integrating cost as a design parameter at an early design stage is presented. The model is developed on a hierarchical basis, the manufacturing cost of aircraft fuselage panels being analysed in this paper. The manufacturing cost

  7. The national-level nutrient loading estimation tool for Finland: WSFS-Vemala

    Science.gov (United States)

    Huttunen, Markus; Huttunen, Inese; Korppoo, Marie; Seppänen, Vanamo; Vehviläinen, Bertel

    2013-04-01

    The WSFS-Vemala tool has been developed for the estimation of nutrient loading to rivers and lakes in Finland and to the Baltic Sea. The tool includes total phosphorus, total nitrogen, suspended solids and total organic carbon. WSFS-Vemala provides, for each of the roughly 58 000 lakes in Finland, an estimate of the nutrient concentration in the lake, the incoming and outgoing nutrient load, and a division of the incoming load by source, i.e. agriculture, forests and forestry, scattered dwellings and point sources. The aim of the tool is especially to answer the needs arising from the practical implementation of the WFD. For that purpose, the WSFS-Vemala tool provides an estimate of the present state of the lake using nutrient concentrations, an understanding of the reasons explaining the state of the lake by presenting a division of the loading by sources, and finally scenarios for the future state and loading of the lake under different load reduction options. The WSFS-Vemala tool is based on a modeling system which includes the simulation of hydrology, nutrient leaching from fields and forests, and nutrient transport in rivers and lakes. The hydrological simulation is based on the WSFS system, which simulates the hydrological cycle on a daily time step using daily precipitation and temperature. The simulated components are snow accumulation and melt, soil moisture, evaporation, groundwater flow and runoff, and discharges and water levels of rivers and lakes. The remote sensing data used in the model include satellite data of snow coverage and snow water equivalent and precipitation from weather radars. Since agriculture is the main source of nutrient loading, fields are described in detail. Slope profile, crop and soil type data are described for each of the 1 100 000 fields in Finland, which cover 2 450 000 hectares. For phosphorus leaching and erosion simulations, the field scale Icecream model is applied. In the Icecream model farming practices, fertilization, crop growth

  8. Ecohealth System Dynamic Model as a Planning Tool for the Reduction of Breeding Sites

    Science.gov (United States)

    Respati, T.; Raksanagara, A.; Djuhaeni, H.; Sofyan, A.; Shandriasti, A.

    2017-03-01

    Dengue is still one of the major health problems in Indonesia. Dengue transmission is influenced by the dengue prevention and eradication program, community participation, the housing environment and climate. The complexity of the disease coupled with limited resources necessitates a different approach to prevention that includes the factors contributing to transmission. One way to prevent dengue transmission is by reducing the mosquito's breeding sites. Four factors suspected to influence breeding sites are the dengue prevention and eradication program, community participation, housing environment, and weather conditions. In order to have an effective program for reducing breeding sites, a model is needed which can predict the existence of breeding sites while the four factors under study are controlled. The objective of this study is to develop an Ecohealth model using system dynamics as a planning tool for the reduction of breeding sites to prevent dengue transmission with regard to the dengue prevention and eradication program, community participation, housing environment, and weather conditions. The methodology is a mixed method study using a sequential exploratory design. The study comprised 3 stages: first, a qualitative study of 14 respondents using in-depth interviews and 6 respondents in a focus group discussion. The results from the first stage were used to develop entomology and household survey questionnaires for the second stage, conducted in 2036 households across 12 sub districts in Bandung City. The Ecohealth system dynamic model was developed using data from the first and second stages. The analyses used are thematic analysis for qualitative data; spatial analysis, generalized estimating equations (GEE) and structural equation modeling for quantitative data; and average mean error (AME) and average variance error (AVE) for dynamic system model validation. The system dynamic model showed that the most effective approach to eliminate breeding places was by ensuring the availability

  9. Behaviour characteristics estimation tool of genetic distance between sheep breeds

    Directory of Open Access Journals (Sweden)

    Eko Handiwirawan

    2014-12-01

    Full Text Available Information on the estimation of genetic distances and differentiation among sheep breeds is needed in crossing and conservation programs. This research aims to study the use of behaviour characteristic variables to differentiate and estimate genetic distance between sheep breeds. The study was conducted at the Cilebut and Bogor Animal Houses of the Indonesian Research Institute for Animal Production. The five sheep breeds used were Barbados Black Belly Cross (BC), Garut Composite (KG), Garut Local (LG), Sumatera Composite (KS) and St. Croix Cross (SC), with a total sample of 50 head. A total of 10 behaviour trait variables were observed in this study. Analyses of variance and significance tests were applied to compare the sheep breeds for all behaviour traits using PROC GLM of the SAS Program ver. 9.0. PROC CANDISC was used for canonical discriminant analyses, hierarchical clustering was performed using PROC CLUSTER with the Average Linkage method (Unweighted Pair-Group Method Using Arithmetic Averages, UPGMA), and the dendrogram for the five sheep breeds was drawn using PROC TREE. The differentiating variables for the behaviour traits were standing and feeding duration. The canonical plot based on behavioural characteristics could differentiate BC, KS and LG (with KG and SC) sheep. Estimation of genetic distance based on behaviour traits is less accurate for grouping of sheep breeds.

  10. Surface evaluation by estimation of fractal dimension and statistical tools.

    Science.gov (United States)

    Hotar, Vlastimil; Salac, Petr

    2014-01-01

    Structured and complex data can be found in many applications in research and development, and also in industrial practice. We developed a methodology for describing the complexity of structured data and applied it in development and industrial practice. The methodology uses fractal dimension together with statistical tools and, with software modification, is able to analyse data in the form of sequences (signals, surface roughness), 2D images, and dividing lines. The methodology had not previously been tested on a relatively large collection of data. For this reason, samples with structured surfaces produced with different technologies and properties were measured and evaluated with many types of parameters. The paper analyses data measured by a surface roughness tester. The methodology shown compares standard and nonstandard parameters, searches for the optimal parameters for a complete analysis, and specifies the sensitivity to directionality of samples for these types of surfaces. The text presents an application of fractal geometry (fractal dimension) for complex surface analysis in combination with standard roughness parameters (statistical tools).

  11. Estimating SIT-driven population reduction in the Mediterranean fruit fly, Ceratitis capitata, from sterile mating.

    Science.gov (United States)

    Juan-Blasco, M; Sabater-Muñoz, B; Pla, I; Argilés, R; Castañera, P; Jacas, J A; Ibáñez-Gual, M V; Urbaneja, A

    2014-04-01

    Area-wide sterile insect technique (SIT) programs assume that offspring reduction of the target population correlates with the mating success of the sterile males released. However, there is a lack of monitoring tools to prove the success of these programs in real-time. Field-cage tests were conducted under the environmental conditions of the Mediterranean coast of Spain to estimate: (a) the mating success of sterile Vienna-8 (V8) Ceratitis capitata males using molecular markers and (b) their efficacy to reduce C. capitata populations under six release ratios of wild females to wild males to V8 males (1:0:0, 1:1:0, 1:1:1, 1:1:5, 1:1:10, and 1:1:20). Statistical models were developed to predict: (a) the number of females captured in traps, (b) sperm ID (sterile or not) in spermathecae of the trapped females, and (c) the viable offspring produced, using release ratio and temperature as predictors. The number of females captured was affected by relative humidity. However, its influence in the model was low. Female captures were significantly higher in ratios 1:0:0 compared to ratios where V8 males were released. The proportion of V8 sperm in spermathecae increased with temperature and with the number of V8 males released, but leveled off between ratios 1:1:10 and 1:1:20. In all seasons, except winter (no offspring), viable offspring increased with temperature and was lowest for ratio 1:1:20. For the first time, a strong negative relationship between proportion of V8 sperm detected by molecular tools and C. capitata offspring was established. The models obtained should contribute to enhance the efficacy of SIT programs against this pest.

  12. Artificial Neural Network-Based Clutter Reduction Systems for Ship Size Estimation in Maritime Radars

    Directory of Open Access Journals (Sweden)

    Vicen-Bueno R

    2010-01-01

    Full Text Available The existence of clutter in maritime radars deteriorates the estimation of some physical parameters of the objects detected over the sea surface. For that reason, maritime radars should incorporate efficient clutter reduction techniques. Due to the intrinsically nonlinear dynamics of sea clutter, nonlinear signal processing is needed, which can be achieved by artificial neural networks (ANNs). In this paper, an estimation of the ship size using an ANN-based clutter reduction system followed by a fixed threshold is proposed. High clutter reduction rates are achieved using 1-dimensional (horizontal or vertical) integration modes, although inaccurate ship width estimations are obtained. These estimations are improved using a 2-dimensional (rhombus) integration mode. The proposed system is compared with a CA-CFAR system, showing a great performance improvement and a great robustness against changes in sea clutter conditions and ship parameters, independently of the direction of movement of the ocean waves and ships.

  13. Twitter as a Potential Disaster Risk Reduction Tool. Part I: Introduction, Terminology, Research and Operational Applications.

    Science.gov (United States)

    Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M; Subbarao, Italo

    2015-06-29

    Twitter, a popular communications platform, is identified as contributing to improved mortality and morbidity outcomes resulting from the 2013 Hattiesburg, Mississippi EF-4 Tornado. This study describes the methodology by which Twitter was investigated as a potential disaster risk reduction and management tool at the community level and the process by which the at-risk population was identified from the broader Twitter user population. By understanding how various factors contribute to the superspreading of messages, one can better optimize Twitter as an essential communications and risk reduction tool. This study introduces Parts II, III and IV which further define the technological and scientific knowledge base necessary for developing future competency base curriculum and content for Twitter assisted disaster management education and training at the community level.

  14. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    Science.gov (United States)

    1989-09-01

    AN EVALUATION OF THE AUTOMATED COST ESTIMATING INTEGRATED TOOLS (ACEIT) SYSTEM. Thesis, Caroline L. Hanson, Major, USAF. AFIT/GCA/LSQ/89S-5.

  15. Raman spectroscopy as a tool for reagent free estimation

    CERN Document Server

    Kumar, S

    2014-01-01

    We present results of Raman spectroscopic studies of urine to determine the suitability of near-infrared Raman spectroscopy for quantitative estimation of urinary urea. The Raman spectra were acquired from the urine samples with an inbuilt Raman spectroscopy setup that employs a 785-nm diode laser as the Raman excitation source. A multivariate algorithm based on partial least squares (PLS) regression was developed to predict the concentration of urea from the measured sets of Raman spectra and the reference urea concentrations. The computed results show that Raman spectroscopy, in combination with a PLS-based multivariate chemometric algorithm, can detect urea in urine samples with an accuracy of >90%.
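
    The PLS calibration workflow can be sketched with scikit-learn; the synthetic spectra, urea range, and number of latent variables below are assumptions for illustration and do not reproduce the authors' data or algorithm.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Hypothetical training set: rows are baseline-corrected Raman spectra of urine
    # samples; urea holds the reference concentrations (arbitrary units).
    rng = np.random.default_rng(0)
    n_samples, n_points = 60, 500
    urea = rng.uniform(50.0, 400.0, n_samples)
    peak = np.exp(-0.5 * ((np.arange(n_points) - 250) / 8.0) ** 2)  # urea band shape
    spectra = np.outer(urea, peak) + rng.normal(0.0, 5.0, (n_samples, n_points))

    # Cross-validated prediction of urea concentration from the spectra.
    pls = PLSRegression(n_components=5)
    predicted = cross_val_predict(pls, spectra, urea, cv=10).ravel()
    rmsecv = np.sqrt(np.mean((predicted - urea) ** 2))
    print(f"RMSECV: {rmsecv:.1f}")
    ```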

  16. ADVANCEMENT OF NUCLEIC ACID-BASED TOOLS FOR MONITORING IN SITU REDUCTIVE DECHLORINATION

    Energy Technology Data Exchange (ETDEWEB)

    Vangelas, K; ELIZABETH EDWARDS, E; FRANK LOFFLER, F; Brian02 Looney, B

    2006-11-17

    Regulatory protocols generally recognize that destructive processes are the most effective mechanisms that support natural attenuation of chlorinated solvents. In many cases, these destructive processes will be biological processes and, for chlorinated compounds, will often be reductive processes that occur under anaerobic conditions. The existing EPA guidance (EPA, 1998) provides a list of parameters that provide indirect evidence of reductive dechlorination processes. In an effort to gather direct evidence of these processes, scientists have identified key microorganisms and are currently developing tools to measure the abundance and activity of these organisms in subsurface systems. Drs. Edwards and Loeffler are two recognized leaders in this field. The research described herein continues their development efforts to provide a suite of tools to enable direct measures of biological processes related to the reductive dechlorination of TCE and PCE. This study investigated the strengths and weaknesses of the 16S rRNA gene-based approach to characterizing the natural attenuation capabilities in samples. The results suggested that an approach based solely on 16S rRNA may not provide sufficient information to document the natural attenuation capabilities in a system because it does not distinguish between strains of organisms that have different biodegradation capabilities. The results of the investigations provided evidence that tools focusing on relevant enzymes for functionally desired characteristics may be useful adjuncts to the 16S rRNA methods.

  17. Estimation of Risk Factors - Useful Tools in Assessing Calves Welfare

    Directory of Open Access Journals (Sweden)

    Ioana Andronie

    2013-05-01

    Full Text Available The study was aimed at identifying risk factors that may be used in the welfare assessment of calves reared in intensive farming systems. These factors may be useful to farmers in planning breeding measures to increase animal welfare levels in relation to legislative requirements. The estimation considered the housing conditions of calves aged 0-6 months grouped in two lots, A (n = 50) and B (n = 60), depending on their accommodation system. We monitored the calves' decubitus on the housing surface, body hygiene as well as that of the resting area, and thermal comfort. The assessment was made by direct observation and numerical estimation, based on the Welfare Quality® 2009 protocol (Assessment protocol for cattle), as well as by means of a calf safety and welfare evaluation chart according to European and national legislation on minimum calf safety and protection standards. The data collected and processed have shown that not all housing conditions completely meet calves' physiological requirements. Thus, the appropriate housing criterion in the present study was met by lot B at 85% and to a much smaller degree by lot A (76%). The assessment carried out by means of the safety chart indicated that only the minimum criteria for calf rearing were met, which does not translate into a high level of welfare.

  18. The Influence of Tool Texture on Friction and Lubrication in Strip Reduction Testing

    Directory of Open Access Journals (Sweden)

    Mohd Hafis Sulaiman

    2017-01-01

    Full Text Available While texturing of workpiece surfaces to promote lubrication in metal forming has been applied for several decades, tool surface texturing is rather new. In the present paper, tool texturing is studied as a method to prevent galling. A strip reduction test was conducted with tools provided with shallow, longitudinal pockets oriented perpendicular to the sliding direction. The pockets had small angles to the workpiece surface and the distance between them was varied. The experiments reveal that the distance between pockets should be larger than the pocket width, thereby creating a topography similar to flat table mountains to avoid mechanical interlocking in the valleys; otherwise, an increase in drawing load and pick-up on the tools is observed. The textured tool surface lowers friction and improves lubrication performance, provided that the distance between pockets is 2–4 times larger than the pocket width. Larger drawing speed facilitates escape of the entrapped lubricant from the pockets. Testing with low-to-medium viscosity oils leads to a low sheet roughness on the plateaus, but also to local workpiece material pick-up on the tool plateaus. Large lubricant viscosity results in higher sheet plateau roughness, but also prevents pick-up and galling.

  19. Estimating thermodynamic expectations and free energies in expanded ensemble simulations: Systematic variance reduction through conditioning

    Science.gov (United States)

    Athènes, Manuel; Terrier, Pierre

    2017-05-01

    Markov chain Monte Carlo methods are primarily used for sampling from a given probability distribution and estimating multi-dimensional integrals based on the information contained in the generated samples. Whenever it is possible, more accurate estimates are obtained by combining Monte Carlo integration and integration by numerical quadrature along particular coordinates. We show that this variance reduction technique, referred to as conditioning in probability theory, can be advantageously implemented in expanded ensemble simulations. These simulations aim at estimating thermodynamic expectations as a function of an external parameter that is sampled like an additional coordinate. Conditioning therein entails integrating along the external coordinate by numerical quadrature. We prove variance reduction with respect to alternative standard estimators and demonstrate the practical efficiency of the technique by estimating free energies and characterizing a structural phase transition between two solid phases.
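
    A toy example of the conditioning idea outside the expanded-ensemble context: the expectation of a function of two independent uniform variables is estimated either by crude Monte Carlo or by integrating one coordinate with Gauss-Legendre quadrature for each sample of the other; by the law of total variance the conditioned estimator cannot have larger variance.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def f(x, y):
        return np.cos(x + y)

    n = 10_000
    x = rng.uniform(0.0, 1.0, n)
    y = rng.uniform(0.0, 1.0, n)

    # Crude Monte Carlo: average f over joint samples.
    crude = f(x, y)

    # Conditioned estimator: for each sampled x, integrate over y by quadrature
    # (16-point Gauss-Legendre mapped to [0, 1]) instead of sampling it.
    nodes, weights = np.polynomial.legendre.leggauss(16)
    y_nodes = 0.5 * (nodes + 1.0)
    w = 0.5 * weights
    conditioned = (f(x[:, None], y_nodes[None, :]) * w).sum(axis=1)

    print(crude.mean(), crude.var(ddof=1))
    print(conditioned.mean(), conditioned.var(ddof=1))  # same mean, smaller variance
    ```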

  20. Remote sensing as a tool for estimating soil erosion potential

    Science.gov (United States)

    Morris-Jones, D. R.; Morgan, K. M.; Kiefer, R. W.

    1979-01-01

    The Universal Soil Loss Equation is a frequently used methodology for estimating soil erosion potential. The Universal Soil Loss Equation requires a variety of types of geographic information (e.g. topographic slope, soil erodibility, land use, crop type, and soil conservation practice) in order to function. This information is traditionally gathered from topographic maps, soil surveys, field surveys, and interviews with farmers. Remote sensing data sources and interpretation techniques provide an alternative method for collecting information regarding land use, crop type, and soil conservation practice. Airphoto interpretation techniques and medium altitude, multi-date color and color infrared positive transparencies (70mm) were utilized in this study to determine their effectiveness for gathering the desired land use/land cover data. Successful results were obtained within the test site, a 6136 hectare watershed in Dane County, Wisconsin.
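
    For reference, the Universal Soil Loss Equation multiplies five factors, A = R * K * LS * C * P. The Python sketch below uses illustrative factor values, not data from the Dane County watershed.

    ```python
    def usle_soil_loss(R, K, LS, C, P):
        """Universal Soil Loss Equation: A = R * K * LS * C * P.

        A  : average annual soil loss
        R  : rainfall erosivity       K : soil erodibility
        LS : slope length/steepness   C : cover management   P : support practice
        """
        return R * K * LS * C * P

    # Illustrative values only: a row-cropped field on a moderate slope.
    print(usle_soil_loss(R=1200, K=0.03, LS=1.2, C=0.25, P=0.6))  # ~6.5 t/ha/yr
    ```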

  1. Mapping grey matter reductions in schizophrenia: an anatomical likelihood estimation analysis of voxel-based morphometry studies.

    Science.gov (United States)

    Fornito, A; Yücel, M; Patti, J; Wood, S J; Pantelis, C

    2009-03-01

    Voxel-based morphometry (VBM) is a popular tool for mapping neuroanatomical changes in schizophrenia patients. Several recent meta-analyses have identified the brain regions in which patients most consistently show grey matter reductions, although they have not examined whether such changes reflect differences in grey matter concentration (GMC) or grey matter volume (GMV). These measures assess different aspects of grey matter integrity, and may therefore reflect different pathological processes. In this study, we used the Anatomical Likelihood Estimation procedure to analyse significant differences reported in 37 VBM studies of schizophrenia patients, incorporating data from 1646 patients and 1690 controls, and compared the findings of studies using either GMC or GMV to index grey matter differences. Analysis of all studies combined indicated that grey matter reductions in a network of frontal, temporal, thalamic and striatal regions are among the most frequently reported in the literature. GMC reductions were generally larger and more consistent than GMV reductions, and were more frequent in the insula, medial prefrontal, medial temporal and striatal regions. GMV reductions were more frequent in dorso-medial frontal cortex, and lateral and orbital frontal areas. These findings support the primacy of frontal, limbic, and subcortical dysfunction in the pathophysiology of schizophrenia, and suggest that the grey matter changes observed with MRI may not necessarily result from a unitary pathological process.

  2. Mixed Lp Estimators Variety for Model Order Reduction in Control Oriented System Identification

    Directory of Open Access Journals (Sweden)

    Christophe Corbier

    2015-01-01

    Full Text Available A new family of MLE type Lp estimators for model order reduction in dynamical systems identification is presented in this paper. A family of Lp distributions proposed in this work combines Lp2 (1 < p2 < 2) distributions in the estimation criterion and reduces the estimated model complexity. Convergence and consistency properties of the estimator are analysed and the model order reduction is established. Experimental results are presented and discussed for a real, complex vibrating dynamical system, and pseudo-linear models are considered.

  3. Estimation of breast dose reduction potential for organ-based tube current modulated CT with wide dose reduction arc

    Science.gov (United States)

    Fu, Wanyi; Sturgeon, Gregory M.; Agasthya, Greeshma; Segars, W. Paul; Kapadia, Anuj J.; Samei, Ehsan

    2017-03-01

    This study aimed to estimate the organ dose reduction potential for organ-dose-based tube current modulated (ODM) thoracic CT with a wide dose reduction arc. Twenty-one computational anthropomorphic phantoms (XCAT, age range: 27-75 years, weight range: 52.0-105.8 kg) were used to create a virtual patient population with clinical anatomic variations. For each phantom, two breast tissue compositions were simulated: 50/50 and 20/80 (glandular-to-adipose ratio). A validated Monte Carlo program was used to estimate the organ dose for standard tube current modulation (TCM) (SmartmA, GE Healthcare) and ODM (GE Healthcare) for a commercial CT scanner (Revolution, GE Healthcare) with explicitly modeled tube current modulation profile, scanner geometry, bowtie filtration, and source spectrum. Organ dose was determined using a typical clinical thoracic CT protocol. Both organ dose and CTDIvol-to-organ dose conversion coefficients (h factors) were compared between TCM and ODM. ODM significantly reduced all radiosensitive organ doses (p < 0.05), while some other organs saw an increase or no significant change. The organ-dose-based tube current modulation significantly reduced organ doses, especially for radiosensitive superficial anterior organs such as the breasts.

  4. Effect of tool wear per discharge estimation error on the depth of machined cavities in micro-EDM milling

    DEFF Research Database (Denmark)

    Puthumana, Govindan; Bissacco, Giuliano; Hansen, Hans Nørgaard

    2017-01-01

    In micro-EDM milling, real time electrode wear compensation based on tool wear per discharge (TWD) estimation permits the direct control of the position of the tool electrode frontal surface. However, TWD estimation errors will cause errors in the tool electrode axial depth. A simulation tool is ...

  5. Proceedings from the workshop on estimating the contributions of sodium reduction to preventable death

    NARCIS (Netherlands)

    Schmidt, M.K.; Andrews, T.; Bibbins-Domingo, K.; Burt, V.; Cook, N.R.; Ezzati, M.; Geleijnse, J.M.; Homer, J.; Joffres, M.; Keenan, N.L.; Labarthe, D.R.; Law, M.; Loria, C.M.; Orenstein, D.; Schooley, M.W.; Sukumar, S.; Hong, Y.

    2011-01-01

    The primary goal of this workshop was to identify the most appropriate method to estimate the potential effect of reduction in sodium consumption on mortality. Difficulty controlling hypertension at the individual level has motivated international, federal, state, and local efforts to identify and i

  6. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    Energy Technology Data Exchange (ETDEWEB)

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise was increased only by about 3 to 30% depending on target and attacker skill level.
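
    As a rough, editorial illustration of the compromise-graph idea described above, the Python sketch below treats attack stages as nodes in a directed graph, edge weights as expected time-to-compromise, and uses the shortest expected path from entry point to target as the quantity that remedial actions should increase. The stage names, day values and the "hardened" numbers are hypothetical and are not taken from the report.

```python
import heapq

def shortest_time_to_compromise(graph, source, target):
    """Dijkstra over a compromise graph.

    graph: {node: {neighbor: expected_days_to_compromise, ...}, ...}
    Returns the minimum expected time (in days) for an attacker to move
    from `source` to `target`, or float('inf') if no path exists.
    """
    best = {source: 0.0}
    queue = [(0.0, source)]
    visited = set()
    while queue:
        t, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        if node == target:
            return t
        for nxt, dt in graph.get(node, {}).items():
            cand = t + dt
            if cand < best.get(nxt, float('inf')):
                best[nxt] = cand
                heapq.heappush(queue, (cand, nxt))
    return float('inf')

# Hypothetical stages and expected times-to-compromise (days) for one skill level.
baseline = {
    'internet': {'dmz_server': 5.0},
    'dmz_server': {'control_lan': 10.0},
    'control_lan': {'plc': 3.0},
}
# The same graph after remedial actions (patching raises expected compromise times).
hardened = {
    'internet': {'dmz_server': 12.0},
    'dmz_server': {'control_lan': 14.0},
    'control_lan': {'plc': 3.0},
}

t0 = shortest_time_to_compromise(baseline, 'internet', 'plc')
t1 = shortest_time_to_compromise(hardened, 'internet', 'plc')
print(f"time-to-compromise: {t0:.1f} -> {t1:.1f} days ({100 * (t1 - t0) / t0:.0f}% increase)")
```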

  7. The effect of retrospective sampling on estimates of prediction error for multifactor dimensionality reduction.

    Science.gov (United States)

    Winham, Stacey J; Motsinger-Reif, Alison A

    2011-01-01

    The standard in genetic association studies of complex diseases is replication and validation of positive results, with an emphasis on assessing the predictive value of associations. In response to this need, a number of analytical approaches have been developed to identify predictive models that account for complex genetic etiologies. Multifactor Dimensionality Reduction (MDR) is a commonly used, highly successful method designed to evaluate potential gene-gene interactions. MDR relies on classification error in a cross-validation framework to rank and evaluate potentially predictive models. Previous work has demonstrated the high power of MDR, but has not considered the accuracy and variance of the MDR prediction error estimate. Currently, we evaluate the bias and variance of the MDR error estimate as both a retrospective and prospective estimator and show that MDR can both underestimate and overestimate error. We argue that a prospective error estimate is necessary if MDR models are used for prediction, and propose a bootstrap resampling estimate, integrating population prevalence, to accurately estimate prospective error. We demonstrate that this bootstrap estimate is preferable for prediction to the error estimate currently produced by MDR. While demonstrated with MDR, the proposed estimation is applicable to all data-mining methods that use similar estimates.
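
    The following sketch illustrates, under stated assumptions rather than as the authors' exact procedure, how a prospective, prevalence-weighted error can be bootstrapped from retrospective case-control predictions: the raw classification error is replaced by prevalence * FNR + (1 - prevalence) * FPR and resampled to obtain an interval. The data, classifier accuracy and prevalence are invented for illustration.

```python
import numpy as np

def prospective_error(y_true, y_pred, prevalence):
    """Prevalence-weighted misclassification error.

    Case-control (retrospective) samples over-represent cases, so the raw
    error is re-weighted: error = prevalence*FNR + (1 - prevalence)*FPR.
    """
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    cases, controls = y_true == 1, y_true == 0
    fnr = np.mean(y_pred[cases] == 0) if cases.any() else 0.0
    fpr = np.mean(y_pred[controls] == 1) if controls.any() else 0.0
    return prevalence * fnr + (1.0 - prevalence) * fpr

def bootstrap_prospective_error(y_true, y_pred, prevalence, n_boot=1000, seed=0):
    """Bootstrap distribution of the prospective error estimate."""
    rng = np.random.default_rng(seed)
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    n = len(y_true)
    errs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)          # resample with replacement
        errs.append(prospective_error(y_true[idx], y_pred[idx], prevalence))
    errs = np.array(errs)
    return errs.mean(), np.percentile(errs, [2.5, 97.5])

# Toy balanced case-control sample with an imperfect classifier (80% accurate).
y_true = np.array([1] * 100 + [0] * 100)
rng = np.random.default_rng(1)
y_pred = np.where(rng.random(200) < 0.8, y_true, 1 - y_true)
mean_err, ci = bootstrap_prospective_error(y_true, y_pred, prevalence=0.05)
print(f"prospective error ~ {mean_err:.3f}, 95% CI {ci}")
```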

  8. Missing texture reconstruction method based on error reduction algorithm using Fourier transform magnitude estimation scheme.

    Science.gov (United States)

    Ogawa, Takahiro; Haseyama, Miki

    2013-03-01

    A missing texture reconstruction method based on an error reduction (ER) algorithm, including a novel estimation scheme of Fourier transform magnitudes is presented in this brief. In our method, Fourier transform magnitude is estimated for a target patch including missing areas, and the missing intensities are estimated by retrieving its phase based on the ER algorithm. Specifically, by monitoring errors converged in the ER algorithm, known patches whose Fourier transform magnitudes are similar to that of the target patch are selected from the target image. In the second approach, the Fourier transform magnitude of the target patch is estimated from those of the selected known patches and their corresponding errors. Consequently, by using the ER algorithm, we can estimate both the Fourier transform magnitudes and phases to reconstruct the missing areas.
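
    A minimal numerical sketch of the error-reduction idea is given below: the patch estimate alternates between imposing an assumed Fourier-transform magnitude and re-imposing the known pixels. For simplicity the sketch uses the oracle magnitude of the original patch rather than a magnitude estimated from similar known patches, so it only illustrates the ER iteration itself, not the full method of the paper.

```python
import numpy as np

def error_reduction_inpaint(patch, known_mask, target_magnitude, n_iter=200):
    """Error-reduction (ER) style reconstruction of missing pixels.

    patch            : 2-D array with arbitrary values in the missing region
    known_mask       : boolean array, True where intensities are known
    target_magnitude : assumed Fourier-transform magnitude of the full patch
    Alternates between (i) imposing the target magnitude in the Fourier
    domain and (ii) restoring the known intensities in the image domain.
    """
    x = patch.astype(float).copy()
    for _ in range(n_iter):
        spectrum = np.fft.fft2(x)
        phase = np.angle(spectrum)
        # (i) keep the estimated phase, impose the target magnitude
        x = np.real(np.fft.ifft2(target_magnitude * np.exp(1j * phase)))
        # (ii) re-impose the known pixels
        x[known_mask] = patch[known_mask]
    return x

# Toy example: hide a block of a smooth texture and reconstruct it using the
# (here, oracle) Fourier magnitude of the original patch.
yy, xx = np.mgrid[0:32, 0:32]
original = np.sin(xx / 3.0) + np.cos(yy / 5.0)
mask = np.ones_like(original, dtype=bool)
mask[12:20, 12:20] = False                      # missing area
corrupted = np.where(mask, original, 0.0)
recon = error_reduction_inpaint(corrupted, mask, np.abs(np.fft.fft2(original)))
print("RMSE in missing area:", np.sqrt(np.mean((recon[~mask] - original[~mask]) ** 2)))
```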

  9. Estimation of Thermal Contact Conductance between Blank and Tool Surface in Hot Stamping Process

    Science.gov (United States)

    Taha, Zahari; Hanafiah Shaharudin, M. A.

    2016-02-01

    In hot stamping, the determination of the thermal contact conductance values between the blank and tool surface during the process is crucial for simulating the blank's rapid cooling inside the tool using finite element analysis (FEA). The thermal contact conductance value represents the coefficient of heat transfer at the interface of two solid bodies in contact and is known to be influenced greatly by the applied pressure. In order to estimate the value and its dependency on applied pressure, the hot stamping process was replicated and simplified into the compression of a heated flat blank between the tool surfaces at different applied pressures. The temperatures of the blank and tool surface were measured by means of thermocouples installed inside the tool. Based on the measured temperatures, the thermal contact conductance between the surfaces was calculated using Newton's law of cooling. The calculated value was then used to simulate the blank cooling inside the tool using commercial FEA software. This paper describes an experimental approach to estimate the thermal contact conductance between a blank made of Boron Steel (USIBOR 1500) and a tool made of Tool Steel (STAVAX). Its dependency on applied pressure is also studied and the experimental results were then compared with FEA simulations.
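
    As an illustration of how a conductance value can be backed out of two temperature readings with a lumped Newton-cooling model, consider the hedged sketch below; the blank mass, specific heat, contact area and temperatures are hypothetical, and the real experiment described above is considerably more involved.

```python
import math

def contact_conductance(T0, T1, T_tool, dt, mass, cp, area):
    """Estimate thermal contact conductance h [W/(m^2 K)] from a lumped
    Newton-cooling model of the blank:
        T(t) = T_tool + (T0 - T_tool) * exp(-h*A*t / (m*cp))
    T0, T1 : blank temperature at the start and end of the interval [degC]
    T_tool : (assumed constant) tool surface temperature [degC]
    dt     : time between the two readings [s]
    mass   : blank mass [kg]; cp: specific heat [J/(kg K)]; area: contact area [m^2]
    """
    ratio = (T1 - T_tool) / (T0 - T_tool)
    return -(mass * cp) / (area * dt) * math.log(ratio)

# Hypothetical hot-stamping numbers (boron steel blank cooling on the tool).
h = contact_conductance(T0=800.0, T1=650.0, T_tool=80.0, dt=2.0,
                        mass=0.35, cp=650.0, area=0.02)
print(f"estimated thermal contact conductance: {h:.0f} W/(m^2 K)")
```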

  10. Adaptive speckle reduction of ultrasound images based on maximum likelihood estimation

    Institute of Scientific and Technical Information of China (English)

    Xu Liu(刘旭); Yongfeng Huang(黄永锋); Wende Shou(寿文德); Tao Ying(应涛)

    2004-01-01

    A method has been developed in this paper to gain effective speckle reduction in medical ultrasound images. To exploit full knowledge of the speckle distribution, maximum likelihood estimation was used to determine the speckle parameters corresponding to its statistical mode. The results were then incorporated into nonlinear anisotropic diffusion to achieve adaptive speckle reduction. Verified with simulated and real ultrasound images, we show that this algorithm is capable of enhancing features of clinical interest and of reducing speckle noise more efficiently than classical filters. To avoid edge contribution, changes in the contrast-to-noise ratio of different regions are also compared to investigate the performance of this approach.

  11. Parameter estimation of cutting tool temperature nonlinear model using PSO algorithm

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    In cutting tool temperature experiments, a large amount of related data becomes available. In order to define the relationship among the experimental data, a nonlinear regression curve of cutting tool temperature must be constructed from the data. This paper proposes the Particle Swarm Optimization (PSO) algorithm for estimating the parameters of such a curve. The PSO algorithm is an evolutionary method based on a very simple concept. Comparison of PSO results with those of GA and LS methods showed that the PSO algorithm is more effective for estimating the parameters of the above curve.
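
    A minimal sketch of PSO-based parameter estimation is shown below. The exponential temperature-rise model, its coefficients and the PSO constants are illustrative assumptions, not the model used in the paper; the point is only how a swarm minimises a sum-of-squared-errors objective.

```python
import numpy as np

def pso_fit(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimiser; bounds is a list of (low, high) per parameter."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    dim = len(bounds)
    pos = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, bounds[:, 0], bounds[:, 1])
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest

# Illustrative nonlinear model: temperature rise T = a*(1 - exp(-b*t)) + c
t = np.linspace(0, 60, 30)
true_params = (350.0, 0.08, 25.0)
T_obs = (true_params[0] * (1 - np.exp(-true_params[1] * t)) + true_params[2]
         + np.random.default_rng(1).normal(0, 3, t.size))

def sse(params):
    a, b, c = params
    return np.sum((a * (1 - np.exp(-b * t)) + c - T_obs) ** 2)

print("estimated (a, b, c):", pso_fit(sse, [(0, 1000), (0, 1), (0, 100)]))
```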

  12. Using a relative health indicator (RHI) metric to estimate health risk reductions in drinking water.

    Science.gov (United States)

    Alfredo, Katherine A; Seidel, Chad; Ghosh, Amlan; Roberson, J Alan

    2017-03-01

    When a new drinking water regulation is being developed, the USEPA conducts a health risk reduction and cost analysis to, in part, estimate quantifiable and non-quantifiable cost and benefits of the various regulatory alternatives. Numerous methodologies are available for cumulative risk assessment ranging from primarily qualitative to primarily quantitative. This research developed a summary metric of relative cumulative health impacts resulting from drinking water, the relative health indicator (RHI). An intermediate level of quantification and modeling was chosen, one which retains the concept of an aggregated metric of public health impact and hence allows for comparisons to be made across "cups of water," but avoids the need for development and use of complex models that are beyond the existing state of the science. Using the USEPA Six-Year Review data and available national occurrence surveys of drinking water contaminants, the metric is used to test risk reduction as it pertains to the implementation of the arsenic and uranium maximum contaminant levels and quantify "meaningful" risk reduction. Uranium represented the threshold risk reduction against which national non-compliance risk reduction was compared for arsenic, nitrate, and radium. Arsenic non-compliance is most significant and efforts focused on bringing those non-compliant utilities into compliance with the 10 μg/L maximum contaminant level would meet the threshold for meaningful risk reduction.

  13. SBML-PET-MPI: a parallel parameter estimation tool for Systems Biology Markup Language based models.

    Science.gov (United States)

    Zi, Zhike

    2011-04-01

    Parameter estimation is crucial for the modeling and dynamic analysis of biological systems. However, implementing parameter estimation is time consuming and computationally demanding. Here, we introduced a parallel parameter estimation tool for Systems Biology Markup Language (SBML)-based models (SBML-PET-MPI). SBML-PET-MPI allows the user to perform parameter estimation and parameter uncertainty analysis by collectively fitting multiple experimental datasets. The tool is developed and parallelized using the message passing interface (MPI) protocol, which provides good scalability with the number of processors. SBML-PET-MPI is freely available for non-commercial use at http://www.bioss.uni-freiburg.de/cms/sbml-pet-mpi.html or http://sites.google.com/site/sbmlpetmpi/.

  14. PAPR Reduction Approach Based on Channel Estimation Pilots for Next Generations Broadcasting Systems

    Directory of Open Access Journals (Sweden)

    Anh-Tai Ho

    2011-01-01

    Full Text Available A novel peak-to-average power ratio (PAPR reduction technique for orthogonal frequency division multiplexing (OFDM systems is addressed. Instead of using dedicated pilots for PAPR reduction as with tone reservation (TR method selected by the DVB-T2 standard, we propose to use existing pilots used for channel estimation. In this way, we avoid the use of reserved tone pilots and then improve the spectral efficiency of the system. In order to allow their recovery at the receiver, these pilots have to follow particular laws which permit their blind detection and avoid sending side information. In this work, we propose and investigate a multiplicative law operating in discrete frequency domain. The operation in discrete domain aims at reducing degradation due to detection and estimation error in continuous domain. Simulation results are performed using the new DVB-T2 standard parameters. Its performance is compared to the DVB-T2 PAPR gradient algorithm and to the second-order cone programming (SOCP competitive technique proposed in the literature. We show that the proposed technique is efficient in terms of PAPR reduction value and of spectral efficiency while the channel estimation performance is maintained.

  15. A tool for the estimation of the distribution of landslide area in R

    Science.gov (United States)

    Rossi, M.; Cardinali, M.; Fiorucci, F.; Marchesini, I.; Mondini, A. C.; Santangelo, M.; Ghosh, S.; Riguer, D. E. L.; Lahousse, T.; Chang, K. T.; Guzzetti, F.

    2012-04-01

    We have developed a tool in R (the free software environment for statistical computing, http://www.r-project.org/) to estimate the probability density and the frequency density of landslide area. The tool implements parametric and non-parametric approaches to the estimation of the probability density and the frequency density of landslide area, including: (i) Histogram Density Estimation (HDE), (ii) Kernel Density Estimation (KDE), and (iii) Maximum Likelihood Estimation (MLE). The tool is available as a standard Open Geospatial Consortium (OGC) Web Processing Service (WPS), and is accessible through the web using different GIS software clients. We tested the tool to compare Double Pareto and Inverse Gamma models for the probability density of landslide area in different geological, morphological and climatological settings, and to compare landslides shown in inventory maps prepared using different mapping techniques, including (i) field mapping, (ii) visual interpretation of monoscopic and stereoscopic aerial photographs, (iii) visual interpretation of monoscopic and stereoscopic VHR satellite images and (iv) semi-automatic detection and mapping from VHR satellite images. Results show that both models are applicable in different geomorphological settings. In most cases the two models provided very similar results. Non-parametric estimation methods (i.e., HDE and KDE) provided reasonable results for all the tested landslide datasets. For some of the datasets, MLE failed to provide a result, for convergence problems. The two tested models (Double Pareto and Inverse Gamma) resulted in very similar results for large and very large datasets (> 150 samples). Differences in the modeling results were observed for small datasets affected by systematic biases. A distinct rollover was observed in all analyzed landslide datasets, except for a few datasets obtained from landslide inventories prepared through field mapping or by semi-automatic mapping from VHR satellite imagery

  16. Tool to estimate optical metrics from summary wave-front analysis data in the human eye

    NARCIS (Netherlands)

    Jansonius, Nomdo M.

    2013-01-01

    Purpose Studies in the field of cataract and refractive surgery often report only summary wave-front analysis data data that are too condensed to allow for a retrospective calculation of metrics relevant to visual perception. The aim of this study was to develop a tool that can be used to estimate t

  17. Tool for Studying the Effects of Range Restriction in Correlation Coefficient Estimation

    Science.gov (United States)

    1990-07-01

    AFHRL-TP-90-6, Air Force technical paper: Tool for Studying the Effects of Range Restriction in Correlation Coefficient Estimation (Douglas E. Jackson, Eastern New...; Malcolm J...). The report addresses the situation in which one must try to estimate the correlation coefficient between two random variables X and Y in some population P using data taken from a

  18. Estimating the Condition of the Heat Resistant Lining in an Electrical Reduction Furnace

    Directory of Open Access Journals (Sweden)

    Jan G. Waalmann

    1988-01-01

    Full Text Available This paper presents a system for estimating the condition of the heat resistant lining in an electrical reduction furnace for ferrosilicon. The system uses temperatures measured with thermocouples placed on the outside of the furnace-pot. These measurements are used together with a mathematical model of the temperature distribution in the lining in a recursive least squares algorithm to estimate the position of 'the transformation front'. The system is part of a monitoring system which is being developed in the AIP-project: 'Condition monitoring of strongly exposed process equipment in the ferroalloy industry'. The estimator runs on-line, and results are presented in colour graphics on a display unit. The goal is to locate the transformation front with an accuracy of ±5 cm.
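
    The recursive least squares step at the heart of such an estimator can be sketched as follows; the regression model relating thermocouple readings to a front position, and all numerical values, are purely illustrative rather than the furnace model of the paper.

```python
import numpy as np

class RecursiveLeastSquares:
    """Standard exponentially-weighted RLS estimator for y_k = phi_k . theta + e_k."""

    def __init__(self, n_params, forgetting=0.99, p0=1e4):
        self.theta = np.zeros(n_params)          # parameter estimate
        self.P = np.eye(n_params) * p0           # covariance of the estimate
        self.lam = forgetting

    def update(self, phi, y):
        phi = np.asarray(phi, dtype=float)
        denom = self.lam + phi @ self.P @ phi
        K = self.P @ phi / denom                 # gain vector
        self.theta = self.theta + K * (y - phi @ self.theta)
        self.P = (self.P - np.outer(K, phi @ self.P)) / self.lam
        return self.theta

# Toy use: recover a linear relation between two thermocouple readings and a
# (synthetic) front position; coefficients are purely illustrative.
rng = np.random.default_rng(0)
true_theta = np.array([0.04, -0.02, 1.5])
rls = RecursiveLeastSquares(3)
for _ in range(500):
    phi = np.array([rng.uniform(200, 400), rng.uniform(100, 300), 1.0])
    y = phi @ true_theta + rng.normal(0, 0.05)
    est = rls.update(phi, y)
print("estimated parameters:", est)
```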

  19. Development of an online tool for tsunami inundation simulation and tsunami loss estimation

    Science.gov (United States)

    Srivihok, P.; Honda, K.; Ruangrassamee, A.; Muangsin, V.; Naparat, P.; Foytong, P.; Promdumrong, N.; Aphimaeteethomrong, P.; Intavee, A.; Layug, J. E.; Kosin, T.

    2014-05-01

    The devastating impacts of the 2004 Indian Ocean tsunami highlighted the need for an effective end-to-end tsunami early warning system in the region that connects the scientific components of warning with the preparedness of institutions and communities to respond to an emergency. Essential to preparedness planning is knowledge of tsunami risks. In this study, the development of an online tool named “INSPIRE” for tsunami inundation simulation and tsunami loss estimation is presented. The tool is designed to accommodate various accuracy levels of tsunami exposure data, allowing users to undertake a preliminary tsunami risk assessment from existing data and to progressively improve it with more detailed and accurate datasets. A sampling survey technique is introduced to improve the local vulnerability data at lower cost and manpower. The performance of the proposed methodology and the INSPIRE tool was tested against the dataset for the Kamala and Patong municipalities, Phuket province, Thailand. The estimated building type ratios from the sampling survey show satisfactory agreement with the actual building data at the test sites. Sub-area classification by land use can improve the accuracy of the building type ratio estimation. For the resulting loss estimation, the exposure data generated from the detailed field survey provide results in good agreement with the actual building damage recorded for the Indian Ocean tsunami event in 2004. However, lower accuracy exposure data derived from the sampling survey and remote sensing can still provide a comparative overview of the estimated loss.

  20. SLIMMER: SLIce MRI motion estimation and reconstruction tool for studies of fetal anatomy

    Science.gov (United States)

    Kim, Kio; Habas, Piotr A.; Rajagopalan, Vidya; Scott, Julia; Rousseau, Francois; Barkovich, A. James; Glenn, Orit A.; Studholme, Colin

    2011-03-01

    We describe a free software tool which combines a set of algorithms that provide a framework for building 3D volumetric images of regions of moving anatomy using multiple fast multi-slice MRI studies. It is specifically motivated by the clinical application of unsedated fetal brain imaging, which has emerged as an important area for image analysis. The tool reads multiple DICOM image stacks acquired in any angulation into a consistent patient coordinate frame and allows the user to select regions to be locally motion corrected. It combines algorithms for slice motion estimation, bias field inconsistency correction and 3D volume reconstruction from multiple scattered slice stacks. The tool is built onto the RView (http://rview.colin-studholme.net) medical image display software and allows the user to inspect slice stacks, and apply both stack and slice level motion estimation that incorporates temporal constraints based on slice timing and interleave information read from the DICOM data. Following motion estimation an algorithm for bias field inconsistency correction provides the user with the ability to remove artifacts arising from the motion of the local anatomy relative to the imaging coils. Full 3D visualization of the slice stacks and individual slice orientations is provided to assist in evaluating the quality of the motion correction and final image reconstruction. The tool has been evaluated on a range of clinical data acquired on GE, Siemens and Philips MRI scanners.

  1. Estimation of individual cumulative ultraviolet exposure using a geographically-adjusted, openly-accessible tool.

    Science.gov (United States)

    Zhu, Gefei A; Raber, Inbar; Sakshuwong, Sukolsak; Li, Shufeng; Li, Angela S; Tan, Caroline; Chang, Anne Lynn S

    2016-01-20

    Estimates of an individual's cumulative ultraviolet (UV) radiation exposure can be useful since ultraviolet radiation exposure increases skin cancer risk, but a comprehensive tool that is practical for use in the clinic does not currently exist. The objective of this study is to develop a geographically-adjusted tool to systematically estimate an individual's self-reported cumulative UV radiation exposure, investigate the association of these estimates with skin cancer diagnosis, and assess test reliability. A 12-item online questionnaire built from validated survey items for UV exposure and skin cancer was administered to online volunteers across the United States and the results were cross-referenced with UV radiation indices. Cumulative UV exposure scores (CUES) were calculated and correlated with personal history of skin cancer in a case-control design. Reliability was assessed in a separate convenience sample. 1,118 responses were included in the overall sample; the mean age of respondents was 46 (standard deviation 15, range 18-81) and 150 (13%) reported a history of skin cancer. In bivariate analysis of 1:2 age-matched cases (n = 149) and controls (n = 298), skin cancer cases were associated with (1) greater CUES prior to first skin cancer diagnosis than controls without skin cancer history (242,074 vs. 205,379, p = 0.003) and (2) less engagement in UV protective behaviors. In further analysis of the age-matched data, individuals with CUES in the lowest quartile were less likely to develop skin cancer compared to those in the highest quartile. In reliability testing among 19 volunteers, the 2-week intra-class correlation coefficient for CUES was 0.94. We have provided the programming code for this tool as well as the tool itself via open access. CUES is a usable and comprehensive tool to better estimate lifetime ultraviolet exposure, so that individuals with higher levels of exposure may be identified for counseling on photo-protective measures.

  2. Fuel Consumption Reduction and Weight Estimate of an Intercooled-Recuperated Turboprop Engine

    Science.gov (United States)

    Andriani, Roberto; Ghezzi, Umberto; Ingenito, Antonella; Gamma, Fausto

    2012-09-01

    The introduction of intercooling and regeneration in a gas turbine engine can lead to performance improvement and fuel consumption reduction. Moreover, as a direct consequence of the fuel saved, pollutant emissions can also be greatly reduced. The turboprop appears to be the gas turbine engine most suited to being equipped with an intercooler and heat recuperator, thanks to its relatively small mass flow rate and the small fraction of propulsion power delivered through the exhaust nozzle. However, the extra weight and drag due to the heat exchangers must be carefully considered. An intercooled-recuperated turboprop engine is studied by means of a numerical thermodynamic code that, by computing the thermal cycle, simulates the engine behavior at different operating conditions. The main aero-engine performance figures, such as specific power and specific fuel consumption, are then evaluated from the cycle analysis. The fuel saving, the pollution reduction, and the engine weight are then estimated for an example case.

  3. Uni-Vector-Sensor Dimensionality Reduction MUSIC Algorithm for DOA and Polarization Estimation

    Directory of Open Access Journals (Sweden)

    Lanmei Wang

    2014-01-01

    Full Text Available This paper addresses the problem of multiple signal classification (MUSIC) based direction of arrival (DOA) and polarization estimation and proposes a new dimensionality reduction MUSIC (DR-MUSIC) algorithm. The uni-vector-sensor MUSIC algorithm provides estimates of DOA and polarization; accordingly, a four-dimensional peak search is required, which incurs a vast amount of computation. In the proposed DR-MUSIC method, the signal steering vector is expressed as the product of an arrival angle function matrix and a polarization function vector. The MUSIC joint spectrum is converted to the form of a Rayleigh-Ritz ratio by using the property that the 2-norm of the polarization function vector is constant. The four-dimensional MUSIC search is thereby reduced to two two-dimensional searches, and the amount of computation is greatly decreased. The theoretical analysis and simulation results have verified the effectiveness of the proposed algorithm.
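
    For readers unfamiliar with MUSIC itself, the sketch below shows the classical scalar-sensor MUSIC pseudo-spectrum for a uniform linear array; it does not reproduce the vector-sensor, polarization-aware DR-MUSIC of the paper, and the array size, source angles and noise level are arbitrary.

```python
import numpy as np

def music_spectrum(X, n_sources, n_grid=361):
    """1-D MUSIC pseudo-spectrum for a uniform linear array (half-wavelength spacing).

    X : (n_antennas, n_snapshots) complex snapshot matrix.
    """
    n_ant = X.shape[0]
    R = X @ X.conj().T / X.shape[1]                     # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)                # ascending eigenvalues
    En = eigvecs[:, : n_ant - n_sources]                # noise subspace
    angles = np.linspace(-90, 90, n_grid)
    spectrum = np.empty(n_grid)
    for i, th in enumerate(np.deg2rad(angles)):
        a = np.exp(1j * np.pi * np.arange(n_ant) * np.sin(th))   # steering vector
        spectrum[i] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
    return angles, spectrum

# Two sources at -20 and 35 degrees, 8-element array, 200 snapshots.
rng = np.random.default_rng(0)
n_ant, n_snap = 8, 200
doas = np.deg2rad([-20.0, 35.0])
A = np.exp(1j * np.pi * np.outer(np.arange(n_ant), np.sin(doas)))
S = (rng.normal(size=(2, n_snap)) + 1j * rng.normal(size=(2, n_snap))) / np.sqrt(2)
N = 0.1 * (rng.normal(size=(n_ant, n_snap)) + 1j * rng.normal(size=(n_ant, n_snap)))
X = A @ S + N

angles, P = music_spectrum(X, n_sources=2)
peaks = [i for i in range(1, len(P) - 1) if P[i] > P[i - 1] and P[i] > P[i + 1]]
top2 = sorted(sorted(peaks, key=lambda i: P[i])[-2:])
print("estimated DOAs [deg]:", [round(angles[i], 1) for i in top2])
```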

  4. Noise reduction and estimation in multiple micro-electro-mechanical inertial systems

    Science.gov (United States)

    Waegli, Adrian; Skaloud, Jan; Guerrier, Stéphane; Eulàlia Parés, Maria; Colomina, Ismael

    2010-06-01

    This research studies the reduction and the estimation of the noise level within a redundant configuration of low-cost (MEMS-type) inertial measurement units (IMUs). Firstly, independent observations between units and sensors are assumed and the theoretical decrease in the system noise level is analyzed in an experiment with four MEMS-IMU triads. Then, more complex scenarios are presented in which the noise level can vary in time and for each sensor. A statistical method employed for studying the volatility of financial markets (GARCH) is adapted and tested for the usage with inertial data. This paper demonstrates experimentally and through simulations the benefit of direct noise estimation in redundant IMU setups.
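
    The first result, the theoretical decrease of white-noise level with the number of independent sensors, can be reproduced with a few lines of Python; the angular rate, noise level and number of IMUs below are synthetic and only illustrate the expected 1/sqrt(N) behaviour.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples, n_imus = 100_000, 4
true_rate = 0.1                      # deg/s, constant angular rate
sigma = 0.5                          # deg/s, white-noise level of a single MEMS gyro

# Independent noisy measurements from each gyroscope of the redundant setup.
measurements = true_rate + rng.normal(0.0, sigma, size=(n_samples, n_imus))

single = measurements[:, 0]
fused = measurements.mean(axis=1)    # simple synchronous averaging

print(f"single-IMU noise std  : {single.std():.4f} deg/s")
print(f"fused ({n_imus} IMUs) noise std: {fused.std():.4f} deg/s")
print(f"theoretical reduction : 1/sqrt({n_imus}) = {1 / np.sqrt(n_imus):.2f}")
```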

  5. Performance of cumulant-based rank reduction estimator in presence of unexpected modeling errors

    Institute of Scientific and Technical Information of China (English)

    王鼎

    2015-01-01

    Compared with the rank reduction estimator (RARE) based on second-order statistics (called SOS-RARE), the RARE based on fourth-order cumulants (referred to as FOC-RARE) can handle more sources and restrain the negative impacts of the Gaussian colored noise. However, the unexpected modeling errors appearing in practice are known to significantly degrade the performance of the RARE. Therefore, the direction-of-arrival (DOA) estimation performance of the FOC-RARE is quantitatively derived. The explicit expression for direction-finding (DF) error is derived via the first-order perturbation analysis, and then the theoretical formula for the mean square error (MSE) is given. Simulation results demonstrate the validation of the theoretical analysis and reveal that the FOC-RARE is more robust to the unexpected modeling errors than the SOS-RARE.

  6. A tool for computing time-dependent permeability reduction of fractured volcanic conduit margins.

    Science.gov (United States)

    Farquharson, Jamie; Wadsworth, Fabian; Heap, Michael; Baud, Patrick

    2016-04-01

    Laterally-oriented fractures within volcanic conduit margins are thought to play an important role in tempering eruption explosivity by allowing magmatic volatiles to outgas. The permeability of a fractured conduit margin - the equivalent permeability - can be modelled as the sum of permeability contributions of the edifice host rock and the fracture(s) within it. We present here a flexible MATLAB® tool which computes the time-dependent equivalent permeability of a volcanic conduit margin containing ash-filled fractures. The tool is designed so that the end-user can define a wide range of input parameters to yield equivalent permeability estimates for their application. The time-dependence of the equivalent permeability is incorporated by considering permeability decrease as a function of porosity loss in the ash-filled fractures due to viscous sintering (after Russell and Quane, 2005), which is in turn dependent on the depth and temperature of each fracture and the crystal-content of the magma (all user-defined variables). The initial viscosity of the granular material filling the fracture is dependent on the water content (Hess and Dingwell, 1996), which is computed assuming equilibrium depth-dependent water content (Liu et al., 2005). Crystallinity is subsequently accounted for by employing the particle-suspension rheological model of Mueller et al. (2010). The user then defines the number of fractures, their widths, and their depths, and the lengthscale of interest (e.g. the length of the conduit). Using these data, the combined influence of transient fractures on the equivalent permeability of the conduit margin is then calculated by adapting a parallel-plate flow model (developed by Baud et al., 2012 for porous sandstones), for host rock permeabilities from 10-11 to 10-22 m2. The calculated values of porosity and equivalent permeability with time for each host rock permeability is then output in text and worksheet file formats. We introduce two dimensionless
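
    A highly simplified sketch of the equivalent-permeability calculation is given below: fractures are treated as open parallel plates (i.e., before sintering of the ash filling reduces their permeability), and their contribution is width-averaged with the host-rock permeability over a user-defined lengthscale. All numbers are hypothetical, and the time-dependent sintering, depth and crystallinity effects of the actual tool are not included.

```python
import numpy as np

def equivalent_permeability(k_host, fracture_widths, lengthscale):
    """Area-weighted equivalent permeability [m^2] of a conduit margin.

    Assumes flow parallel to plane-parallel fractures: each fracture of
    aperture w contributes a parallel-plate permeability w**2/12 over its own
    width, and the host rock k_host applies over the remaining lengthscale.
    """
    w = np.asarray(fracture_widths, dtype=float)
    k_frac_contrib = np.sum(w ** 3 / 12.0)                # fracture transmissivity
    k_host_contrib = k_host * (lengthscale - w.sum())     # host-rock contribution
    return (k_frac_contrib + k_host_contrib) / lengthscale

# Hypothetical case: 100 m of conduit margin containing two fractures.
for k_host in (1e-15, 1e-18):                             # intact edifice permeabilities
    k_eq = equivalent_permeability(k_host, fracture_widths=[0.01, 0.005], lengthscale=100.0)
    print(f"k_host = {k_host:.0e} m^2  ->  k_eq = {k_eq:.2e} m^2")
```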

  7. Parameter estimation approach to banding artifact reduction in balanced steady-state free precession.

    Science.gov (United States)

    Björk, Marcus; Ingle, R Reeve; Gudmundson, Erik; Stoica, Petre; Nishimura, Dwight G; Barral, Joëlle K

    2014-09-01

    The balanced steady-state free precession (bSSFP) pulse sequence has proven to be of great interest due to its high signal-to-noise ratio efficiency. However, bSSFP images often suffer from banding artifacts due to off-resonance effects, which we aim to minimize in this article. We present a general and fast two-step algorithm for 1) estimating the unknowns in the bSSFP signal model from multiple phase-cycled acquisitions, and 2) reconstructing band-free images. The first step, linearization for off-resonance estimation (LORE), solves the nonlinear problem approximately by a robust linear approach. The second step applies a Gauss-Newton algorithm, initialized by LORE, to minimize the nonlinear least squares criterion. We name the full algorithm LORE-GN. We derive the Cramér-Rao bound, a theoretical lower bound of the variance for any unbiased estimator, and show that LORE-GN is statistically efficient. Furthermore, we show that simultaneous estimation of T1 and T2 from phase-cycled bSSFP is difficult, since the Cramér-Rao bound is high at common signal-to-noise ratios. Using simulated, phantom, and in vivo data, we illustrate the band-reduction capabilities of LORE-GN compared to other techniques, such as sum-of-squares. Using LORE-GN we can successfully minimize banding artifacts in bSSFP. Copyright © 2013 Wiley Periodicals, Inc.

  8. EXPERIMENTAL ESTIMATION OF TOOL WEAR AND CUTTING TEMPERATURES IN MQL USING CUTTING FLUIDS WITH CNT INCLUSION

    Directory of Open Access Journals (Sweden)

    S.NARAYANA RAO,

    2011-04-01

    Full Text Available Machining is often interrupted by the effects of friction and heat. Cutting fluids have been used in machining for centuries to counter the effects of friction and temperature. However, because of their many disadvantages, minimum quantity lubrication (MQL) has emerged as an alternative, and MQL demands fluids with high performance. This work estimates tool wear and cutting temperatures while using cutting fluids prepared with carbon nano tube (CNT) inclusion.

  9. Techniques and Tools for Estimating Ionospheric Effects in Interferometric and Polarimetric SAR Data

    Science.gov (United States)

    Rosen, P.; Lavalle, M.; Pi, X.; Buckley, S.; Szeliga, W.; Zebker, H.; Gurrola, E.

    2011-01-01

    The InSAR Scientific Computing Environment (ISCE) is a flexible, extensible software tool designed for the end-to-end processing and analysis of synthetic aperture radar data. ISCE inherits the core of the ROI_PAC interferometric tool, but contains improvements at all levels of the radar processing chain, including a modular and extensible architecture, new focusing approach, better geocoding of the data, handling of multi-polarization data, radiometric calibration, and estimation and correction of ionospheric effects. In this paper we describe the characteristics of ISCE with emphasis on the ionospheric modules. To detect ionospheric anomalies, ISCE implements the Faraday rotation method using quadpolarimetric images, and the split-spectrum technique using interferometric single-, dual- and quad-polarimetric images. The ability to generate co-registered time series of quad-polarimetric images makes ISCE also an ideal tool to be used for polarimetric-interferometric radar applications.

  11. A Web-Based Tool to Estimate Pollutant Loading Using LOADEST

    Directory of Open Access Journals (Sweden)

    Youn Shik Park

    2015-09-01

    Full Text Available Collecting and analyzing water quality samples is costly and typically requires significant effort compared to streamflow data, thus water quality data are typically collected at a low frequency. Regression models, identifying a relationship between streamflow and water quality data, are often used to estimate pollutant loads. A web-based tool using LOAD ESTimator (LOADEST as a core engine with four modules was developed to provide user-friendly interfaces and input data collection via web access. The first module requests and receives streamflow and water quality data from the U.S. Geological Survey. The second module retrieves watershed area for computation of pollutant loads per unit area. The third module examines potential error of input datasets for LOADEST runs, and the last module computes estimated and allowable annual average pollutant loads and provides tabular and graphical LOADEST outputs. The web-based tool was applied to two watersheds in this study, one agriculturally-dominated and one urban-dominated. It was found that annual sediment load at the urban-dominant watershed exceeded the target load; therefore, the web-based tool identified correctly the watershed requiring best management practices to reduce pollutant loads.
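
    The core of such a tool is a rating-curve regression between load and streamflow. The sketch below fits the simplest LOADEST-style model, ln(L) = a0 + a1 ln(Q), and applies it to a daily flow record; the data are synthetic, and the bias corrections and multi-term models of the real LOADEST are omitted.

```python
import numpy as np

def fit_rating_curve(q_samples, c_samples):
    """Fit the simplest LOADEST-style model ln(L) = a0 + a1*ln(Q).

    q_samples : streamflow at the sampling times [m^3/s]
    c_samples : measured concentration [mg/L]
    Load L is taken as Q*C (unit conversions omitted for clarity).
    """
    load = q_samples * c_samples
    X = np.column_stack([np.ones_like(q_samples), np.log(q_samples)])
    a, *_ = np.linalg.lstsq(X, np.log(load), rcond=None)
    return a  # [a0, a1]

def estimate_loads(a, q_daily):
    """Apply the fitted rating curve to a daily streamflow record."""
    return np.exp(a[0] + a[1] * np.log(q_daily))

# Synthetic example: sparse water-quality samples, dense daily streamflow.
rng = np.random.default_rng(0)
q_daily = np.exp(rng.normal(2.0, 0.6, 365))                # daily flows
sample_idx = rng.choice(365, size=24, replace=False)       # roughly monthly sampling
c_samples = 5.0 * q_daily[sample_idx] ** 0.3 * rng.lognormal(0, 0.1, 24)

a = fit_rating_curve(q_daily[sample_idx], c_samples)
annual_load = estimate_loads(a, q_daily).sum()
print("fitted coefficients:", a, " estimated annual load:", round(annual_load, 1))
```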

  12. Estimating the Fiscal Effects of Public Pharmaceutical Expenditure Reduction in Greece.

    Science.gov (United States)

    Souliotis, Kyriakos; Papageorgiou, Manto; Politi, Anastasia; Frangos, Nikolaos; Tountas, Yiannis

    2015-01-01

    The purpose of the present study is to estimate the impact of pharmaceutical spending reduction on public revenue, based on data from the national health accounts as well as on reports of Greece's organizations. The methodology of the analysis is structured in two basic parts. The first part presents the urgency for rapid cutbacks on public pharmaceutical costs due to the financial crisis and provides a conceptual framework for the contribution of the Greek pharmaceutical branch to the country's economy. In the second part, we perform a quantitative analysis for the estimation of multiplier effects of public pharmaceutical expenditure reduction on main revenue sources, such as taxes and social contributions. We also fit projection models with multipliers as regressands for the evaluation of the efficiency of the particular fiscal measure in the short run. According to the results, nearly half of the gains from the measure's application is offset by financially equivalent decreases in the government's revenue, i.e., losses in tax revenues and social security contributions alone, not considering any other direct or indirect costs. The findings of multipliers' high value and increasing short-term trend imply the measure's inefficiency henceforward and signal the risk of vicious circles that will provoke the economy's deprivation of useful resources.

  13. Estimating the fiscal effects of public pharmaceutical expenditure reduction in Greece

    Directory of Open Access Journals (Sweden)

    Kyriakos eSouliotis

    2015-08-01

    Full Text Available The purpose of the present study is to estimate the impact of pharmaceutical spending reduction on public revenue, based on data from the national health accounts as well as on reports of Greece’s organizations. The methodology of the analysis is structured in two basic parts. The first part presents the urgency for rapid cutbacks on public pharmaceutical costs due to the financial crisis and provides a conceptual framework for the contribution of the Greek pharmaceutical branch to the country’s economy. In the second part, we perform a quantitative analysis for the estimation of multiplier effects of public pharmaceutical expenditure reduction on main revenue sources such as taxes and social contributions. We also fit projection models with multipliers as regressands for the evaluation of the efficiency of the particular fiscal measure in the short run. According to the results, near half of the gains from the measure’s application is offset by financially equivalent decreases in the government’s revenue, i.e. losses in tax revenues and social security contributions alone, not considering any other direct or indirect costs. The findings of multipliers’ high value and increasing short-term trend imply the measure’s inefficiency henceforward and signal the risk of vicious circles that will provoke the economy’s deprivation of useful resources.

  14. PREMIM and EMIM: tools for estimation of maternal, imprinting and interaction effects using multinomial modelling

    Directory of Open Access Journals (Sweden)

    Howey Richard

    2012-06-01

    Full Text Available Abstract Background Here we present two new computer tools, PREMIM and EMIM, for the estimation of parental and child genetic effects, based on genotype data from a variety of different child-parent configurations. PREMIM allows the extraction of child-parent genotype data from standard-format pedigree data files, while EMIM uses the extracted genotype data to perform subsequent statistical analysis. The use of genotype data from the parents as well as from the child in question allows the estimation of complex genetic effects such as maternal genotype effects, maternal-foetal interactions and parent-of-origin (imprinting effects. These effects are estimated by EMIM, incorporating chosen assumptions such as Hardy-Weinberg equilibrium or exchangeability of parental matings as required. Results In application to simulated data, we show that the inference provided by EMIM is essentially equivalent to that provided by alternative (competing software packages such as MENDEL and LEM. However, PREMIM and EMIM (used in combination considerably outperform MENDEL and LEM in terms of speed and ease of execution. Conclusions Together, EMIM and PREMIM provide easy-to-use command-line tools for the analysis of pedigree data, giving unbiased estimates of parental and child genotype relative risks.

  15. Estimating the health impacts of tobacco harm reduction policies: a simulation modeling approach.

    Science.gov (United States)

    Ahmad, Sajjad; Billimek, John

    2005-08-01

    With adult smoking prevalence rates declining too slowly to reach national objectives, opinion leaders are considering policies to improve tobacco-related outcomes by regulating the composition of cigarettes to be (1) less harmful and/or (2) less addictive. Because harm reduction efforts may actually encourage higher cigarette consumption by promoting a safer image, and addictiveness reduction may increase the harmfulness of cigarettes by encouraging compensatory smoking behaviors, policymakers must consider the tradeoffs between these two approaches when proposing legislation to control cigarette content. To estimate health impacts, we developed a dynamic computer model simulating changes in the age- and gender-specific smoking behaviors of the U.S. population over time. Secondary data for model parameters were obtained from publicly available sources. Population health impacts were measured as change in smoking prevalence and the change in cumulative quality-adjusted life-years (QALYs) in the U.S. population over 75 years. According to the risk-use threshold matrix generated by the simulation, modifying cigarettes to reduce their harmfulness and/or addictiveness could result in important gains to the nation's health. Addictiveness reduction efforts producing a 60% improvement in smoking behavior change probabilities would produce a net gain in population health at every plausible level of increase of smoking-related harm that was modeled. A 40% reduction in smoking-related harm would produce a net QALY gain at every level of behavior change considered. This research should prove useful to policymakers as they contemplate giving the FDA the authority to regulate the composition of cigarettes.

  16. Statistical tools applied for the reduction of the defect rate of coffee degassing valves

    Directory of Open Access Journals (Sweden)

    Giorgio Olmi

    2015-04-01

    Full Text Available Coffee is a very common beverage exported all over the world: just after roasting, coffee beans are packed in plastic or paper bags, which then experience long transfers with long storage times. Fresh roasted coffee emits large amounts of CO2 for several weeks. This gas must be gradually released to prevent package over-inflation and to preserve aroma; moreover, the beans must be protected from oxygen coming from outside. Therefore, one-way degassing valves are applied to each package: their correct functionality is strictly related to the interference coupling between their bodies and covers and to the correct assembly of the other involved parts. This work takes inspiration from an industrial problem: a company that assembles valve components, supplied by different manufacturers, observed a high defect rate affecting its valve production. An integrated approach, consisting of the adoption of quality charts, an experimental campaign for the dimensional analysis of the mating parts, and the statistical processing of the data, was necessary to tackle the problem. In particular, a simple statistical tool was made available to predict the defect rate and to identify the best strategy for its reduction. The outcome was that requiring a strict protocol, regarding the combinations of parts from different manufacturers for assembly, would have been almost ineffective. Conversely, this study led to the identification of the weak point in the manufacturing process of the mating components and to the suggestion of a slight improvement to be performed, with the final result of a significant (one order of magnitude) decrease in the defect rate.

  17. Software tools overview : process integration, modelling and optimisation for energy saving and pollution reduction

    OpenAIRE

    Lam, Hon Loong; Klemeš, Jiri; Kravanja, Zdravko; Varbanov, Petar

    2012-01-01

    This paper provides an overview of software tools based on long experience andapplications in the area of process integration, modelling and optimisation. The first part reviews the current design practice and the development of supporting software tools. Those are categorised as: (1) process integration and retrofit analysis tools, (2) general mathematical modelling suites with optimisation libraries, (3) flowsheeting simulation and (4) graph-based process optimisation tools. The second part...

  18. User’s guide for the Delaware River Basin Streamflow Estimator Tool (DRB-SET)

    Science.gov (United States)

    Stuckey, Marla H.; Ulrich, James E.

    2016-06-09

    IntroductionThe Delaware River Basin Streamflow Estimator Tool (DRB-SET) is a tool for the simulation of streamflow at a daily time step for an ungaged stream location in the Delaware River Basin. DRB-SET was developed by the U.S. Geological Survey (USGS) and funded through WaterSMART as part of the National Water Census, a USGS research program on national water availability and use that develops new water accounting tools and assesses water availability at the regional and national scales. DRB-SET relates probability exceedances at a gaged location to those at an ungaged stream location. Once the ungaged stream location has been identified by the user, an appropriate streamgage is automatically selected in DRB-SET using streamflow correlation (map correlation method). Alternately, the user can manually select a different streamgage or use the closest streamgage. A report file is generated documenting the reference streamgage and ungaged stream location information, basin characteristics, any warnings, baseline (minimally altered) and altered (affected by regulation, diversion, mining, or other anthropogenic activities) daily mean streamflow, and the mean and median streamflow. The estimated daily flows for the ungaged stream location can be easily exported as a text file that can be used as input into a statistical software package to determine additional streamflow statistics, such as flow duration exceedance or streamflow frequency statistics.

  19. BOMBER: A tool for estimating water quality and bottom properties from remote sensing images

    Science.gov (United States)

    Giardino, Claudia; Candiani, Gabriele; Bresciani, Mariano; Lee, Zhongping; Gagliano, Stefano; Pepe, Monica

    2012-08-01

    BOMBER (Bio-Optical Model Based tool for Estimating water quality and bottom properties from Remote sensing images) is a software package for simultaneous retrieval of the optical properties of water column and bottom from remotely sensed imagery, which makes use of bio-optical models for optically deep and optically shallow waters. Several menus allow the user to choose the model type, to specify the input and output files, and to set all of the variables involved in the model parameterization and inversion. The optimization technique allows the user to retrieve the maps of chlorophyll concentration, suspended particulate matter concentration, coloured dissolved organic matter absorption and, in case of shallow waters, bottom depth and distributions of up to three different types of substrate, defined by the user according to their albedo. The software requires input image data that must be atmospherically corrected to remote sensing reflectance values. For both deep and shallow water models, a map of the relative error involved in the inversion procedure is also given. The tool was originally intended to estimate water quality in lakes; however thanks to its general design, it can be applied to any other aquatic environments (e.g., coastal zones, estuaries, lagoons) for which remote sensing reflectance values are known. BOMBER is fully programmed in IDL (Interactive Data Language) and uses IDL widgets as graphical user interface. It runs as an add-on tool for the ENVI+IDL image processing software and is available on request.

  20. ISOT_Calc: A versatile tool for parameter estimation in sorption isotherms

    Science.gov (United States)

    Beltrán, José L.; Pignatello, Joseph J.; Teixidó, Marc

    2016-09-01

    Geochemists and soil chemists commonly use parametrized sorption data to assess transport and impact of pollutants in the environment. However, this evaluation is often hampered by a lack of detailed sorption data analysis, which implies further non-accurate transport modeling. To this end, we present a novel software tool to precisely analyze and interpret sorption isotherm data. Our developed tool, coded in Visual Basic for Applications (VBA), operates embedded within the Microsoft Excel™ environment. It consists of a user-defined function named ISOT_Calc, followed by a supplementary optimization Excel macro (Ref_GN_LM). The ISOT_Calc function estimates the solute equilibrium concentration in the aqueous and solid phases (Ce and q, respectively). Hence, it represents a very flexible way in the optimization of the sorption isotherm parameters, as it can be carried out over the residuals of q, Ce, or both simultaneously (i.e., orthogonal distance regression). The developed function includes the most usual sorption isotherm models, as predefined equations, as well as the possibility to easily introduce custom-defined ones. Regarding the Ref_GN_LM macro, it allows the parameter optimization by using a Levenberg-Marquardt modified Gauss-Newton iterative procedure. In order to evaluate the performance of the presented tool, both function and optimization macro have been applied to different sorption data examples described in the literature. Results showed that the optimization of the isotherm parameters was successfully achieved in all cases, indicating the robustness and reliability of the developed tool. Thus, the presented software tool, available to researchers and students for free, has proven to be a user-friendly and an interesting alternative to conventional fitting tools used in sorption data analysis.
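
    Although ISOT_Calc itself is a VBA/Excel tool, the same kind of fit can be sketched in a few lines of Python; the example below fits a Freundlich isotherm by ordinary least squares on q (the orthogonal-distance option of the tool is not reproduced) using synthetic sorption data.

```python
import numpy as np
from scipy.optimize import curve_fit

def freundlich(ce, kf, n):
    """Freundlich isotherm: q = Kf * Ce**(1/n)."""
    return kf * ce ** (1.0 / n)

# Synthetic sorption data (Ce in mg/L, q in mg/kg); values are illustrative only.
ce = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0, 50.0])
q_obs = 120.0 * ce ** (1 / 2.2) * np.random.default_rng(3).lognormal(0, 0.05, ce.size)

popt, pcov = curve_fit(freundlich, ce, q_obs, p0=[50.0, 1.5])
perr = np.sqrt(np.diag(pcov))
print(f"Kf = {popt[0]:.1f} +/- {perr[0]:.1f}, n = {popt[1]:.2f} +/- {perr[1]:.2f}")
```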

  1. Estimation and reduction of CO{sub 2} emissions from crude oil distillation units

    Energy Technology Data Exchange (ETDEWEB)

    Gadalla, M. [Departament d' Engingeria Quimica, Universitat Rovira i Virgili, Paisos Catalans 26, 43007 Tarragona (Spain)]. E-mail: mamdouh.gadalla@urv.net; Olujic, Z. [Laboratory for Process Equipment, TU Delft, Leeghwaterstraat 44, 2628 CA Delft (Netherlands); Jobson, M. [Centre for Process Integration, CEAS, The University of Manchester, PO Box 88, Manchester, M60 1QD, UK (United Kingdom); Smith, R. [Centre for Process Integration, CEAS, The University of Manchester, PO Box 88, Manchester, M60 1QD, UK (United Kingdom)

    2006-10-15

    Distillation systems are energy-intensive processes, and consequently contribute significantly to the greenhouse gases emissions (e.g. carbon dioxide (CO{sub 2}). A simple model for the estimation of CO{sub 2} emissions associated with operation of heat-integrated distillation systems as encountered in refineries is introduced. In conjunction with a shortcut distillation model, this model has been used to optimize the process conditions of an existing crude oil atmospheric tower unit aiming at minimization of CO{sub 2} emissions. Simulation results indicate that the total CO{sub 2} emissions of the existing crude oil unit can be cut down by 22%, just by changing the process conditions accordingly, and that the gain in this respect can be doubled by integrating a gas turbine. In addition, emissions reduction is accompanied by substantial profit increase due to utility saving and/or export.
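
    A generic, simplified version of such an emissions estimate ties the fired fuel duty to a CO2 rate through the fuel's heating value and carbon content; the sketch below uses hypothetical furnace numbers and simply applies the 22% reduction reported above to show the scale of the saving.

```python
def co2_emissions(q_fuel_kw, nhv_kj_per_kg, carbon_fraction):
    """Approximate CO2 emission rate [kg/s] from fuel firing.

    q_fuel_kw       : fuel heat duty [kW] delivered by the furnace/boiler
    nhv_kj_per_kg   : net heating value of the fuel [kJ/kg]
    carbon_fraction : mass fraction of carbon in the fuel (0-1)
    The carbon is assumed to oxidise completely, so each kg of carbon
    yields 44/12 = 3.67 kg of CO2.
    """
    fuel_rate = q_fuel_kw / nhv_kj_per_kg          # kg of fuel burned per second
    return fuel_rate * carbon_fraction * 44.0 / 12.0

# Hypothetical crude-unit furnace: 60 MW fired duty, refinery fuel gas.
base = co2_emissions(60_000, nhv_kj_per_kg=47_000, carbon_fraction=0.75)
improved = base * (1 - 0.22)                       # the 22% cut reported above
print(f"base: {base * 3600:.0f} kg CO2/h, after retrofit: {improved * 3600:.0f} kg CO2/h")
```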

  2. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    Full Text Available This paper discusses the topics related to automating parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, before implementing any state estimator, the system should be analyzed for structural observability and the structural observability analysis can be automated using Modelica and Python. As a result of structural observability analysis, the system may be decomposed into subsystems where some of them may be observable --- with respect to parameter, disturbances, and states --- while some may not. The state estimation process is carried out for those observable subsystems and the optimum number of additional measurements are prescribed for unobservable subsystems to make them observable. In this paper, an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway. The copper production process is a large-scale complex system. It is shown how to implement various state estimators, in Python, to estimate parameters and disturbances, in addition to states, based on available measurements.

  3. bz-rates: A Web Tool to Estimate Mutation Rates from Fluctuation Analysis.

    Science.gov (United States)

    Gillet-Markowska, Alexandre; Louvel, Guillaume; Fischer, Gilles

    2015-09-02

    Fluctuation analysis is the standard experimental method for measuring mutation rates in micro-organisms. The appearance of mutants is classically described by a Luria-Delbrück distribution composed of two parameters: the number of mutations per culture (m) and the differential growth rate between mutant and wild-type cells (b). A precise estimation of these two parameters is a prerequisite to the calculation of the mutation rate. Here, we developed bz-rates, a Web tool to calculate mutation rates that provides three useful advances over existing Web tools. First, it allows taking into account b, the differential growth rate between mutant and wild-type cells, in the estimation of m with the generating function. Second, bz-rates allows the user to take into account a deviation from the Luria-Delbrück distribution called z, the plating efficiency, in the estimation of m. Finally, the Web site provides a graphical visualization of the goodness-of-fit between the experimental data and the model. bz-rates is accessible at http://www.lcqb.upmc.fr/bzrates.
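
    For comparison with the generating-function estimator used by bz-rates, the much simpler p0 method can be written in a few lines; the culture counts and cell numbers below are hypothetical, and the method ignores both the differential growth rate (b) and the plating efficiency (z) handled by the web tool.

```python
import numpy as np

def p0_method(mutant_counts, cells_per_culture):
    """Estimate mutations per culture (m) and the mutation rate with the p0 method.

    m = -ln(P0), where P0 is the fraction of cultures with zero mutants;
    the mutation rate is then m divided by the number of cells per culture.
    """
    counts = np.asarray(mutant_counts)
    p0 = np.mean(counts == 0)
    if p0 == 0.0:
        raise ValueError("p0 method needs at least one culture with zero mutants")
    m = -np.log(p0)
    return m, m / cells_per_culture

# Hypothetical fluctuation assay: 30 parallel cultures, 2e8 cells each.
counts = [0, 0, 1, 0, 3, 0, 0, 12, 0, 1, 0, 0, 2, 0, 0,
          0, 5, 0, 1, 0, 0, 0, 27, 0, 0, 1, 0, 0, 4, 0]
m, rate = p0_method(counts, cells_per_culture=2e8)
print(f"m = {m:.2f} mutations/culture, mutation rate ~ {rate:.2e} per cell per division")
```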

  4. Development of a statistical tool for the estimation of riverbank erosion probability

    Science.gov (United States)

    Varouchakis, Emmanouil

    2016-04-01

    Riverbank erosion affects river morphology and local habitat, and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of the vulnerable areas in order to predict river changes and assist stream management/restoration. An approach to predict areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, an innovative statistical methodology is proposed to predict the probability of presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the evaluation of the statistical tool performance. The statistical tool is based on a series of independent local variables and employs the Logistic Regression methodology. It is developed in two forms, Logistic Regression and Locally Weighted Logistic Regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results as it validates the presence or absence of erosion at all validation locations. The proposed tool is easy to use, accurate and can be applied to any region and river. Varouchakis, E. A., Giannakis, G. V., Lilli, M. A., Ioannidou, E., Nikolaidis, N. P., and Karatzas, G. P.: Development of a statistical tool for the estimation of riverbank erosion probability, SOIL (EGU), in print, 2016.
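
    A plain (not locally weighted) logistic regression of erosion presence/absence on a few local covariates can be sketched as below; the covariates, coefficients and data are synthetic and only illustrate the kind of probability output the proposed tool delivers.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic river sections: covariates that might prevent or stimulate erosion
# (e.g. bank slope, a shear-stress proxy, vegetation cover); values are illustrative only.
rng = np.random.default_rng(0)
n = 400
slope = rng.uniform(10, 70, n)            # bank slope [deg]
shear = rng.uniform(0, 1, n)              # normalised shear-stress proxy
veg = rng.uniform(0, 1, n)                # vegetation cover fraction
logit = -4.0 + 0.05 * slope + 3.0 * shear - 2.5 * veg
erosion = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))   # observed presence/absence

X = np.column_stack([slope, shear, veg])
model = LogisticRegression(max_iter=1000).fit(X, erosion)

# Probability of erosion for a new river section.
new_section = np.array([[45.0, 0.8, 0.2]])
print("erosion probability:", model.predict_proba(new_section)[0, 1].round(2))
print("coefficients:", dict(zip(["slope", "shear", "veg"], model.coef_[0].round(2))))
```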

  5. SANDY: A Matlab tool to estimate the sediment size distribution from a sieve analysis

    Science.gov (United States)

    Ruiz-Martínez, Gabriel; Rivillas-Ospina, Germán Daniel; Mariño-Tapia, Ismael; Posada-Vanegas, Gregorio

    2016-07-01

    This paper presents a new computational tool called SANDY© which calculates the sediment size distribution and its textural parameters from a sieved sediment sample using Matlab®. The tool has been developed for professionals involved in the study of sediment transport along coastal margins, estuaries, rivers and desert dunes. The algorithm uses several types of statistical analyses to obtain the main textural characteristics of the sediment sample (D50, mean, sorting, skewness and kurtosis). SANDY© includes the method of moments (geometric, arithmetic and logarithmic approaches) and graphical methods (geometric, arithmetic and mixed approaches). In addition, it provides graphs of the sediment size distribution and its classification. The computational tool automatically exports all the graphs as enhanced metafile images and the final report is also exported as a plain text file. Parameters related to bed roughness such as Nikuradse and roughness length are also computed. Theoretical depositional environments are established by a discriminant function analysis. Using the uniformity coefficient the hydraulic conductivity of the sand as well as the porosity and void ratio of the sediment sample are obtained. The maximum relative density related to sand compaction is also computed. The Matlab® routine can compute one or several samples. SANDY© is a useful tool for estimating the sediment textural parameters which are the basis for studies of sediment transport.
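
    The logarithmic method-of-moments part of such a tool reduces to a few weighted sums over the sieve classes; the sketch below computes mean, sorting, skewness and kurtosis in phi units from an invented sieve stack (SANDY itself also implements the graphical measures, hydraulic conductivity and the other outputs described above).

```python
import numpy as np

def grain_size_moments(sieve_phi, weight_retained):
    """Logarithmic (phi) method-of-moments statistics from a sieve analysis.

    sieve_phi       : midpoint grain size of each class in phi units
    weight_retained : weight retained in each class (any consistent unit)
    Returns mean, sorting (standard deviation), skewness and kurtosis in phi units.
    """
    phi = np.asarray(sieve_phi, dtype=float)
    f = np.asarray(weight_retained, dtype=float)
    f = f / f.sum()                       # weight fractions
    mean = np.sum(f * phi)
    std = np.sqrt(np.sum(f * (phi - mean) ** 2))
    skew = np.sum(f * (phi - mean) ** 3) / std ** 3
    kurt = np.sum(f * (phi - mean) ** 4) / std ** 4
    return mean, std, skew, kurt

# Illustrative sieve stack (phi midpoints) and retained weights in grams.
phi_mid = [-1.0, 0.0, 1.0, 2.0, 3.0, 4.0]
weights = [5.0, 18.0, 40.0, 25.0, 9.0, 3.0]
mean, sorting, skew, kurt = grain_size_moments(phi_mid, weights)
print(f"mean={mean:.2f} phi, sorting={sorting:.2f}, skewness={skew:.2f}, kurtosis={kurt:.2f}")
```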

  6. Recov'Heat: An estimation tool of urban waste heat recovery potential in sustainable cities

    Science.gov (United States)

    Goumba, Alain; Chiche, Samuel; Guo, Xiaofeng; Colombert, Morgane; Bonneau, Patricia

    2017-02-01

    Waste heat recovery is considered as an efficient way to increase carbon-free green energy utilization and to reduce greenhouse gas emission. Especially in urban areas, several sources such as sewage water, industrial processes, waste incinerator plants, etc., are still rarely exploited. Their integration into a district heating system providing heating and/or domestic hot water could be beneficial for both energy companies and local governments. EFFICACITY, a French research institute focused on urban energy transition, has developed an estimation tool for different waste heat sources potentially exploitable in a sustainable city. This article presents the development method of such a decision-making tool which, by providing both an energy and an economic analysis, helps local communities and energy service companies carry out preliminary studies of heat recovery projects.

  7. Qualitative: Python Tool for MT Quality Estimation Supporting Server Mode and Hybrid MT

    Directory of Open Access Journals (Sweden)

    Avramidis Eleftherios

    2016-10-01

    We are presenting the development contributions of the last two years to our open-source Python Quality Estimation tool, a tool that can function in both experiment mode and online web-service mode. The latest version provides a new MT interface, which communicates with SMT and rule-based translation engines and supports on-the-fly sentence selection. Additionally, we present an improved Machine Learning interface allowing more efficient communication with several state-of-the-art toolkits. Additions also include a more informative training process, a Python re-implementation of QuEst baseline features, a new LM toolkit integration, an additional PCFG parser and alignments of syntactic nodes.

  8. Qualitative: Open Source Python Tool for Quality Estimation over Multiple Machine Translation Outputs

    Directory of Open Access Journals (Sweden)

    Eleftherios Avramidis

    2014-09-01

    “Qualitative” is a Python toolkit for ranking and selection of sentence-level output by different MT systems using Quality Estimation. The toolkit implements a basic pipeline for annotating the given sentences with black-box features. It then applies a machine learning mechanism in order to rank data based on models pre-trained on human preferences. The preprocessing pipeline includes support for language models, PCFG parsing, language checking tools and various other pre-processors and feature generators. The code follows the principles of object-oriented programming to allow modularity and extensibility. The tool can operate by processing both batch files and single sentences. An XML-RPC interface is provided for hooking up with web services and a graphical animated web-based interface demonstrates its potential on-line use.

  9. Estimating the impact of plasma HIV-1 RNA reductions on heterosexual HIV-1 transmission risk.

    Directory of Open Access Journals (Sweden)

    Jairam R Lingappa

    BACKGROUND: The risk of sexual transmission of HIV-1 is strongly associated with the level of HIV-1 RNA in plasma, making reduction in HIV-1 plasma levels an important target for HIV-1 prevention interventions. A quantitative understanding of the relationship of plasma HIV-1 RNA and HIV-1 transmission risk could help predict the impact of candidate HIV-1 prevention interventions that operate by reducing plasma HIV-1 levels, such as antiretroviral therapy (ART), therapeutic vaccines, and other non-ART interventions. METHODOLOGY/PRINCIPAL FINDINGS: We use prospective data collected from 2004 to 2008 in East and Southern African HIV-1 serodiscordant couples to model the relationship of plasma HIV-1 RNA levels and heterosexual transmission risk, with confirmation of HIV-1 transmission events by HIV-1 sequencing. The model is based on follow-up of 3381 HIV-1 serodiscordant couples over 5017 person-years encompassing 108 genetically linked HIV-1 transmission events. HIV-1 transmission risk was 2.27 per 100 person-years with a log-linear relationship to log10 plasma HIV-1 RNA. The model predicts that a decrease in average plasma HIV-1 RNA of 0.74 log10 copies/mL (95% CI 0.60 to 0.97) reduces heterosexual transmission risk by 50%, regardless of the average starting plasma HIV-1 level in the population and independent of other HIV-1-related population characteristics. In a simulated population with a similar plasma HIV-1 RNA distribution, the model estimates that 90% of overall HIV-1 infections averted by a 0.74 log10 copies/mL reduction in plasma HIV-1 RNA could be achieved by targeting this reduction to the 58% of the cohort with plasma HIV-1 levels ≥4 log10 copies/mL. CONCLUSIONS/SIGNIFICANCE: This log-linear model of plasma HIV-1 levels and risk of sexual HIV-1 transmission may help estimate the impact on HIV-1 transmission and infections averted from candidate interventions that reduce plasma HIV-1 RNA levels.
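
    The reported log-linear relationship lends itself to quick back-of-the-envelope calculations. The snippet below simply applies the halving rule from the abstract (each 0.74 log10 copies/mL reduction halves transmission risk) to a few hypothetical average reductions; it illustrates the model's scaling and is not a reproduction of the authors' fitted model.

```python
# Apply the log-linear halving rule: relative risk = 0.5 ** (delta / 0.74),
# where delta is the average reduction in plasma HIV-1 RNA (log10 copies/mL).
# The baseline incidence of 2.27 per 100 person-years comes from the abstract;
# the delta values below are hypothetical.
baseline_risk = 2.27      # transmissions per 100 person-years
halving_step = 0.74       # log10 copies/mL per 50% risk reduction (95% CI 0.60-0.97)

for delta in (0.5, 0.74, 1.0, 1.5):
    rr = 0.5 ** (delta / halving_step)
    print(f"reduction of {delta:.2f} log10 copies/mL -> relative risk {rr:.2f} "
          f"(~{baseline_risk * rr:.2f} per 100 person-years)")
```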

  10. Estimation of Injected Carbon Longevity and Re-oxidation Times at Enhanced Reductive Bioremediation Sites

    Science.gov (United States)

    Tillotson, J.; Borden, R. C.

    2014-12-01

    Addition of an organic substrate to provide an electron donor and carbon source can be very effective at stimulating enhanced reductive bioremediation (ERB) of chlorinated solvents, energetics, and other groundwater contaminants. However, the quantity of electron donor added is usually based on an individual's or company's "rule of thumb" rather than considering site-specific conditions such as groundwater velocity, carbon source, and upgradient electron acceptor concentrations, potentially leading to unnecessarily large amounts of carbon injected. Mass balance estimates indicate that over 99% of electrons donated go to electron acceptors other than the primary contaminants. Thus, injecting excessive amounts of organic carbon can lead to a persistent reducing zone, releasing elevated levels of dissolved manganese, iron, methane, and sometimes arsenic. Monitoring data on carbon injections and electron acceptors were collected from 33 ERB sites. Two approaches were then used to evaluate carbon longevity and the time required to return to near-oxic conditions at an ERB site. The first method employed a simple mass balance approach, using such input parameters as groundwater velocity, upgradient electron acceptors, and amount of carbon injected. In the second approach, a combined flow, transport and geochemical model was developed using PHT3D to estimate the impact of ERB on secondary water quality impacts (SWQIs; e.g., methane production, iron mobilization and transport, etc.). The model was originally developed for use in estimating SWQIs released from petroleum sites, but has since been modified for use at ERB sites. The ERB site to be studied is a perchlorate release site in Elkton, Maryland, where 840 lbs of emulsified vegetable oil were injected. The results from the simple mass balance approach and PHT3D model will be compared and used to identify conditions where the simplified approach may be appropriate.
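
    A heavily simplified version of the first (mass-balance) approach can be written in a few lines: the substrate longevity is roughly the electron-donor capacity of the injected carbon divided by the flux of competing electron acceptors through the treatment zone. All numbers below are illustrative assumptions, not data from the Elkton site, and the stoichiometric factor is a rounded placeholder.

```python
# Highly simplified mass-balance sketch of substrate longevity, in the spirit of the
# first approach described above. All inputs are hypothetical illustration values,
# not site data, and the donor-capacity factor is a rounded placeholder.
injected_oil_kg = 380.0              # ~840 lbs of emulsified vegetable oil
donor_capacity_g_per_kg = 3000.0     # assumed electron-donor capacity (g O2-equivalents per kg oil)

# Electron-acceptor flux entering the reactive zone (all assumed values).
darcy_velocity_m_per_yr = 10.0
cross_section_m2 = 50.0
acceptor_conc_mg_per_L = 40.0        # O2 + NO3 + SO4 expressed as O2-equivalents

water_flux_L_per_yr = darcy_velocity_m_per_yr * cross_section_m2 * 1000.0
acceptor_flux_g_per_yr = water_flux_L_per_yr * acceptor_conc_mg_per_L / 1000.0

longevity_yr = injected_oil_kg * donor_capacity_g_per_kg / acceptor_flux_g_per_yr
print(f"estimated substrate longevity ≈ {longevity_yr:.1f} years")
```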

  11. Temperature dependent effective friction coefficient estimation in friction stir welding with the bobbin tool

    Directory of Open Access Journals (Sweden)

    Mijajlović Miroslav M.

    2016-01-01

    In many friction stir welding studies the friction coefficient is generally used as an effective, constant value, without regard for the adaptable and changeable nature of friction during the welding sequence. This is understandable because the main difficulties in analyzing friction in friction stir welding are the complex nature of the friction processes, the case-dependent and time-dependent contact between the bodies, and the influence of temperature, sliding velocity, etc. This paper presents a combined experimental-numerical-analytical model for estimating the effective friction coefficient at the contact between the bobbin tool and the welding plates during welding, considering the temperature at the contact as the parameter with the greatest influence on friction. The estimation criterion is the correspondence of the experimental temperature and the temperature from the numerical model. The estimation procedure is iterative and parametric - the heat transport parameters and the friction coefficient are adapted during the estimation procedure in a realistic manner to achieve a relative difference between the experimental and model temperatures of less than 3%. The results show that the friction coefficient varies from 0.01 to 0.21 for steel-aluminium alloy contact over a temperature range from 406°C down to 22°C.
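
    The iterative, parametric nature of the estimation procedure can be sketched with a simple calibration loop: adjust the friction coefficient until the modelled temperature matches the measured one to within 3%. The thermal model below is a crude placeholder rather than the authors' numerical model, and the measured temperature and constants are invented.

```python
# Sketch of the iterative calibration idea: bisect on the friction coefficient until
# the model temperature agrees with the measurement to within 3%. The thermal model
# is a toy placeholder; measured_T and all constants are illustrative only.
def model_peak_temperature(mu, omega=1200.0, force=8000.0, k_loss=0.012):
    """Toy steady-state temperature rise proportional to frictional power (placeholder)."""
    ambient = 22.0
    return ambient + k_loss * mu * force * omega / 100.0

measured_T = 380.0   # hypothetical measured peak temperature (deg C)

lo, hi = 0.01, 0.5   # admissible friction-coefficient range
for _ in range(100):
    mu = 0.5 * (lo + hi)
    T = model_peak_temperature(mu)
    rel_err = abs(T - measured_T) / measured_T
    if rel_err < 0.03:            # stop when model and experiment agree within 3%
        break
    if T < measured_T:
        lo = mu
    else:
        hi = mu

print(f"calibrated mu ≈ {mu:.3f}, model T = {T:.0f} °C, relative error = {rel_err:.1%}")
```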

  12. Estimating the climate and air quality benefits of aviation fuel and emissions reductions

    Science.gov (United States)

    Dorbian, Christopher S.; Wolfe, Philip J.; Waitz, Ian A.

    2011-05-01

    In this study we consider the implications of our current understanding of aviation climate impacts as it relates to the ratio of non-CO2 to CO2 effects from aviation. We take as inputs recent estimates from the literature of the magnitude of the component aviation impacts and associated uncertainties. We then employ a simplified probabilistic impulse response function model for the climate and a range of damage functions to estimate the ratio of non-CO2 to CO2 impacts of aviation for a range of different metrics, scientific assumptions, future background emissions scenarios, economic growth scenarios, and discount rates. We take cost-benefit analysis as our primary context and thus focus on integral metrics that can be related to damages: the global warming potential, the time-integrated change in surface temperature, and the net present value of damages. We also present results based on an endpoint metric, the global temperature change potential. These latter results would be more appropriate for use in a cost-effectiveness framework (e.g., with a well-defined policy target for the anthropogenic change in surface temperature at a specified time in the future). We find that the parameter that most influences the ratio of non-CO2 to CO2 impacts of aviation is the discount rate, or analogously the time window used for physical metrics; both are expressions of the relative importance of long-lived versus short-lived impacts. Second to this is the influence of the radiative forcing values that are assumed for aviation-induced cloudiness effects. Given the large uncertainties in short-lived effects from aviation, and the dominating influence of discounting or time-windowing, we find that the choice of metric is relatively less influential. We express the ratios of non-CO2 to CO2 impacts on a per unit fuel burn basis so that they can be multiplied by a social cost of carbon to estimate the additional benefits of fuel burn reductions from aviation beyond those

  13. Using the soil and water assessment tool to estimate achievable water quality targets through implementation of beneficial management practices in an agricultural watershed.

    Science.gov (United States)

    Yang, Qi; Benoy, Glenn A; Chow, Thien Lien; Daigle, Jean-Louis; Bourque, Charles P-A; Meng, Fan-Rui

    2012-01-01

    Runoff from crop production in agricultural watersheds can cause widespread soil loss and degradation of surface water quality. Beneficial management practices (BMPs) for soil conservation are often implemented as remedial measures because BMPs can reduce soil erosion and improve water quality. However, the efficacy of BMPs may be unknown because it can be affected by many factors, such as farming practices, land-use, soil type, topography, and climatic conditions. As such, it is difficult to estimate the impacts of BMPs on water quality through field experiments alone. In this research, the Soil and Water Assessment Tool was used to estimate achievable performance targets of water quality indicators (sediment and soluble P loadings) after implementation of combinations of selected BMPs in the Black Brook Watershed in northwestern New Brunswick, Canada. Four commonly used BMPs (flow diversion terraces [FDTs], fertilizer reductions, tillage methods, and crop rotations) were considered individually and in different combinations. At the watershed level, the best achievable sediment loading was 1.9 t ha(-1) yr(-1) (89% reduction compared with the default scenario), with a BMP combination of crop rotation, FDT, and no-till. The best achievable soluble P loading was 0.5 kg ha(-1) yr(-1) (62% reduction), with a BMP combination of crop rotation, FDT, and fertilizer reduction. Targets estimated through nonpoint source water quality modeling can be used to evaluate BMP implementation initiatives and provide milestones for the rehabilitation of streams and rivers in agricultural regions.

  14. FEA Based Tool Life Quantity Estimation of Hot Forging Dies Under Cyclic Thermo-Mechanical Loads

    Science.gov (United States)

    Behrens, B.-A.; Bouguecha, A.; Schäfer, F.; Hadifi, T.

    2011-01-01

    Hot forging dies are exposed during service to a combination of cyclic thermo-mechanical, tribological and chemical loads. Besides abrasive and adhesive wear on the die surface, fatigue crack initiation with subsequent fracture is one of the most frequent causes of failure. In order to extend the tool life, the finite element analysis (FEA) may serve as a means for process design and process optimisation. So far the FEA based estimation of the production cycles until initial cracking is limited as tool material behaviour due to repeated loading is not captured with the required accuracy. Material models which are able to account for cyclic effects are not verified for the fatigue life predictions of forging dies. Furthermore fatigue properties from strain controlled fatigue tests of relevant hot work steels are to date not available to allow for a close-to-reality fatigue life prediction. Two industrial forging processes, where clear fatigue crack initiation has been observed are considered for a fatigue analysis. For this purpose the relevant tool components are modelled with elasto-plastic material behaviour. The predicted sites, where crack initiation occurs, agree with the ones observed on the real die component.

  15. Treatment Cost Analysis Tool (TCAT) for estimating costs of outpatient treatment services.

    Science.gov (United States)

    Flynn, Patrick M; Broome, Kirk M; Beaston-Blaakman, Aaron; Knight, Danica K; Horgan, Constance M; Shepard, Donald S

    2009-02-01

    A Microsoft Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment were $882, $1310, and $1381 respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 intensive, and $86 mixed. Group counseling hourly costs per client were $8, $11, and $10 respectively for regular, intensive, and mixed. Future directions include use of a web-based interview version, much like some of the commercially available tax preparation software tools, and extensions for use in other modalities of treatment.
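
    The episode costs above are reported as 20% trimmed means. A minimal sketch of that statistic, on invented program cost data rather than TCAT output, is shown below.

```python
# 20% trimmed mean on made-up episode-cost data (illustrative values, not TCAT output).
import numpy as np
from scipy import stats

episode_costs = np.array([640, 710, 760, 820, 850, 880, 910, 980, 1050, 2900])  # dollars

trimmed = stats.trim_mean(episode_costs, proportiontocut=0.20)  # drop lowest and highest 20%
print(f"plain mean  = ${episode_costs.mean():,.0f}")
print(f"20% trimmed = ${trimmed:,.0f}   (less sensitive to the $2,900 outlier)")
```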

  16. Vibration reduction of pneumatic percussive rivet tools: mechanical and ergonomic re-design approaches.

    Science.gov (United States)

    Cherng, John G; Eksioglu, Mahmut; Kizilaslan, Kemal

    2009-03-01

    This paper presents a systematic design approach, which is the result of years of research effort, to the ergonomic re-design of rivet tools, i.e. rivet hammers and bucking bars. The investigation was carried out using both an ergonomic approach and a mechanical analysis of the rivet tools' dynamic behavior. The optimal mechanical design parameters of the re-designed rivet tools were determined by the Taguchi method. Two ergonomically re-designed rivet tools with vibration damping/isolation mechanisms were tested against two conventional rivet tools in both laboratory and field tests. Vibration characteristics of both types of tools were measured in laboratory tests using a custom-made test fixture. The subjective field evaluations of the tools were performed by six experienced riveters at an aircraft repair shop. Results indicate that the isolation spring and polymer damper are very effective in reducing the overall level of vibration under both unweighted and weighted acceleration conditions. The mass of the dolly head and the housing played a significant role in the vibration absorption of the bucking bars. Another important result was that ductile iron has better vibration-reducing capability compared to steel and aluminum for bucking bars. Mathematical simulation results were also consistent with the experimental results. The overall conclusion obtained from the study was that by applying the design principles of ergonomics and by adding vibration damping/isolation mechanisms to the rivet tools, the vibration level can be significantly reduced and the tools become safer and more user-friendly. The details of the experience learned, design modifications, test methods, mathematical models and the results are included in the paper.

  17. A correction in the CDM methodological tool for estimating methane emissions from solid waste disposal sites.

    Science.gov (United States)

    Santos, M M O; van Elk, A G P; Romanel, C

    2015-12-01

    Solid waste disposal sites (SWDS) - especially landfills - are a significant source of methane, a greenhouse gas. Although it has the potential to be captured and used as a fuel, most of the methane formed in SWDS is emitted to the atmosphere, mainly in developing countries. Methane emissions have to be estimated in national inventories. To help this task the Intergovernmental Panel on Climate Change (IPCC) has published three sets of guidelines. In addition, the Kyoto Protocol established the Clean Development Mechanism (CDM) to assist the developed countries to offset their own greenhouse gas emissions by assisting other countries to achieve sustainable development while reducing emissions. Based on methodologies provided by the IPCC regarding SWDS, the CDM Executive Board has issued a tool to be used by project developers for estimating baseline methane emissions in their project activities - on burning biogas from landfills or on preventing biomass from being landfilled and so avoiding methane emissions. Some inconsistencies in the first two IPCC guidelines have already been pointed out in an Annex of the latest IPCC edition, although the details remain somewhat hidden. The CDM tool uses a model for methane estimation that takes on board parameters, factors and assumptions provided in the latest IPCC guidelines, while using as its core equation the one from the second IPCC edition, with its shortcoming, as well as allowing a misunderstanding of the time variable. Consequences of wrong ex-ante estimation of baseline emissions regarding CDM project activities can be economic or environmental. An example of the first type is the overestimation of 18% in an actual project on biogas from landfill in Brazil, which harms its developers; of the second type, the overestimation of 35% in a project preventing municipal solid waste from being landfilled in China, which harms the environment, not for the project per se but for the unduly generated carbon credits. In a simulated landfill - the same
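
    For orientation, the kind of model under discussion is a first-order-decay (FOD) model in which waste deposited in year x contributes methane in year t in proportion to exp(-k(t-x-1)) - exp(-k(t-x)). The sketch below uses this textbook IPCC-style form with invented parameter values; it is not the exact CDM-tool equation whose shortcomings the paper analyses.

```python
# Generic first-order-decay (FOD) sketch of landfill methane generation.
# Parameter values (k, L0, deposited tonnages) are invented illustrations; this is
# NOT the CDM-tool equation examined in the paper.
import math

k = 0.17          # decay rate constant (1/yr), assumed
L0 = 80.0         # methane generation potential (kg CH4 per tonne of waste), assumed
waste_t = {year: 10000.0 for year in range(2000, 2010)}   # tonnes landfilled per year, assumed

def ch4_generated(t):
    """Methane generated in year t (kg) from all waste deposited in earlier years."""
    total = 0.0
    for x, mass in waste_t.items():
        if x < t:
            total += mass * L0 * (math.exp(-k * (t - x - 1)) - math.exp(-k * (t - x)))
    return total

for t in (2005, 2010, 2020):
    print(f"{t}: ~{ch4_generated(t) / 1000:.0f} t CH4 generated")
```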

  18. The Influence of Tool Texture on Friction and Lubrication in Strip Reduction Testing

    DEFF Research Database (Denmark)

    Sulaiman, Mohd Hafis Bin; Christiansen, Peter; Bay, Niels Oluf

    2017-01-01

    Pockets should be shaped similar to flat table mountains to avoid mechanical interlocking in the valleys; otherwise, an increase in drawing load and pick-up on the tools is observed. The textured tool surface lowers friction and improves lubrication performance, provided that the distance between pockets is 2–4 times larger than the pocket width. Larger drawing speed facilitates escape of the entrapped lubricant from the pockets. Testing with low-to-medium viscosity oils leads to a low sheet roughness on the plateaus, but also local workpiece material pick-up on the tool plateaus. Large lubricant viscosity results in higher sheet...

  19. On-Line Flutter Prediction Tool for Wind Tunnel Flutter Testing using Parameter Varying Estimation Methodology Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology, Inc. (ZONA) proposes to develop an on-line flutter prediction tool using the parameter varying estimation (PVE) methodology, called the PVE Toolbox,...

  20. On-Line Flutter Prediction Tool for Wind Tunnel Flutter Testing using Parameter Varying Estimation Methodology Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology, Inc. (ZONA) proposes to develop an on-line flutter prediction tool for wind tunnel model using the parameter varying estimation (PVE) technique to...

  1. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    Science.gov (United States)

    Sirirojvisuth, Apinut

    In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet, there is a lack of structured methodology that quantifies how changes in the design decisions impact these metrics. As a result, a new set of integrated cost analysis tools are proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into the existing design methodologies without sacrificing agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop software architecture that represents Integrated Product and Process Development (IPPD) methodology used in several aerospace systems designs. The environment seamlessly integrates product and process analysis tools and makes effective transition from one design phase to the other while retaining knowledge gained a priori. Then, an advanced cost estimating tool called Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high fidelity cost tools like process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. Therefore, with this

  2. The App SOC plus a tool to estimate and calculate organic carbon in the soil profile

    Directory of Open Access Journals (Sweden)

    Francisco Bautista

    2016-04-01

    In the world, researchers are working very intensively on the development of soil organic carbon (SOC) inventories. Soil organic carbon is very important because it constitutes the largest reservoir of carbon in terrestrial ecosystems. Maintaining and increasing soil carbon is an option to reduce the amounts of CO2 in the atmosphere, and thereby to reduce or mitigate climate change. The SOC is now a topic of great interest, hence it is recommended to know the amount of SOC along the profile to select and evaluate those areas that should be preserved. The aims of developing App SOC plus were to eliminate calculation errors of SOC and to make a tool to estimate SOC in the field. The common units of measurement of soil properties were employed: bulk density in mg mL−1, horizon thickness in centimetres, stoniness and organic carbon in percentage. The App SOC plus was developed on the Android platform. App SOC plus involves a three-step process: introduction of soil properties, calculation of SOC for the horizon and the soil profile, and conversion of units using the international and English systems. As a result, there will no longer be confusion with conversion of units when using App SOC plus; with App SOC plus the soil organic carbon can now be calculated and/or estimated because it provides instructions (aids) to estimate the soil properties necessary to calculate the SOC in the soil profile, saving time in the calculation of SOC. App SOC plus is a tool for diagnosis in the field.

  3. Phase-processing as a tool for speckle reduction in pulse-echo images

    DEFF Research Database (Denmark)

    Healey, AJ; Leeman, S; Forsberg, F

    1991-01-01

    Due to the coherent nature of conventional ultrasound medical imaging systems, interference artefacts occur in pulse-echo images. These artefacts are generically termed 'speckle'. The phenomenon may severely limit low contrast resolution, with clinically relevant information being obscured. Traditional speckle reduction procedures regard speckle correction as a stochastic process and trade image smoothing (resolution loss) for speckle reduction. Recently, a new phase-acknowledging technique has been proposed that is unique in its ability to correct for speckle interference with no image...

  4. Optimal Wavelength Selection in Ultraviolet Spectroscopy for the Estimation of Toxin Reduction Ratio during Hemodialysis

    Directory of Open Access Journals (Sweden)

    Amir Ghanifar

    2016-06-01

    Introduction: The concentration of substances, including urea, creatinine, and uric acid, can be used as an index to measure toxic uremic solutes in the blood during dialysis and interdialytic intervals. The on-line monitoring of toxin concentration allows for the clearance measurement of some low-molecular-weight solutes at any time during hemodialysis. The aim of this study was to determine the optimal wavelength for estimating the changes in urea, creatinine, and uric acid in dialysate, using ultraviolet (UV) spectroscopy. Materials and Methods: In this study, nine uremic patients were investigated, using on-line spectrophotometry. The on-line absorption measurements (UV radiation) were performed with a spectrophotometer module connected to the fluid outlet of the dialysis machine. Dialysate samples were obtained and analyzed using standard biochemical methods. Optimal wavelengths for both creatinine and uric acid were selected by using a combination of genetic algorithms (GAs), i.e., GA-partial least squares (GA-PLS) and interval partial least squares (iPLS). Results: The artificial neural network (ANN) sensitivity analysis determined the wavelengths of the UV band most suitable for estimating the concentration of creatinine and uric acid. The two optimal wavelengths were 242 and 252 nm for creatinine and 295 and 298 nm for uric acid. Conclusion: It can be concluded that the reduction ratio of creatinine and uric acid (dialysis efficiency) could be continuously monitored during hemodialysis by UV spectroscopy. Compared to the conventional method, which is particularly sensitive to the sampling technique and involves post-dialysis blood sampling, iterative measurements throughout the dialysis session can yield more reliable data.

  5. MURMoT: Design and Application of Microbial Uranium Reduction Monitoring Tools

    Energy Technology Data Exchange (ETDEWEB)

    Pennell, Kurt [Tufts Univ., Medford, MA (United States)

    2014-12-31

    The overarching project goal of the MURMoT project was the design of tools to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-transforming bacteria. To accomplish these objectives, an integrated approach that combined nucleic acid-based tools, proteomic workflows, uranium isotope measurements, and U(IV) speciation and structure analyses using the Advanced Photon Source (APS) at Argonne National Laboratory was developed.

  6. Estimating costs of traffic crashes and crime: tools for informed decision making.

    Science.gov (United States)

    Streff, F M; Molnar, L J; Cohen, M A; Miller, T R; Rossman, S B

    1992-01-01

    Traffic crashes and crime both impose significant economic and social burdens through injury and loss of life, as well as property damage and loss. Efforts to reduce crashes and crime often result in competing demands on limited public resources. Comparable and up-to-date cost data on crashes and crime contribute to informed decisions about allocation of these resources in important ways. As a first step, cost data provide information about the magnitude of the problems of crashes and crime by allowing us to estimate associated dollar losses to society. More importantly, cost data on crashes and crime are essential to evaluating costs and benefits of various policy alternatives that compete for resources. This paper presents the first comparable comprehensive cost estimates for crashes and crime and applies them to crash and crime incidence data for Michigan to generate dollar losses for the state. An example illustrates how cost estimates can be used to evaluate costs and benefits of crash-reduction and crime-reduction policies in making resource allocation decisions. Traffic crash and selected index crime incidence data from the calendar year 1988 were obtained from the Michigan State Police. Costs for crashes and index crimes were generated and applied to incidence data to estimate dollar losses from crashes and index crimes for the state of Michigan. In 1988, index crimes in Michigan resulted in $0.8 billion in monetary costs and $2.4 billion in total monetary and nonmonetary quality-of-life costs (using the willingness-to-pay approach). Traffic crashes in Michigan resulted in $2.3 billion in monetary costs and $7.1 billion in total monetary and nonmonetary quality-of-life costs, nearly three times the costs of index crimes. Based on dollar losses to the state, the magnitude of the problem of traffic crashes clearly exceeded that of index crimes in Michigan in 1988. From a policy perspective, summing the total dollar losses from crashes or crime is of less

  7. p3d: a general data-reduction tool for fiber-fed integral-field spectrographs

    CERN Document Server

    Sandin, C; Roth, M M; Gerssen, J; Monreal-Ibero, A; Böhm, P; Weilbacher, P

    2010-01-01

    The reduction of integral-field spectrograph (IFS) data is demanding work. Many repetitive operations are required in order to convert raw data into, typically a large number of, spectra. This effort can be markedly simplified through the use of a tool or pipeline, which is designed to complete many of the repetitive operations without human interaction. Here we present our semi-automatic data-reduction tool p3d that is designed to be used with fiber-fed IFSs. Important components of p3d include a novel algorithm for automatic finding and tracing of spectra on the detector, and two methods of optimal spectrum extraction in addition to standard aperture extraction. p3d also provides tools to combine several images, perform wavelength calibration and flat field data. p3d is at the moment configured for four IFSs. In order to evaluate its performance we have tested the different components of the tool. For these tests we used both simulated and observational data. We demonstrate that for three of the IFSs a corr...

  8. The optimal state estimation method. A tool to integrate full scale shock trial measurement data and numerical models

    NARCIS (Netherlands)

    Trouwborst, W.; Costanzo, F.A.

    1999-01-01

    In the joint US-NL research program DYCOSS (an acronym for Dynamic Behavior of Composite Ship Structures), a data analysis tool has been developed. The tool contains a mathematical method called the Optimal State Estimation method (OSE) and a Graphical User Interface (GUI). The OSE-method utilizes a

  9. Estimated Pollution Reduction from Wind Farms in Oklahoma and Associated Economic and Human Health Benefits

    Directory of Open Access Journals (Sweden)

    J. Scott Greene

    2013-01-01

    Over the past few decades, there has been a recognition of the growing need for different forms of energy beyond fossil fuels. Since the latter half of the twentieth century, individuals, corporations, and governments have become increasingly aware of the effects of the emissions of carbon and other harmful pollutants on the environment. With this greater concern has come increasing activity to combat these harmful emissions by using alternative fuel sources to power homes, businesses, and cities. As can be seen from recent trends in their installed capacity, it is clear that renewable energy resources will continue to be more commonly used in the future. As renewable energy increases, a decrease in a range of harmful pollutants from the energy sector will also occur. This paper provides a case study to estimate the potential environmental and health benefits of an increased shift from fossil fuels to renewable fuels for electricity production in Oklahoma. Results illustrate and quantify the specific reduction that wind energy can and will have on air quality, as well as provide a quantification of the associated potential health benefits.
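
    The simplest form of the avoided-emissions estimate behind a study like this multiplies annual wind generation by the emission factors of the displaced fossil generation. The sketch below uses round illustrative values, not the Oklahoma figures from the paper.

```python
# Back-of-the-envelope avoided-emissions estimate: annual wind generation multiplied
# by assumed displaced-grid emission factors. All numbers are illustrative, not the
# study's data.
wind_capacity_mw = 3000.0
capacity_factor = 0.38
generation_mwh = wind_capacity_mw * capacity_factor * 8760.0   # hours per year

factors_kg_per_mwh = {"CO2": 600.0, "SO2": 1.2, "NOx": 0.9}    # assumed emission factors

print(f"annual generation ≈ {generation_mwh / 1e6:.2f} TWh")
for pollutant, ef in factors_kg_per_mwh.items():
    tonnes = generation_mwh * ef / 1000.0
    print(f"avoided {pollutant}: ≈ {tonnes:,.0f} tonnes per year")
```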

  10. Effective dysphonia detection using feature dimension reduction and kernel density estimation for patients with Parkinson's disease.

    Directory of Open Access Journals (Sweden)

    Shanshan Yang

    Detection of dysphonia is useful for monitoring the progression of phonatory impairment for patients with Parkinson's disease (PD), and also helps assess the disease severity. This paper describes the statistical pattern analysis methods used to study different vocal measurements of sustained phonations. The feature dimension reduction procedure was implemented by using the sequential forward selection (SFS) and kernel principal component analysis (KPCA) methods. Four selected vocal measures were projected by the KPCA onto the bivariate feature space, in which the class-conditional feature densities can be approximated with the nonparametric kernel density estimation technique. In the vocal pattern classification experiments, Fisher's linear discriminant analysis (FLDA) was applied to perform the linear classification of voice records for healthy control subjects and PD patients, and the maximum a posteriori (MAP) decision rule and support vector machine (SVM) with radial basis function kernels were employed for the nonlinear classification tasks. Based on the KPCA-mapped feature densities, the MAP classifier successfully distinguished 91.8% of voice records, with a sensitivity rate of 0.986, a specificity rate of 0.708, and an area under the receiver operating characteristic (ROC) curve of 0.94. The diagnostic performance provided by the MAP classifier was superior to those of the FLDA and SVM classifiers. In addition, the classification results indicated that gender is insensitive to dysphonia detection, and the sustained phonations of PD patients with minimal functional disability are more difficult to identify correctly.
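
    The classification pipeline described above (kernel PCA projection, class-conditional kernel density estimates, maximum a posteriori decision) can be sketched with scikit-learn on synthetic two-class data; the data and hyperparameters below are illustrative and unrelated to the dysphonia measurements.

```python
# Sketch of a KPCA + class-conditional KDE + MAP pipeline on synthetic data.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(1.2, 1, (100, 4))])
y = np.array([0] * 100 + [1] * 100)        # 0 = control, 1 = patient

# Project the selected features onto a bivariate space with kernel PCA.
Z = KernelPCA(n_components=2, kernel="rbf", gamma=0.2).fit_transform(X)

# Class-conditional densities approximated with Gaussian kernel density estimation.
kde = {c: KernelDensity(bandwidth=0.4).fit(Z[y == c]) for c in (0, 1)}
prior = {c: np.mean(y == c) for c in (0, 1)}

# MAP decision: pick the class maximising prior * density (log scale).
log_post = np.column_stack([np.log(prior[c]) + kde[c].score_samples(Z) for c in (0, 1)])
y_hat = log_post.argmax(axis=1)
print(f"training accuracy of the MAP rule: {np.mean(y_hat == y):.2f}")
```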

  11. A Unified tool to estimate Distances, Ages, and Masses (UniDAM) from spectrophotometric data

    Science.gov (United States)

    Mints, Alexey; Hekker, Saskia

    2017-08-01

    Context. Galactic archaeology, the study of the formation and evolution of the Milky Way by reconstructing its past from its current constituents, requires precise and accurate knowledge of stellar parameters for as many stars as possible. To achieve this, a number of large spectroscopic surveys have been undertaken and are still ongoing. Aims: So far, consortia carrying out the different spectroscopic surveys have used different tools to determine stellar parameters of stars from their derived effective temperatures (Teff), surface gravities (log g), and metallicities ([Fe/H]); the parameters can be combined with photometric, astrometric, interferometric, or asteroseismic information. Here we aim to homogenise the stellar characterisation by applying a unified tool to a large set of publicly available spectrophotometric data. Methods: We used spectroscopic data from a variety of large surveys combined with infrared photometry from 2MASS and AllWISE and compared these in a Bayesian manner with PARSEC isochrones to derive probability density functions (PDFs) for stellar masses, ages, and distances. We treated PDFs of pre-helium-core burning, helium-core burning, and post-helium-core burning solutions as well as different peaks in multimodal PDFs (i.e. each unimodal sub-PDF) of the different evolutionary phases separately. Results: For over 2.5 million stars we report mass, age, and distance estimates for each evolutionary phase and unimodal sub-PDF. We report Gaussian, skewed Gaussian, truncated Gaussian, modified truncated exponential, or truncated Student's t-distribution functions to represent each sub-PDF, allowing us to reconstruct detailed PDFs. Comparisons with stellar parameter estimates from the literature show good agreement within uncertainties. Conclusions: We present UniDAM, the unified tool applicable to spectrophotometric data of different surveys, to obtain a homogenised set of stellar parameters. The unified tool and the tables with

  12. Randomized Comparison of Mobile and Web-Tools to Provide Dementia Risk Reduction Education: Use, Engagement and Participant Satisfaction

    Science.gov (United States)

    O'Connor, Elodie; Hatherly, Chris

    2014-01-01

    Background Encouraging middle-aged adults to maintain their physical and cognitive health may have a significant impact on reducing the prevalence of dementia in the future. Mobile phone apps and interactive websites may be one effective way to target this age group. However, to date there has been little research investigating the user experience of dementia risk reduction tools delivered in this way. Objective The aim of this study was to explore participant engagement and evaluations of three different targeted smartphone and Web-based dementia risk reduction tools following a four-week intervention. Methods Participants completed a Web-based screening questionnaire to collect eligibility information. Eligible participants were asked to complete a Web-based baseline questionnaire and were then randomly assigned to use one of the three dementia risk reduction tools for a period of four weeks: (1) a mobile phone application; (2) an information-based website; and (3) an interactive website. User evaluations were obtained via a Web-based follow-up questionnaire after completion of the intervention. Results Of 415 eligible participants, 370 (89.16%) completed the baseline questionnaire and were assigned to an intervention group; 200 (54.05%) completed the post-intervention questionnaire. The average age of participants was 52 years, and 149 (75%) were female. Findings indicated that participants from all three intervention groups reported a generally positive impression of the tools across a range of domains. Participants using the information-based website reported higher ratings of their overall impression of the tool, F(2,191) = 4.12, P = .02; how interesting the information was, F(2,189) = 3.53, P = .03; how helpful the information was, F(2,192) = 4.15, P = .02; and how much they learned, F(2,188) = 3.86, P = .02. Group differences were significant between the mobile phone app and information-based website users, but not between the interactive website users and the other two groups.

  13. Twitter as a Potential Disaster Risk Reduction Tool. Part IV: Competency-based Education and Training Guidelines to Promote Community Resiliency.

    Science.gov (United States)

    Yeager, Violet; Cooper, Guy Paul; Burkle, Frederick M; Subbarao, Italo

    2015-01-01

    Twitter can be an effective tool for disaster risk reduction but gaps in education and training exist in current public health and disaster management educational competency standards.  Eleven core public health and disaster management competencies are proposed that incorporate Twitter as a tool for effective disaster risk reduction.  Greater funding is required to promote the education and training of this tool for those in professional schools and in the current public health and disaster management workforce.

  14. An innovative multivariate tool for fuel consumption and costs estimation of agricultural operations

    Directory of Open Access Journals (Sweden)

    Mirko Guerrieri

    2016-12-01

    The estimation of operating costs of agricultural and forestry machinery is a key factor in both planning agricultural policies and farm management. Few works have tried to estimate operating costs, and the models produced are normally based on deterministic approaches. Conversely, in a statistical model randomness is present and variable states are not described by unique values, but rather by probability distributions. In this study, for the first time, a multivariate statistical model based on Partial Least Squares (PLS) was adopted to predict the fuel consumption and costs of six agricultural operations: ploughing, harrowing, fertilization, sowing, weed control and shredding. The prediction was conducted in two steps: first, a few initially selected parameters (time per surface-area unit, maximum engine power, purchase price of the tractor and purchase price of the operating machinery) were used to estimate the fuel consumption; then the predicted fuel consumption, together with the initial parameters, was used to estimate the operational costs. Since the obtained models were based on a very heterogeneous input dataset, they proved to be extremely efficient and therefore generalizable and robust. In detail, the results show prediction values in the test with r always ≥ 0.91. Thus, the approach may prove extremely useful for both farmers (in terms of economic advantages) and at the institutional level (representing an innovative and efficient tool for planning future Rural Development Programmes and the Common Agricultural Policy). In light of these advantages, the proposed approach may also be implemented on a web platform and made available to all the stakeholders.
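
    The two-step idea (predict fuel consumption from a few machine parameters, then predict cost from those parameters plus the predicted consumption) can be sketched with scikit-learn's PLS regression. The synthetic data, variable names and coefficients below are illustrative, not the paper's.

```python
# Two-step PLS sketch: step 1 predicts fuel consumption from machine/operation
# parameters, step 2 predicts cost from the parameters plus predicted consumption.
# All data are synthetic and the coefficients are invented for illustration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(42)
n = 150
# time per ha, max engine power, tractor price, implement price (arbitrary scales)
X = rng.uniform([0.5, 50, 30, 5], [3.0, 250, 200, 60], size=(n, 4))
fuel = 2.0 * X[:, 0] * (0.1 * X[:, 1]) + rng.normal(0, 2, n)                 # L/ha, synthetic
cost = 1.2 * fuel + 0.02 * X[:, 2] + 0.05 * X[:, 3] + rng.normal(0, 3, n)    # EUR/ha, synthetic

pls_fuel = PLSRegression(n_components=3).fit(X, fuel)
fuel_hat = pls_fuel.predict(X).ravel()

X2 = np.column_stack([X, fuel_hat])
pls_cost = PLSRegression(n_components=3).fit(X2, cost)

r_fuel = np.corrcoef(fuel, fuel_hat)[0, 1]
r_cost = np.corrcoef(cost, pls_cost.predict(X2).ravel())[0, 1]
print(f"r(fuel) = {r_fuel:.2f}, r(cost) = {r_cost:.2f}")
```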

  15. Single Tree Vegetation Depth Estimation Tool for Satellite Services Link Design

    Directory of Open Access Journals (Sweden)

    Z. Hasirci

    2016-04-01

    Attenuation caused by tree shadowing is an important factor for describing the propagation channel of satellite services. Thus, vegetation effects should be determined by experimental studies or empirical formulations. In this study, tree types in the Black Sea Region of Turkey are classified based on their geometrical shapes into four groups: conic, ellipsoid, spherical and hemispherical. The variations of the vegetation depth according to different tree shapes are calculated with the ray tracing method. It is shown that different geometrical shapes have different vegetation depths, even if they have the same foliage volume, for different elevation angles. The proposed method is validated against the related literature in terms of average single tree attenuation. On the other hand, in order to reduce the system requirements (speed, memory usage, etc.) of the ray tracing method, an artificial neural network is proposed as an alternative. A graphical user interface for the above processes is created in the MATLAB environment, named the vegetation depth estimation tool (VdET).

  16. Using FIESTA , an R-based tool for analysts, to look at temporal trends in forest estimates

    Science.gov (United States)

    Tracey S. Frescino; Paul L. Patterson; Elizabeth A. Freeman; Gretchen G. Moisen

    2012-01-01

    FIESTA (Forest Inventory Estimation for Analysis) is a user-friendly R package that supports the production of estimates for forest resources based on procedures from Bechtold and Patterson (2005). The package produces output consistent with current tools available for the Forest Inventory and Analysis National Program, such as FIDO (Forest Inventory Data Online) and...

  17. Reduction of potassium content of green bean pods and chard by culinary processing. Tools for chronic kidney disease.

    Science.gov (United States)

    Martínez-Pineda, Montserrat; Yagüe-Ruiz, Cristina; Caverni-Muñoz, Alberto; Vercet-Tormo, Antonio

    2016-01-01

    In order to prevent possible hyperkalemia, chronic renal patients, especially in advanced stages, must follow a low-potassium diet. Dietary guidelines for chronic kidney disease therefore recommend limiting the consumption of many vegetables, as well as applying laborious culinary techniques to maximize the reduction of potassium. The aim of this work is to analyze the potassium content of several vegetables - fresh, frozen and preserved products - as well as to check and compare the effectiveness in potassium reduction of different culinary processes, some of them recommended in dietary guidelines, such as soaking or double cooking. Sample potassium content was analyzed in triplicate using flame photometry. The results showed significant reductions in potassium content in all culinary processes studied. The degree of loss varied depending on the type of vegetable and the processing applied. Frozen products achieved greater reductions than the fresh ones, obtaining in some cases losses greater than 90%. In addition, it was observed that in many cases the single application of normal cooking reduced potassium to levels acceptable for inclusion in the renal patient's diet. The results shown in this study are very positive because they provide tools for professionals who deal with this kind of patients. They allow them to adapt more easily to the needs and preferences of their patients and increase dietary variety. Copyright © 2016 Sociedad Española de Nefrología. Published by Elsevier España, S.L.U. All rights reserved.

  18. Evaluation of a metal artefact reduction tool on different positions of a metal object in the FOV.

    Science.gov (United States)

    Queiroz, Polyane M; Santaella, Gustavo M; da Paz, Thais D J; Freitas, Deborah Q

    2017-03-01

    To evaluate the action of a metal artefact reduction (MAR) tool when the artefact-generating metal object is at different positions in the field of view (FOV). A cylindrical utility wax phantom, with a metal alloy sample inside, was made. The phantom was positioned centrally and peripherally in the FOV for image acquisition, with and without the MAR tool activated. The standard deviation values (image noise levels) from areas around the metal sample and from the control area were obtained. The values were compared by Student's t-test (α = 0.05). When the tool was activated, a significant difference in image noise was observed for central and peripheral positioning, for both the control area (p = 0.0012) and the metal area (p = 0.03), and a lower level of noise was observed for images with the phantom in the central position. A decrease in image noise with the tool activated was found only in phantoms with the metal object positioned centrally in the FOV. For the MAR tool to be effective, the artefact-generating object needs to be in the central region of the FOV.
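
    The statistical comparison described above is a two-sample Student's t-test on noise (standard deviation) values from matched regions of interest. The sketch below uses invented noise values to show the computation at α = 0.05.

```python
# Two-sample t-test on ROI noise levels with and without the MAR tool.
# The noise values are invented examples, not the study's measurements.
import numpy as np
from scipy import stats

noise_mar_off = np.array([58.1, 61.4, 63.0, 59.7, 60.2, 62.5])   # ROI std. dev., MAR off
noise_mar_on = np.array([41.3, 44.0, 42.7, 45.1, 43.6, 42.2])    # ROI std. dev., MAR on

t, p = stats.ttest_ind(noise_mar_off, noise_mar_on)
verdict = "significant" if p < 0.05 else "not significant"
print(f"t = {t:.2f}, p = {p:.4f} -> {verdict} at alpha = 0.05")
```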

  19. Competitive kinetics as a tool to determine rate constants for reduction of ferrylmyoglobin by food components

    DEFF Research Database (Denmark)

    Jongberg, Sisse; Lund, Marianne Nissen; Pattison, David I.

    2016-01-01

    This approach allows determination of apparent rate constants for the oxidation of proteins by haem proteins of relevance to food oxidation and should be applicable to other systems. A similar approach has provided approximate apparent rate constants for the reduction of MbFe(IV)=O by catechin and green tea...

  20. PEET: a Matlab tool for estimating physical gate errors in quantum information processing systems

    Science.gov (United States)

    Hocker, David; Kosut, Robert; Rabitz, Herschel

    2016-09-01

    A Physical Error Estimation Tool (PEET) is introduced in Matlab for predicting physical gate errors of quantum information processing (QIP) operations by constructing and then simulating gate sequences for a wide variety of user-defined, Hamiltonian-based physical systems. PEET is designed to accommodate the interdisciplinary needs of quantum computing design by assessing gate performance for users familiar with the underlying physics of QIP, as well as those interested in higher-level computing operations. The structure of PEET separates the bulk of the physical details of a system into Gate objects, while the construction of quantum computing gate operations are contained in GateSequence objects. Gate errors are estimated by Monte Carlo sampling of noisy gate operations. The main utility of PEET, though, is the implementation of QuantumControl methods that act to generate and then test gate sequence and pulse-shaping techniques for QIP performance. This work details the structure of PEET and gives instructive examples for its operation.

  1. SBMLSimulator: A Java Tool for Model Simulation and Parameter Estimation in Systems Biology

    Directory of Open Access Journals (Sweden)

    Alexander Dörr

    2014-12-01

    The identification of suitable model parameters for biochemical reactions has been recognized as a quite difficult endeavor. Parameter values from literature or experiments can often not directly be combined in complex reaction systems. Nature-inspired optimization techniques can find appropriate sets of parameters that calibrate a model to experimentally obtained time series data. We present SBMLsimulator, a tool that combines the Systems Biology Simulation Core Library for dynamic simulation of biochemical models with the heuristic optimization framework EvA2. SBMLsimulator provides an intuitive graphical user interface with various options as well as a fully-featured command-line interface for large-scale and script-based model simulation and calibration. In a parameter estimation study based on a published model and artificial data we demonstrate the capability of SBMLsimulator to identify parameters. SBMLsimulator is useful for both the interactive simulation and exploration of the parameter space and the large-scale model calibration and estimation of uncertain parameter values.

  2. Application of FTIR spectroscopy for traumatic axonal injury: a possible tool for estimating injury interval

    Science.gov (United States)

    Zhang, Ji; Huang, Ping

    2017-01-01

    Traumatic axonal injury (TAI) is a progressive and secondary injury following traumatic brain injury (TBI). Despite extensive investigations in the field of forensic science and neurology, no effective methods are available to estimate TAI interval between injury and death. In the present study, Fourier transform IR (FTIR) spectroscopy with IR microscopy was applied to collect IR spectra in the corpus callosum (CC) of rats subjected to TAI at 12, 24, and 72 h post-injury compared with control animals. The classification amongst different groups was visualized based on the acquired dataset using hierarchical cluster analysis (HCA) and partial least square (PLS). Furthermore, the established PLS models were used to predict injury interval of TAI in the unknown sample dataset. The results showed that samples at different time points post-injury were distinguishable from each other, and biochemical changes in protein, lipid, and carbohydrate contributed to the differences. Then, the established PLS models provided a satisfactory prediction of injury periods between different sample groups in the external validation. The present study demonstrated the great potential of FTIR-based PLS algorithm as an objective tool for estimating injury intervals of TAI in the field of forensic science and neurology. PMID:28659494

  3. Binomial Distribution Sample Confidence Intervals Estimation 7. Absolute Risk Reduction and ARR-like Expressions

    Directory of Open Access Journals (Sweden)

    Andrei ACHIMAŞ CADARIU

    2004-08-01

    Assessments of a controlled clinical trial require the interpretation of some key parameters such as the control event rate, experimental event rate, relative risk, absolute risk reduction, relative risk reduction, and number needed to treat when the effect of the treatment is a dichotomous variable. Defined as the difference in the event rate between treatment and control groups, the absolute risk reduction is the parameter that allows computing the number needed to treat. The absolute risk reduction is computed when the experimental treatment reduces the risk for an undesirable outcome/event. In the medical literature, when the absolute risk reduction is reported with its confidence intervals, the method used is the asymptotic one, even if it is well known that it may be inadequate. The aim of this paper is to introduce and assess nine methods of computing confidence intervals for the absolute risk reduction and absolute risk reduction-like functions. Computer implementations of the methods use the PHP language. Methods comparison uses the experimental errors, the standard deviations, and the deviation relative to the imposed significance level for specified sample sizes. Six methods of computing confidence intervals for the absolute risk reduction and absolute risk reduction-like functions were assessed using random binomial variables and random sample sizes. The experiments show that the ADAC and ADAC1 methods obtain the best overall performance in computing confidence intervals for the absolute risk reduction.
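
    For reference, the quantity in question and its asymptotic (Wald) interval can be computed as follows; the event counts are invented, and the paper's ADAC/ADAC1 formulas are not reproduced here.

```python
# Absolute risk reduction (ARR), number needed to treat (NNT), and the asymptotic
# (Wald) 95% confidence interval. Event counts are invented examples.
import math

events_control, n_control = 30, 120       # undesirable outcomes in the control arm
events_treat, n_treat = 18, 118           # undesirable outcomes in the treatment arm

p_c = events_control / n_control
p_t = events_treat / n_treat
arr = p_c - p_t                            # absolute risk reduction
nnt = 1.0 / arr                            # number needed to treat

se = math.sqrt(p_c * (1 - p_c) / n_control + p_t * (1 - p_t) / n_treat)
z = 1.96                                   # 95% confidence level
lo, hi = arr - z * se, arr + z * se
print(f"ARR = {arr:.3f} (95% asymptotic CI {lo:.3f} to {hi:.3f}), NNT ≈ {nnt:.1f}")
```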

  4. A Useful Tool for Atmospheric Correction and Surface Temperature Estimation of Landsat Infrared Thermal Data

    Science.gov (United States)

    Rivalland, Vincent; Tardy, Benjamin; Huc, Mireille; Hagolle, Olivier; Marcq, Sébastien; Boulet, Gilles

    2016-04-01

    Land Surface Temperature (LST) is a critical variable for studying the energy and water budgets at the Earth surface, and is a key component of many aspects of climate research and services. The Landsat program jointly carried out by NASA and USGS has been providing thermal infrared data for 40 years, but no associated LST product has yet been routinely proposed to the community. To derive LST values, radiances measured at sensor level need to be corrected for the atmospheric absorption, the atmospheric emission and the surface emissivity effect. Until now, existing LST products have been generated with multi-channel methods such as the Temperature/Emissivity Separation (TES) adapted to ASTER data or the generalized split-window algorithm adapted to MODIS multispectral data. Those approaches are ill-adapted to the mono-window specificity of Landsat data. The atmospheric correction methodology usually used for Landsat data requires detailed information about the state of the atmosphere. This information may be obtained from radio-sounding or model atmospheric reanalysis and is supplied to a radiative transfer model in order to estimate atmospheric parameters for a given coordinate. In this work, we present a new automatic tool dedicated to Landsat thermal data correction which improves the common atmospheric correction methodology by introducing the spatial dimension into the process. The Python tool developed during this study, named LANDARTs for LANDsat Automatic Retrieval of surface Temperature, is fully automatic and provides atmospheric corrections for a whole Landsat tile. Vertical atmospheric conditions are downloaded from the ERA-Interim dataset of the ECMWF meteorological organization, which provides them at 0.125 degrees resolution, at a global scale and with a 6-hour time step. The atmospheric correction parameters are estimated on the atmospheric grid using the commercial software MODTRAN, then interpolated to 30 m resolution. We detail the processing steps

  5. A Software Tool for Atmospheric Correction and Surface Temperature Estimation of Landsat Infrared Thermal Data

    Directory of Open Access Journals (Sweden)

    Benjamin Tardy

    2016-08-01

    Full Text Available Land surface temperature (LST) is an important variable involved in the Earth’s surface energy and water budgets and a key component in many aspects of environmental research. The Landsat program, jointly carried out by NASA and the USGS, has been recording thermal infrared data for the past 40 years. Nevertheless, LST data products for Landsat remain unavailable. The atmospheric correction (AC) method commonly used for mono-window Landsat thermal data requires detailed information concerning the vertical structure (temperature, pressure) and the composition (water vapor, ozone) of the atmosphere. For a given coordinate, this information is generally obtained through either radio-sounding or atmospheric model simulations and is passed to the radiative transfer model (RTM) to estimate the local atmospheric correction parameters. Although this approach yields accurate LST data, results are relevant only near this given coordinate. To meet the scientific community’s demand for high-resolution LST maps, we developed a new software tool dedicated to processing Landsat thermal data. The proposed tool improves on the commonly-used AC algorithm by incorporating spatial variations occurring in the Earth’s atmosphere composition. The ERA-Interim dataset (ECMWF meteorological organization) was used to retrieve vertical atmospheric conditions, which are available at a global scale with a resolution of 0.125 degrees and a temporal resolution of 6 h. A temporal and spatial linear interpolation of meteorological variables was performed to match the acquisition dates and coordinates of the Landsat images. The atmospheric correction parameters were then estimated on the basis of this reconstructed atmospheric grid using the commercial RTM software MODTRAN. The needed surface emissivity was derived from the common vegetation index NDVI, obtained from the red and near-infrared (NIR) bands of the same Landsat image. This permitted an estimation of LST for the entire
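
    As a rough illustration of the NDVI-based emissivity step mentioned above, the sketch below uses the common NDVI-threshold method; the threshold and emissivity values are typical literature numbers, not those used by the tool.

```python
import numpy as np

def emissivity_from_ndvi(red, nir, ndvi_soil=0.2, ndvi_veg=0.5,
                         eps_soil=0.97, eps_veg=0.99):
    """NDVI-threshold emissivity: the fractional vegetation cover Pv drives a
    linear mix of assumed soil and vegetation emissivities."""
    ndvi = (nir - red) / (nir + red + 1e-12)
    pv = np.clip((ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil), 0.0, 1.0) ** 2
    return eps_soil * (1.0 - pv) + eps_veg * pv
```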

  6. Fast variance reduction for steady-state simulation and sensitivity analysis of stochastic chemical systems using shadow function estimators

    Science.gov (United States)

    Milias-Argeitis, Andreas; Lygeros, John; Khammash, Mustafa

    2014-07-01

    We address the problem of estimating steady-state quantities associated with systems of stochastic chemical kinetics. In most cases of interest, these systems are analytically intractable, and one has to resort to computational methods to estimate stationary values of cost functions. In this work, we introduce a novel variance reduction algorithm for stochastic chemical kinetics, inspired by related methods in queueing theory, in particular the use of shadow functions. Using two numerical examples, we demonstrate the efficiency of the method for the calculation of steady-state parametric sensitivities and evaluate its performance in comparison to other estimation methods.

  7. Estimating CO{sub 2} Emission Reduction of Non-capture CO{sub 2} Utilization (NCCU) Technology

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ji Hyun; Lee, Dong Woog; Gyu, Jang Se; Kwak, No-Sang; Lee, In Young; Jang, Kyung Ryoung; Shim, Jae-Goo [KEPCO Research Institute, Daejon (Korea, Republic of); Choi, Jong Shin [Korea East-West Power Co., LTD(ETP), Ulsan (Korea, Republic of)

    2015-10-15

    The potential for CO{sub 2} emission reduction of non-capture CO{sub 2} utilization (NCCU) technology was evaluated. NCCU is a technology that produces sodium bicarbonate through the carbonation reaction of CO{sub 2} contained in flue gas. To estimate the CO{sub 2} emission reduction, a process simulation using the process simulator PRO/II was performed for a chemical plant able to handle 100 tons of CO{sub 2} per day. To estimate the indirect CO{sub 2} reduction, the Solvay process, the conventional technology for producing sodium carbonate/sodium bicarbonate, was also studied. The analysis showed that, for the Solvay process, the overall CO{sub 2} emission was estimated at 48,862 tons per year based on the energy consumption for NaHCO{sub 3} production (7.4 GJ/tNaHCO{sub 3}). For the NCCU technology, the direct CO{sub 2} reduction through CO{sub 2} carbonation was estimated at 36,500 tons per year and the indirect CO{sub 2} reduction through lower energy consumption at 46,885 tons per year, giving 83,385 tons per year in total. From these results, it could be concluded that sodium bicarbonate production through the carbonation reaction of CO{sub 2} contained in flue gas is energy efficient and could be one of the promising low-CO{sub 2}-emission technologies.
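
    The headline figures quoted above follow from simple bookkeeping, reproduced here for clarity (all values in tons of CO2 per year, taken directly from the abstract):

```python
direct = 36_500            # avoided by carbonating flue-gas CO2
indirect = 46_885          # avoided through lower energy use than the Solvay route
solvay_baseline = 48_862   # emission attributed to the conventional Solvay process

print(direct + indirect)   # 83,385 t/y total reduction credited to NCCU
```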

  8. Validation of a Mexican food photograph album as a tool to visually estimate food amounts in adolescents.

    Science.gov (United States)

    Bernal-Orozco, M Fernanda; Vizmanos-Lamotte, Barbara; Rodríguez-Rocha, Norma P; Macedo-Ojeda, Gabriela; Orozco-Valerio, María; Rovillé-Sausse, Françoise; León-Estrada, Sandra; Márquez-Sandoval, Fabiola; Fernández-Ballart, Joan D

    2013-03-14

    The aim of the present study was to validate a food photograph album (FPA) as a tool to visually estimate food amounts, and to compare this estimation with that attained through the use of measuring cups (MC) and food models (FM). We tested 163 foods over fifteen sessions (thirty subjects/session; 10-12 foods presented in two portion sizes, 20-24 plates/session). In each session, subjects estimated food amounts with the assistance of FPA, MC and FM. We compared (by portion and method) the mean estimated weight and the mean real weight. We also compared the percentage error estimation for each portion, and the mean food percentage error estimation between methods. In addition, we determined the percentage error estimation of each method. We included 463 adolescents from three public high schools (mean age 17·1 (sd 1·2) years, 61·8 % females). All foods were assessed using FPA, 53·4 % of foods were assessed using MC, and FM was used for 18·4 % of foods. The mean estimated weight with all methods was statistically different compared with the mean real weight for almost all foods. However, a lower percentage error estimation was observed using FPA (2·3 v. 56·9 % for MC and 325 % for FM). Across foods, comparisons between methods showed FPA to be the most accurate tool for estimating food amounts.
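
    The percentage error estimation referred to above is simply the signed deviation of the visually estimated weight from the weighed (real) portion; a one-line sketch with illustrative numbers:

```python
def percent_error(estimated_g, real_g):
    """Signed percentage error of an estimated portion weight against the real weight."""
    return 100.0 * (estimated_g - real_g) / real_g

print(percent_error(110, 100))   # a 110 g estimate of a 100 g portion -> +10 %
```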

  9. ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures

    Science.gov (United States)

    Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2008-08-01

    This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), including both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectra analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to the Lines database and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows, and to deploy and run them on the ESAC Grid. RISA makes full use of the interoperability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can be used directly by general VO tools.

  10. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    Science.gov (United States)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  11. A systematic review of food composition tools used for determining dietary polyphenol intake in estimated intake studies.

    Science.gov (United States)

    Probst, Yasmine; Guan, Vivienne; Kent, Katherine

    2018-01-01

    Translating food intake data into phytochemical outcomes is a crucial step in investigating potential health benefits. The aim of this review was to examine the tools used for determining dietary-derived polyphenol intakes in estimated intake studies. Published studies from 2004 to 2014 reporting polyphenol food composition information were sourced, with 157 studies included. Six polyphenol subclasses were identified. One quarter of the studies (n=39) reported total flavonoid intake, with 27% reporting individual flavonoid compounds. Assessing multiple subclasses was common, with approximately 10% of studies each assessing seven (n=13), six (n=12) and five (n=14) polyphenol subclasses. There was no pattern between the reported flavonoid compounds and the subclasses studied. Approximately 60% of studies relied on publicly accessible food composition data to estimate dietary polyphenol intake, with 33% using two or more tools. This review highlights the importance of publicly accessible composition databases for estimating polyphenol intake and provides a reference for tools available globally. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Organohalide Respiring Bacteria and Reductive Dehalogenases: Key Tools in Organohalide Bioremediation

    Directory of Open Access Journals (Sweden)

    Bat-Erdene eJugder

    2016-03-01

    Full Text Available Organohalides are recalcitrant pollutants that have been responsible for substantial contamination of soils and groundwater. Organohalide-respiring bacteria (ORB) provide a potential solution to remediate contaminated sites, through their ability to use organohalides as terminal electron acceptors to yield energy for growth (i.e., organohalide respiration). Ideally, this process results in non- or lesser-halogenated compounds that are mostly less toxic to the environment or more easily degraded. At the heart of these processes are reductive dehalogenases (RDases), which are membrane-bound enzymes coupled with other components that facilitate dehalogenation of organohalides to generate cellular energy. This review focuses on RDases, concentrating on those which have been purified (partially or wholly) and functionally characterized. Further, the paper reviews the major bacteria involved in organohalide breakdown and the evidence for microbial evolution of RDases. Finally, the potential for using ORB in bioremediation and bioaugmentation is discussed.

  13. Continuous improvement process and waste reduction through a QFD tool: the case of a metallurgic plant

    Directory of Open Access Journals (Sweden)

    Leoni Pentiado Godoy

    2013-05-01

    Full Text Available This paper proposes the use of QFD for the continuous improvement of production processes and for waste reduction actions. To collect the information, we used simple observation and a questionnaire with closed questions applied to employees representing 88.75% of the population working in the production processes of a metal-mechanic industry located in Rio Grande do Sul. QFD is an effective quality planning method because it provides a diagnosis that underpins the definition of improvement actions aimed at combating waste. Actions were defined to improve communication between sectors, enabling the delivery of products with specifications that meet customer requirements, on time and in the right amounts, at minimum cost and with the satisfaction of those involved with the company. The implementation of these actions reduces waste, minimizes extra work, maximizes effective labor and increases profitability.

  14. Aircraft parameter estimation — A tool for development of aerodynamic databases

    Indian Academy of Sciences (India)

    R V Jategaonkar; F Thielecke

    2000-04-01

    With the evolution of high performance modern aircraft and spiraling developmental and experimental costs, the importance of flight validated databases for flight control design applications and for flight simulators has increased significantly in the recent past. Ground-based and in-flight simulators are increasingly used not only for pilot training but also for other applications such as flight planning, envelope expansion, design and analysis of control laws, and handling qualities investigations. Most of these demand a high-fidelity aerodynamic database representing the flight vehicle. System identification methodology, evolved over the past three decades, provides a powerful and sophisticated tool to identify from flight data aerodynamic characteristics valid over the entire operational flight envelope. This paper briefly presents aircraft parameter estimation methods for both stable and unstable aircraft, highlighting the developmental work at the DLR Institute of Flight Mechanics. Various aspects of database identification and its validation are presented. Practical aspects like the proper choice of integration and optimization methods as well as limitations of gradient approximation through finite-differences are brought out. Though the paper focuses on application of system identification methods to flight vehicles, its use in other applications, like the modelling of inelastic deformations of metallic materials, is also presented. It is shown that there are many similar problems and several challenges requiring additional concepts and algorithms.

  15. Developing a tool to estimate water withdrawal and consumption in electricity generation in the United States.

    Energy Technology Data Exchange (ETDEWEB)

    Wu, M.; Peng, J. (Energy Systems); ( NE)

    2011-02-24

    Freshwater consumption for electricity generation is projected to increase dramatically in the next couple of decades in the United States. The increased demand is likely to further strain freshwater resources in regions where water has already become scarce. Meanwhile, the automotive industry has stepped up its research, development, and deployment efforts on electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs). Large-scale, escalated production of EVs and PHEVs nationwide would require increased electricity production, and so meeting the water demand becomes an even greater challenge. The goal of this study is to provide a baseline assessment of freshwater use in electricity generation in the United States and at the state level. Freshwater withdrawal and consumption requirements for power generated from fossil, nonfossil, and renewable sources via various technologies and by use of different cooling systems are examined. A data inventory has been developed that compiles data from government statistics, reports, and literature issued by major research institutes. A spreadsheet-based model has been developed to conduct the estimates by means of a transparent and interactive process. The model further allows us to project future water withdrawal and consumption in electricity production under the forecasted increases in demand. This tool is intended to provide decision makers with the means to make a quick comparison among various fuel, technology, and cooling system options. The model output can be used to address water resource sustainability when considering new projects or expansion of existing plants.
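
    A spreadsheet model of this kind essentially multiplies generation by fuel- and cooling-specific water factors. The sketch below shows that structure only; the factors and category names are hypothetical placeholders, not values from the report's inventory.

```python
# Hypothetical (gallons per MWh) withdrawal/consumption factors, for illustration only.
FACTORS = {
    ("coal", "once-through"):  {"withdrawal": 27_000, "consumption": 140},
    ("coal", "recirculating"): {"withdrawal": 600,    "consumption": 480},
    ("ngcc", "recirculating"): {"withdrawal": 250,    "consumption": 180},
}

def water_use(generation_mwh, fuel, cooling):
    """Scale per-MWh factors by annual generation for one plant category."""
    return {k: generation_mwh * v for k, v in FACTORS[(fuel, cooling)].items()}

print(water_use(1_000_000, "coal", "recirculating"))
```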

  16. Ordinary kriging as a tool to estimate historical daily streamflow records

    Science.gov (United States)

    Farmer, William H.

    2016-07-01

    Efficient and responsible management of water resources relies on accurate streamflow records. However, many watersheds are ungaged, limiting the ability to assess and understand local hydrology. Several tools have been developed to alleviate this data scarcity, but few provide continuous daily streamflow records at individual streamgages within an entire region. Building on the history of hydrologic mapping, ordinary kriging was extended to predict daily streamflow time series on a regional basis. Pooling parameters to estimate a single, time-invariant characterization of spatial semivariance structure is shown to produce accurate reproduction of streamflow. This approach is contrasted with a time-varying series of variograms, representing the temporal evolution and behavior of the spatial semivariance structure. Furthermore, the ordinary kriging approach is shown to produce more accurate time series than more common, single-index hydrologic transfers. A comparison between topological kriging and ordinary kriging is less definitive, showing the ordinary kriging approach to be significantly inferior in terms of Nash-Sutcliffe model efficiencies while maintaining significantly superior performance measured by root mean squared errors. Given the similarity of performance and the computational efficiency of ordinary kriging, it is concluded that ordinary kriging is useful for first-order approximation of daily streamflow time series in ungaged watersheds.
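
    For readers unfamiliar with the method, a bare-bones ordinary kriging predictor is sketched below: it solves the standard kriging system built from a variogram of inter-gauge distances and returns a weighted combination of the gauged values. The exponential variogram and its parameters are illustrative assumptions, and the pooled, time-invariant semivariance fitting described above is not reproduced.

```python
import numpy as np

def exp_variogram(h, nugget=0.1, sill=1.0, rng=50.0):
    """Exponential variogram model gamma(h); parameters are placeholders."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xy_gaged, z, xy_target, vario=exp_variogram):
    """Predict the value at xy_target from observations z at gauge coordinates xy_gaged."""
    n = len(z)
    d = np.linalg.norm(xy_gaged[:, None, :] - xy_gaged[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = vario(d)
    np.fill_diagonal(A[:n, :n], 0.0)       # gamma(0) = 0
    A[n, n] = 0.0                          # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = vario(np.linalg.norm(xy_gaged - xy_target, axis=1))
    weights = np.linalg.solve(A, b)[:n]    # kriging weights (sum to 1)
    return float(weights @ z)

# Illustrative use: three gauges, one ungaged site (daily flows in m3/s)
gauges = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
flows = np.array([5.2, 7.8, 6.1])
print(ordinary_kriging(gauges, flows, np.array([4.0, 4.0])))
```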

  18. Tools used to estimate soil quality in coal combustion waste areas

    Directory of Open Access Journals (Sweden)

    FLAVIO M.R. DA SILVA JÚNIOR

    2014-06-01

    Full Text Available Soil is a highly complex environmental compartment that has suffered from contamination by substances of various origins. Among the main activities that affect soil quality are power generation activities that use fossil fuels, such as mineral coal. Environmental protection agencies encourage scientific investigations using tools described in legal instruments or standard protocols to evaluate the potential of coal as a pollutant, especially in places with large reserves of this mineral, such as the state of Rio Grande do Sul. The aim of this study was to characterize the leached extracts of different soils from an area influenced by coal waste, to classify them according to the guideline values for groundwater described in CONAMA Resolution n. 420/2009, and to estimate the effects of leachate ingestion on DNA mutation rates. The amount of soil needed to induce a 100% increase in the spontaneous mutation rate varied between 129.3 and 1544.1 mg among the soils studied. Metals such as Mn, Pb, Cd and Ni surpassed the investigation limits for groundwater in at least one soil sample. The results showed that soil contaminants can be transferred to groundwater and that soil ingestion in the area could contribute to increased mutagenic risk.

  19. Tool for the Reduction and Assessment of Chemical and Other Environmental Impacts (TRACI) TRACI version 2.1 User’s Guide

    Science.gov (United States)

    TRACI 2.1 (the Tool for the Reduction and Assessment of Chemical and other environmental Impacts) has been developed for sustainability metrics, life cycle impact assessment, industrial ecology, and process design impact assessment for developing increasingly sustainable products...

  20. Online machining error estimation method of numerical control gear grinding machine tool based on data analysis of internal sensors

    Science.gov (United States)

    Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin

    2016-12-01

    This paper presents an online method for estimating cutting error by analyzing internal sensor readings. The internal sensors of the numerical control (NC) machine tool are used to avoid installation problems. A mathematical estimation model of the cutting error is proposed to compute the relative position of the cutting point and the tool center point (TCP) from internal sensor readings, based on gear cutting theory. In order to verify the effectiveness of the proposed model, it was evaluated in simulations and experiments on a gear generation grinding process. The cutting error of the gear was estimated and the factors that induce cutting error were analyzed. The simulations and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the workpiece during the machining process.

  1. Bacteriophage removal efficiency as a validation and operational monitoring tool for virus reduction in wastewater reclamation: Review.

    Science.gov (United States)

    Amarasiri, Mohan; Kitajima, Masaaki; Nguyen, Thanh H; Okabe, Satoshi; Sano, Daisuke

    2017-09-15

    The multiple-barrier concept is widely employed in international and domestic guidelines for wastewater reclamation and reuse for microbiological risk management, in which a wastewater reclamation system is designed to achieve guideline values for the performance target of microbe reduction. Enteric viruses are one of the pathogens for which target reduction values are stipulated in guidelines, but frequent monitoring to validate human virus removal efficacy is challenging in daily operation due to the cumbersome procedures for virus quantification in wastewater. Bacteriophages have been the first-choice surrogate for this task, because of the well-characterized nature of the strains and the presence of established protocols for quantification. Here, we performed a meta-analysis to calculate the average log10 reduction values (LRVs) of somatic coliphages, F-specific phages, MS2 coliphage and T4 phage by membrane bioreactors (MBRs), activated sludge, constructed wetlands, pond systems, microfiltration and ultrafiltration. The calculated LRVs of bacteriophages were then compared with reported human enteric virus LRVs. MS2 coliphage LRVs in MBR processes were shown to be lower than those of norovirus GII and enterovirus, suggesting MS2 as a possible validation and operational monitoring tool. The other bacteriophages provided higher LRVs compared with human viruses. The data sets on LRVs of human viruses and bacteriophages are scarce except for MBR and conventional activated sludge processes, which highlights the necessity of investigating LRVs of human viruses and bacteriophages in multiple treatment unit processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
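
    The LRV referred to throughout is just the base-10 logarithm of the concentration drop across a treatment step; a trivial helper with illustrative numbers:

```python
import numpy as np

def log_reduction_value(c_in, c_out):
    """Log10 reduction value across a treatment unit (same units in and out)."""
    return np.log10(c_in / c_out)

print(log_reduction_value(1e5, 1e2))   # e.g. 1e5 -> 1e2 PFU/mL gives LRV = 3
```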

  2. Dimensional reduction as a tool for mesh refinement and trackingsingularities of PDEs

    Energy Technology Data Exchange (ETDEWEB)

    Stinis, Panagiotis

    2007-06-10

    We present a collection of algorithms which utilize dimensional reduction to perform mesh refinement and study possibly singular solutions of time-dependent partial differential equations. The algorithms are inspired by constructions used in statistical mechanics to evaluate the properties of a system near a critical point. The first algorithm allows the accurate determination of the time of occurrence of a possible singularity. The second algorithm is an adaptive mesh refinement scheme which can be used to approach efficiently the possible singularity. Finally, the third algorithm uses the second algorithm until the available resolution is exhausted (as we approach the possible singularity) and then switches to a dimensionally reduced model which, when accurate, can follow faithfully the solution beyond the time of occurrence of the purported singularity. An accurate dimensionally reduced model should dissipate energy at the right rate. We construct two variants of each algorithm. The first variant assumes that we have actual knowledge of the reduced model. The second variant assumes that we know the form of the reduced model, i.e., the terms appearing in the reduced model, but not necessarily their coefficients. In this case, we also provide a way of determining the coefficients. We present numerical results for the Burgers equation with zero and nonzero viscosity to illustrate the use of the algorithms.

  3. Microfinance As A Tool For Poverty Reduction: A Study Of Jordan

    Directory of Open Access Journals (Sweden)

    Žiaková M.

    2015-12-01

    Full Text Available The aim of this study is to evaluate the impact of microfinance on the poor, particularly in the specific areas of economic and social development and employment. The research was carried out in Jordan, a country with a well-developed microfinance sector. The results have shown that microfinance has led to an improvement in the financial and social situation of the poor, especially for female clients of microfinance institutions. Interestingly, the higher income of clients has not caused higher expenditure on their basic needs; rather, people have generated savings for their future and used the additional money for education. According to the results of the microfinance impact assessment, it can be assumed that people, particularly females, prefer to improve the social situation of future generations. Based on this finding, we consider microfinance an effective tool for breaking the vicious circle of poverty, especially in Jordan. Furthermore, microcredit has helped increase employment among the poor, mainly through self-employment. It is believed that the future expansion of microcredit will lead to the development of small businesses, with a promising impact on employability throughout the population structure.

  4. New Torque Estimation Method Considering Spatial Harmonics and Torque Ripple Reduction in Permanent Magnet Synchronous Motors

    Science.gov (United States)

    Hida, Hajime; Tomigashi, Yoshio; Ueyama, Kenji; Inoue, Yukinori; Morimoto, Shigeo

    This paper proposes a new torque estimation method that takes into account the spatial harmonics of permanent magnet synchronous motors and that is capable of real-time estimation. First, the torque estimation equation of the proposed method is derived. In the method, the torque ripple of a motor can be estimated from the average of the torque calculated by the conventional method (the cross product of the flux linkage and the motor current) and the torque calculated from the electric input power to the motor. Next, the effectiveness of the proposed method is verified by simulations in which two kinds of motors with different components of torque ripple are considered. The simulation results show that the proposed method estimates the torque ripple more accurately than the conventional method. Further, the effectiveness of the proposed method is verified experimentally. It is shown that the torque ripple is decreased by applying the proposed method to torque control.
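
    To make the idea concrete, the sketch below averages a conventional dq-frame torque estimate with a power-balance estimate, which is the general structure described above. It is a simplified stand-in, not the authors' formulation; the copper-loss-only power balance and all parameter names are assumptions of this sketch.

```python
def torque_conventional(psi_a, Ld, Lq, i_d, i_q, pole_pairs):
    """Cross product of flux linkage and current in the dq frame."""
    psi_d, psi_q = psi_a + Ld * i_d, Lq * i_q
    return 1.5 * pole_pairs * (psi_d * i_q - psi_q * i_d)

def torque_from_power(p_in, i_d, i_q, R, omega_mech):
    """Air-gap torque from electrical input power minus stator copper loss
    (iron and stray losses are neglected in this sketch)."""
    return (p_in - 1.5 * R * (i_d ** 2 + i_q ** 2)) / omega_mech

def torque_estimate(psi_a, Ld, Lq, i_d, i_q, pole_pairs, p_in, R, omega_mech):
    """Average of the two estimates, the combination described in the abstract."""
    return 0.5 * (torque_conventional(psi_a, Ld, Lq, i_d, i_q, pole_pairs)
                  + torque_from_power(p_in, i_d, i_q, R, omega_mech))
```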

  5. Comparing estimates of child mortality reduction modelled in LiST with pregnancy history survey data for a community-based NGO project in Mozambique

    Directory of Open Access Journals (Sweden)

    Morrow Melanie

    2011-04-01

    Full Text Available Abstract Background There is a growing body of evidence that integrated packages of community-based interventions, a form of programming often implemented by NGOs, can have substantial child mortality impact. More countries may be able to meet Millennium Development Goal (MDG) 4 targets by leveraging such programming. Analysis of the mortality effect of this type of programming is hampered by the cost and complexity of direct mortality measurement. The Lives Saved Tool (LiST) produces an estimate of mortality reduction by modelling the mortality effect of changes in population coverage of individual child health interventions. However, few studies to date have compared the LiST estimates of mortality reduction with those produced by direct measurement. Methods Using results of a recent review of evidence for community-based child health programming, a search was conducted for NGO child health projects implementing community-based interventions that had independently verified child mortality reduction estimates, as well as population coverage data for modelling in LiST. One child survival project fit inclusion criteria. Subsequent searches of the USAID Development Experience Clearinghouse and Child Survival Grants databases and interviews of staff from NGOs identified no additional projects. Eight coverage indicators, covering all the project’s technical interventions, were modelled in LiST, along with indicator values for most other non-project interventions in LiST, mainly from DHS data from 1997 and 2003. Results The project studied was implemented by World Relief from 1999 to 2003 in Gaza Province, Mozambique. An independent evaluation collecting pregnancy history data estimated that under-five mortality declined 37% and infant mortality 48%. Using project-collected coverage data, LiST produced estimates of 39% and 34% decline, respectively. Conclusions LiST gives reasonably accurate estimates of infant and child mortality decline in an area

  6. Comparing estimates of child mortality reduction modelled in LiST with pregnancy history survey data for a community-based NGO project in Mozambique.

    Science.gov (United States)

    Ricca, Jim; Prosnitz, Debra; Perry, Henry; Edward, Anbrasi; Morrow, Melanie; Ernst, Pieter; Ryan, Leo

    2011-04-13

    There is a growing body of evidence that integrated packages of community-based interventions, a form of programming often implemented by NGOs, can have substantial child mortality impact. More countries may be able to meet Millennium Development Goal (MDG) 4 targets by leveraging such programming. Analysis of the mortality effect of this type of programming is hampered by the cost and complexity of direct mortality measurement. The Lives Saved Tool (LiST) produces an estimate of mortality reduction by modelling the mortality effect of changes in population coverage of individual child health interventions. However, few studies to date have compared the LiST estimates of mortality reduction with those produced by direct measurement. Using results of a recent review of evidence for community-based child health programming, a search was conducted for NGO child health projects implementing community-based interventions that had independently verified child mortality reduction estimates, as well as population coverage data for modelling in LiST. One child survival project fit inclusion criteria. Subsequent searches of the USAID Development Experience Clearinghouse and Child Survival Grants databases and interviews of staff from NGOs identified no additional projects. Eight coverage indicators, covering all the project's technical interventions were modelled in LiST, along with indicator values for most other non-project interventions in LiST, mainly from DHS data from 1997 and 2003. The project studied was implemented by World Relief from 1999 to 2003 in Gaza Province, Mozambique. An independent evaluation collecting pregnancy history data estimated that under-five mortality declined 37% and infant mortality 48%. Using project-collected coverage data, LiST produced estimates of 39% and 34% decline, respectively. LiST gives reasonably accurate estimates of infant and child mortality decline in an area where a package of community-based interventions was implemented

  7. Mobile Health Devices as Tools for Worldwide Cardiovascular Risk Reduction and Disease Management.

    Science.gov (United States)

    Piette, John D; List, Justin; Rana, Gurpreet K; Townsend, Whitney; Striplin, Dana; Heisler, Michele

    2015-11-24

    We examined evidence on whether mobile health (mHealth) tools, including interactive voice response calls, short message service, or text messaging, and smartphones, can improve lifestyle behaviors and management related to cardiovascular diseases throughout the world. We conducted a state-of-the-art review and literature synthesis of peer-reviewed and gray literature published since 2004. The review prioritized randomized trials and studies focused on cardiovascular diseases and risk factors, but included other reports when they represented the best available evidence. The search emphasized reports on the potential benefits of mHealth interventions implemented in low- and middle-income countries. Interactive voice response and short message service interventions can improve cardiovascular preventive care in developed countries by addressing risk factors including weight, smoking, and physical activity. Interactive voice response and short message service-based interventions for cardiovascular disease management also have shown benefits with respect to hypertension management, hospital readmissions, and diabetic glycemic control. Multimodal interventions including Web-based communication with clinicians and mHealth-enabled clinical monitoring with feedback also have shown benefits. The evidence regarding the potential benefits of interventions using smartphones and social media is still developing. Studies of mHealth interventions have been conducted in >30 low- and middle-income countries, and evidence to date suggests that programs are feasible and may improve medication adherence and disease outcomes. Emerging evidence suggests that mHealth interventions may improve cardiovascular-related lifestyle behaviors and disease management. Next-generation mHealth programs developed worldwide should be based on evidence-based behavioral theories and incorporate advances in artificial intelligence for adapting systems automatically to patients' unique and changing needs.

  8. The ISO 50001 Impact Estimator Tool (IET 50001 V1.1.4) - User Guide and Introduction to the ISO 50001 Impacts Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Therkelsen, Peter L. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Rao, Prakash [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee T. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-08-01

    ISO 50001, Energy management systems – Requirements with guidance for use, is an internationally developed standard that provides organizations with a flexible framework for implementing an energy management system (EnMS) with the goal of continual energy performance improvement. The ISO 50001 standard was first published in 2011 and has since seen growth in the number of certificates issued around the world, primarily in the industrial (agriculture, manufacturing, and mining) and service (commercial) sectors. Policy makers in many regions and countries are looking to or are already using ISO 50001 as a basis for energy efficiency, carbon reduction, and other energy performance improvement schemes. The Impact Estimator Tool 50001 (IET 50001 Tool) is a computational model developed to assist researchers and policy makers in determining the potential impact of ISO 50001 implementation in the industrial and service (commercial) sectors for a given region or country. The IET 50001 Tool is based upon a methodology initially developed by the Lawrence Berkeley National Laboratory that has been improved upon and vetted by a group of international researchers. By using a commonly accepted and transparent methodology, users of the IET 50001 Tool can easily and clearly communicate the potential impact of ISO 50001 for a region or country.

  9. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI) to create user-friendly tools for the rapid estimation of S/N in ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models including: the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple. A normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average

  10. Exploring the effects of dimensionality reduction in deep networks for force estimation in robotic-assisted surgery

    Science.gov (United States)

    Aviles, Angelica I.; Alsaleh, Samar; Sobrevilla, Pilar; Casals, Alicia

    2016-03-01

    The Robotic-Assisted Surgery approach overcomes the limitations of traditional laparoscopic and open surgeries. However, one of its major limitations is the lack of force feedback. Since there is no direct interaction between the surgeon and the tissue, there is no way of knowing how much force the surgeon is applying, which can result in irreversible injuries. The use of force sensors is not practical since they impose different constraints. Thus, we make use of a neuro-visual approach to estimate the applied forces, in which 3D shape recovery together with the geometry of motion are used as input to a deep network based on an LSTM-RNN architecture. When deep networks are used in real time, pre-processing of data is a key factor in reducing complexity and improving network performance. A common pre-processing step is dimensionality reduction, which attempts to eliminate redundant and insignificant information by selecting a subset of relevant features to use in model construction. In this work, we show the effects of dimensionality reduction in a real-time application: estimating the applied force in robotic-assisted surgeries. According to the results, we demonstrate positive effects of dimensionality reduction on deep networks, including faster training, improved network performance, and overfitting prevention. We also show a significant accuracy improvement, ranging from about 33% to 86%, over existing approaches related to force estimation.
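
    As a reference point for the dimensionality-reduction step, a common choice is principal component analysis applied to the feature matrix before it is fed to the network. The PCA sketch below is a generic example and is not claimed to be the specific reduction technique used by the authors.

```python
import numpy as np

def pca_reduce(X, k):
    """Project a (samples x features) matrix onto its top-k principal components."""
    Xc = X - X.mean(axis=0)                       # center the features
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]                  # reduced features and loadings

X = np.random.default_rng(0).normal(size=(200, 50))
Z, components = pca_reduce(X, k=10)
print(Z.shape)   # (200, 10): the reduced input that would feed the LSTM-RNN
```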

  11. Optical Density Analysis of X-Rays Utilizing Calibration Tooling to Estimate Thickness of Parts

    Science.gov (United States)

    Grau, David

    2012-01-01

    This process is designed to estimate the thickness change of a material through data analysis of a digitized version of an x-ray (or a digital x-ray) containing the material (with the thickness in question) and various tooling. Using this process, it is possible to estimate a material's thickness change in a region of the material or part that is thinner than the rest of the reference thickness. The same principle can also be used to determine thickening of the material relative to a thinner reference region, or to develop contour plots of an entire part. Proper tooling must be used. An x-ray film with an S-shaped characteristic curve, or a digital x-ray device producing like characteristics, is necessary. A film with linear characteristics would be ideal; however, at the time of this reporting, no such film is known. Machined components (with known fractional thicknesses) of a like material (similar density) to that of the material to be measured are necessary. The machined components should have machined through-holes. For ease of use and better accuracy, the through-holes should be larger than 0.125 in. (3.2 mm). Standard components for this use are known as penetrameters or image quality indicators. Also needed is standard x-ray equipment, if film is used in place of digital equipment, or x-ray digitization equipment with proven conversion properties. Typical x-ray digitization equipment is commonly used in the medical industry, and creates digital images of x-rays in DICOM format. It is recommended to scan the image in a 16-bit format; however, 12-bit and 8-bit resolutions are acceptable. Finally, x-ray analysis software that allows accurate digital image density calculations, such as Image-J freeware, is needed. The actual procedure requires the test article to be placed on the raw x-ray, ensuring the region of interest is aligned for perpendicular x-ray exposure
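
    The analysis step amounts to building a calibration curve of digitized optical density against the known step thicknesses of the tooling and interpolating the density measured over the region of interest. A minimal sketch with made-up calibration numbers:

```python
import numpy as np

def thickness_from_density(od_measured, cal_thickness_in, cal_density):
    """Interpolate a measured optical density on a calibration curve built from
    tooling steps of known thickness (density decreases as thickness increases)."""
    order = np.argsort(cal_density)               # np.interp needs ascending x
    return float(np.interp(od_measured,
                           np.asarray(cal_density)[order],
                           np.asarray(cal_thickness_in)[order]))

# Illustrative calibration points (density in arbitrary digitizer units)
print(thickness_from_density(2.1, [0.10, 0.15, 0.20, 0.25], [2.9, 2.4, 1.9, 1.5]))
```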

  12. A Traffic Reduction Method for Centralized RSSI-Based Location Estimation in Wireless Sensor Networks

    Science.gov (United States)

    Zemek, Radim; Hara, Shinsuke; Yanagihara, Kentaro; Kitayama, Ken-Ichi

    In a centralized localization scenario, the limited throughput of the central node constrains the possible number of target node locations that can be estimated simultaneously. To overcome this limitation, we propose a method which effectively decreases the traffic load associated with target node localization and therefore increases the possible number of target node locations that can be estimated simultaneously in a localization system based on the received signal strength indicator (RSSI) and maximum likelihood estimation. Our proposed method utilizes a threshold that limits the amount of RSSI data forwarded to the central node. As the threshold is crucial to the method, we further propose a method to theoretically determine its value. We experimentally verified the proposed method in various environments, and the experimental results revealed that the method can reduce the load by 32-64% without significantly affecting the estimation accuracy.
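
    A minimal sketch of the two ingredients, thresholded forwarding and grid-search maximum-likelihood estimation under a log-distance path-loss model, is given below. The path-loss parameters, shadowing standard deviation and threshold value are illustrative assumptions, not the values derived in the paper.

```python
import numpy as np

def should_forward(rssi_dbm, threshold_dbm=-85.0):
    """Anchor-side filter: forward a reading only if it clears the threshold."""
    return rssi_dbm >= threshold_dbm

def ml_locate(anchors, rssi, p0=-40.0, n_exp=3.0, sigma=4.0, extent=50.0, grid=201):
    """Maximum-likelihood location on a grid, assuming
    RSSI(d) = p0 - 10*n_exp*log10(d) with Gaussian shadowing of std sigma (dB)."""
    xs = np.linspace(0.0, extent, grid)
    gx, gy = np.meshgrid(xs, xs)
    log_lik = np.zeros_like(gx)
    for (ax, ay), r in zip(anchors, rssi):
        d = np.hypot(gx - ax, gy - ay) + 1e-9          # avoid log(0) at an anchor
        mu = p0 - 10.0 * n_exp * np.log10(d)           # expected RSSI per grid cell
        log_lik -= (r - mu) ** 2 / (2.0 * sigma ** 2)
    i, j = np.unravel_index(np.argmax(log_lik), log_lik.shape)
    return gx[i, j], gy[i, j]

anchors = [(0.0, 0.0), (50.0, 0.0), (0.0, 50.0), (50.0, 50.0)]
readings = [-62.0, -70.0, -71.0, -78.0]
pairs = [(a, r) for a, r in zip(anchors, readings) if should_forward(r)]
kept_anchors, kept_rssi = zip(*pairs)
print(ml_locate(kept_anchors, kept_rssi))
```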

  13. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    Directory of Open Access Journals (Sweden)

    Demeter Lisa

    2010-05-01

    Full Text Available Abstract Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced to make the computational tool more flexible in accommodating various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
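
    In spirit, the estimation regresses the log ratio of the two competing variants against time and reads the fitness difference off the slope. The sketch below shows that idea with made-up proportions; it omits the dilution-factor correction and the measurement-error modelling handled by vFitness.

```python
import numpy as np

def relative_fitness(t_days, frac_mutant):
    """Slope of ln(mutant/wild-type) versus time in a growth competition assay,
    fitted with all observed points rather than a two-point calculation."""
    frac = np.asarray(frac_mutant, dtype=float)
    log_ratio = np.log(frac / (1.0 - frac))
    slope, _ = np.polyfit(np.asarray(t_days, dtype=float), log_ratio, 1)
    return slope   # per day; positive means the mutant outgrows the wild type

print(relative_fitness([0, 2, 4, 6], [0.50, 0.57, 0.66, 0.72]))
```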

  14. Statistical analysis of electrical resistivity as a tool for estimating cement type of 12-year-old concrete specimens

    NARCIS (Netherlands)

    Polder, R.B.; Morales-Napoles, O.; Pacheco, J.

    2012-01-01

    Statistical tests on values of concrete resistivity can be used as a fast tool for estimating the cement type of old concrete. Electrical resistivity of concrete is a material property that describes the electrical resistance of concrete in a unit cell. Influences of binder type, water-to-binder ratio

  15. Clinical evaluation of a commercial orthopedic metal artifact reduction tool for CT simulations in radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Li Hua; Noel, Camille; Chen, Haijian; Harold Li, H.; Low, Daniel; Moore, Kevin; Klahr, Paul; Michalski, Jeff; Gay, Hiram A.; Thorstad, Wade; Mutic, Sasa [Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States); Department of Radiation Oncology, University of California Los Angeles, Los Angeles, California 90095 (United States); Department of Radiation Oncology, University of California San Diego, San Diego, California 92093 (United States); Philips Healthcare System, Cleveland, Ohio 44143 (United States); Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States)

    2012-12-15

    Purpose: Severe artifacts in kilovoltage-CT simulation images caused by large metallic implants can significantly degrade the conspicuity and apparent CT Hounsfield number of targets and anatomic structures, jeopardize the confidence of anatomical segmentation, and introduce inaccuracies into the radiation therapy treatment planning process. This study evaluated the performance of the first commercial orthopedic metal artifact reduction function (O-MAR) for radiation therapy, and investigated its clinical applications in treatment planning. Methods: Both phantom and clinical data were used for the evaluation. The CIRS electron density phantom with known physical (and electron) density plugs and removable titanium implants was scanned on a Philips Brilliance Big Bore 16-slice CT simulator. The CT Hounsfield numbers of density plugs on both uncorrected and O-MAR corrected images were compared. Treatment planning accuracy was evaluated by comparing simulated dose distributions computed using the true density images, uncorrected images, and O-MAR corrected images. Ten CT image sets of patients with large hip implants were processed with the O-MAR function and evaluated by two radiation oncologists using a five-point score for overall image quality, anatomical conspicuity, and CT Hounsfield number accuracy. By utilizing the same structure contours delineated from the O-MAR corrected images, clinical IMRT treatment plans for five patients were computed on the uncorrected and O-MAR corrected images, respectively, and compared. Results: Results of the phantom study indicated that CT Hounsfield number accuracy was improved and noise was reduced on the O-MAR corrected images, especially for images with bilateral metal implants. The {gamma} pass rates of the simulated dose distributions computed on the uncorrected and O-MAR corrected images referenced to those of the true densities were higher than 99.9% (even when using 1% and 3 mm distance-to-agreement criteria), suggesting that dose

  16. Reduction of air pollutants - a tool for control of atmospheric corrosion

    Directory of Open Access Journals (Sweden)

    Kucera, V.

    2003-12-01

    Full Text Available In most urban areas in Europe and North America, serious corrosion damage to buildings and cultural monuments has been caused by emissions of pollutants. The rapidly increasing pollution levels in many of the developing countries also pose a serious threat to materials. Besides the very important role of SO2, the direct or synergistic effects of NOx and O3, particulates and rain acidity may also contribute substantially to materials degradation. Results from extensive international field exposure programs, e.g. within the UN/ECE, have enabled the development of dose-response relations which describe the effect of dry and wet deposition of pollutants on the corrosion of different material groups. In most of the industrialized countries, decreasing trends of sulphur and nitrogen pollutants and of precipitation acidity have resulted in decreased corrosion rates. The concept of acceptable pollutant levels is a useful tool in planning abatement strategies and in defining conditions for acceptable long-term performance of structures exposed to the atmosphere.

    Atmospheric pollution has been the main cause of the severe deterioration of buildings and monuments in numerous cities of Europe and North America. At the same time, the rapid increase in pollution levels in the less developed countries is endangering the stability of the materials used. In addition to the important role played by SO2, the direct action or the synergistic effect of NOx and O3, as well as particulate matter and acid rain, contribute to aggravating the problem. Results from extensive international research programs such as, for example, the UN/ECE program have made it possible to develop dose-response relations that describe the effect of pollutant deposition on the corrosion of

  17. Estimating Longitudinal Risks and Benefits From Cardiovascular Preventive Therapies Among Medicare Patients: The Million Hearts Longitudinal ASCVD Risk Assessment Tool: A Special Report From the American Heart Association and American College of Cardiology.

    Science.gov (United States)

    Lloyd-Jones, Donald M; Huffman, Mark D; Karmali, Kunal N; Sanghavi, Darshak M; Wright, Janet S; Pelser, Colleen; Gulati, Martha; Masoudi, Frederick A; Goff, David C

    2017-03-28

    The Million Hearts Initiative has a goal of preventing 1 million heart attacks and strokes, the leading causes of mortality, through several public health and healthcare strategies by 2017. The American Heart Association and American College of Cardiology support the program. The Cardiovascular Risk Reduction Model was developed by Million Hearts and the Center for Medicare & Medicaid Services as a strategy to assess a value-based payment approach toward reduction in 10-year predicted risk of atherosclerotic cardiovascular disease (ASCVD) by implementing cardiovascular preventive strategies to manage the "ABCS" (aspirin therapy in appropriate patients, blood pressure control, cholesterol management, and smoking cessation). The purpose of this special report is to describe the development and intended use of the Million Hearts Longitudinal ASCVD Risk Assessment Tool. The Million Hearts Tool reinforces and builds on the "2013 ACC/AHA Guideline on the Assessment of Cardiovascular Risk" by allowing clinicians to estimate baseline and updated 10-year ASCVD risk estimates for primary prevention patients adhering to the appropriate ABCS over time, alone or in combination. The tool provides updated risk estimates based on evidence from high-quality systematic reviews and meta-analyses of the ABCS therapies. This novel approach to personalized estimation of benefits from risk-reducing therapies in primary prevention may help target therapies to those in whom they will provide the greatest benefit, and serves as the basis for a Center for Medicare & Medicaid Services program designed to evaluate the Million Hearts Cardiovascular Risk Reduction Model.
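
    The updating logic can be pictured as multiplying the baseline 10-year risk by one minus the relative risk reduction of each adhered-to therapy. The sketch below shows only that arithmetic; the relative-risk-reduction values are placeholders, not the meta-analysis estimates the tool actually uses.

```python
# Placeholder relative risk reductions for the ABCS therapies (illustrative only).
RRR = {"aspirin": 0.10, "bp_control": 0.25, "cholesterol": 0.25, "smoking_cessation": 0.30}

def updated_risk(baseline_10yr_risk, adopted_therapies):
    """Apply each adopted therapy's relative risk reduction multiplicatively."""
    risk = baseline_10yr_risk
    for therapy in adopted_therapies:
        risk *= 1.0 - RRR[therapy]
    return risk

print(updated_risk(0.20, ["bp_control", "cholesterol"]))   # 0.20 -> 0.1125
```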

  18. Estimation of cost reduction and increase for the final disposal associated with the categorization of inert waste landfills in Japan.

    Science.gov (United States)

    Nakayama, Hirofumi; Tsuchida, Daisuke; Shimaoka, Takayuki

    2012-02-01

    This study estimates the overall cost savings that have been realized from the disposal of inert wastes in Japan because this material has been deposited in inert waste landfills (IWLs) designed exclusively for this purpose, instead of being co-disposed with organic wastes in more costly sanitary landfills (SLs). The total realized cost savings were based on the disposed volume of inert waste and the actual disposal fees for IWLs and SLs for the period 1977-2006. The estimated reduction in expense is 4748 billion JPY for the period. On the other hand, if organic wastes had been deposited in IWLs along with inert wastes, costs would be incurred to clean up the sites, because the surrounding environment may be polluted by the decomposition of the non-inert wastes and considerable effort would probably be required to restore the polluted environment to its normal condition (this is because IWLs typically do not have a barrier system). The potential cleanup cost was estimated to be 616 to 1226 billion JPY. These estimated costs were compared and it was found that the net reduction in expense was 3522 billion to 4122 billion JPY. Although the expense was reduced substantially, it was noted that a considerable cleanup cost would be generated. In particular, it was found that the increase in cleanup costs becomes most significant after the late 1990s.

  19. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    Science.gov (United States)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation on a system level, particularly for early mission phases during which specifications and requirements are not yet crystallised, and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step to any program budget is a representative cost estimate, which usually hinges on a particular estimation approach, or methodology. Therefore, appropriate selection of specific cost models, methods and tools is paramount, a difficult task given the highly variable nature, scope, and scientific and technical requirements applicable to each program. Numerous methods, models and tools exist. However, new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. For highly specific vehicles such as reusable launchers with a manned capability, the lack of historical data means that both the classic heuristic approach, such as parametric cost estimation based on underlying CERs, and the analogy approach are, by definition, of limited use. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. Current approaches which strategically amalgamate various cost estimation strategies both for formulation and validation

  20. Noise reduction for modal parameters estimation using algorithm of solving partially described inverse singular value problem

    Science.gov (United States)

    Bao, Xingxian; Cao, Aixia; Zhang, Jing

    2016-07-01

    Modal parameter estimation plays an important role in structural health monitoring. Accurately estimating the modal parameters of structures becomes more challenging when the measured vibration response signals are contaminated with noise. This study develops a mathematical algorithm for solving the partially described inverse singular value problem (PDISVP), combined with the complex exponential (CE) method, to estimate the modal parameters. The PDISVP solving method reconstructs an L2-norm optimized (filtered) data matrix from the measured (noisy) data matrix, when the prescribed data constraints are one or several sets of singular triplets of the matrix. The measured data matrix is Hankel structured and is constructed from the measured impulse response function (IRF). The reconstructed matrix must maintain the Hankel structure and be lowered in rank as well. Once the filtered IRF is obtained, the CE method can be applied to extract the modal parameters. Two physical experiments, a steel cantilever beam with 10 accelerometers mounted and a steel plate with 30 accelerometers mounted, each excited by an impulsive load, are investigated to test the applicability of the proposed scheme. In addition, a consistency diagram is proposed to examine the agreement among the modal parameters estimated from the different accelerometers. Results indicate that the PDISVP-CE method can significantly remove noise from measured signals and accurately estimate the modal frequencies and damping ratios.
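
    As a rough illustration of the rank-reduction step described above, the sketch below builds a Hankel matrix from an IRF, truncates its singular values, and restores the Hankel structure by anti-diagonal averaging. This is a simplified Cadzow-style analogue, not the authors' exact PDISVP algorithm; the function name and rank parameter are illustrative.

        import numpy as np

        def hankel_denoise(irf, rank):
            """Rank-truncate a Hankel matrix built from an impulse response function (IRF)
            and average anti-diagonals to recover a filtered IRF (Cadzow-style sketch)."""
            n = len(irf)
            rows = n // 2
            cols = n - rows + 1
            H = np.array([irf[i:i + cols] for i in range(rows)])   # Hankel-structured data matrix
            U, s, Vt = np.linalg.svd(H, full_matrices=False)
            s[rank:] = 0.0                                          # keep only the dominant singular triplets
            H_low = (U * s) @ Vt
            filtered = np.zeros(n)                                  # restore Hankel structure by
            counts = np.zeros(n)                                    # averaging along anti-diagonals
            for i in range(rows):
                for j in range(cols):
                    filtered[i + j] += H_low[i, j]
                    counts[i + j] += 1
            return filtered / counts

    The filtered IRF returned by such a routine would then be passed to a complex exponential (Prony-type) fit to extract frequencies and damping ratios.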

  1. Unique ion filter: a data reduction tool for GC/MS data preprocessing prior to chemometric analysis.

    Science.gov (United States)

    Adutwum, L A; Harynuk, J J

    2014-08-01

    Using raw GC/MS data as the X-block for chemometric modeling has the potential to provide better classification models for complex samples when compared to using the total ion current (TIC), extracted ion chromatograms/profiles (EIC/EIP), or integrated peak tables. However, the abundance of raw GC/MS data necessitates some form of data reduction/feature selection to remove the variables containing primarily noise from the data set. Several algorithms for feature selection exist; however, due to the extreme number of variables (10^6-10^8 variables per chromatogram), the feature selection time can be prolonged and computationally expensive. Herein, we present a new prefilter for automated data reduction of GC/MS data prior to feature selection. This tool, termed unique ion filter (UIF), is a module that can be added after chromatographic alignment and prior to any subsequent feature selection algorithm. The UIF objectively reduces the number of irrelevant or redundant variables in raw GC/MS data, while preserving potentially relevant analytical information. In the m/z dimension, data are reduced from a full spectrum to a handful of unique ions for each chromatographic peak. In the time dimension, data are reduced to only a handful of scans around each peak apex. UIF was applied to a data set of GC/MS data for a variety of gasoline samples to be classified using partial least-squares discriminant analysis (PLS-DA) according to octane rating. It was also applied to a series of chromatograms from casework fire debris analysis to be classified on the basis of whether or not signatures of gasoline were detected. By reducing the overall population of candidate variables subjected to subsequent variable selection, the UIF reduced the total feature selection time for which a perfect classification of all validation data was achieved from 373 to 9 min (98% reduction in computing time). Additionally, the significant reduction in included variables resulted in a concomitant
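
    The scan-window and unique-ion reduction can be pictured with a simple prefilter like the one below. This is a hedged sketch: the real UIF uses its own criterion for selecting unique ions, and the peak detection, window size, and ion count here are illustrative assumptions.

        import numpy as np
        from scipy.signal import find_peaks

        def unique_ion_prefilter(X, n_ions=5, scan_window=3):
            """Sketch of a UIF-style reduction: X is a (scans x m/z) intensity matrix.
            Keep only the top n_ions m/z channels within a few scans of each TIC peak apex."""
            tic = X.sum(axis=1)
            apexes, _ = find_peaks(tic, prominence=tic.max() * 0.01)   # crude peak detection
            mask = np.zeros_like(X, dtype=bool)
            for apex in apexes:
                lo = max(apex - scan_window, 0)
                hi = min(apex + scan_window + 1, X.shape[0])
                spectrum = X[lo:hi].sum(axis=0)
                top = np.argsort(spectrum)[-n_ions:]                   # "unique" ions approximated here by most intense
                mask[lo:hi, top] = True
            return np.where(mask, X, 0.0)                              # zero out everything else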

  2. Model reduction and parameter estimation of non-linear dynamical biochemical reaction networks.

    Science.gov (United States)

    Sun, Xiaodian; Medvedovic, Mario

    2016-02-01

    Parameter estimation for high-dimensional, complex dynamic systems is an active research topic. However, the associated statistical modelling and inference typically face a "large p, small n" problem, so reducing the dimension of the dynamic model while improving estimation accuracy is important. To address this, the authors take known parameters and structural information about the system as prior knowledge and incorporate it into the dynamic model. They then decompose the whole dynamic model into subnetwork modules and apply a different estimation approach to each module; this technique is called the Rao-Blackwellised particle filter decomposition method. To evaluate the performance of this method, the authors apply it to synthetic data generated from the repressilator model and to experimental data from the JAK-STAT pathway; the method can be easily extended to large-scale cases.

  3. Estimating sufficient reductions of the predictors in abundant high-dimensional regressions

    CERN Document Server

    Cook, R Dennis; Rothman, Adam J; 10.1214/11-AOS962

    2012-01-01

    We study the asymptotic behavior of a class of methods for sufficient dimension reduction in high-dimension regressions, as the sample size and number of predictors grow in various alignments. It is demonstrated that these methods are consistent in a variety of settings, particularly in abundant regressions where most predictors contribute some information on the response, and oracle rates are possible. Simulation results are presented to support the theoretical conclusion.

  4. Computational aspects of maximum likelihood estimation and reduction in sensitivity function calculations

    Science.gov (United States)

    Gupta, N. K.; Mehra, R. K.

    1974-01-01

    This paper discusses numerical aspects of computing maximum likelihood estimates for linear dynamical systems in state-vector form. Different gradient-based nonlinear programming methods are discussed in a unified framework and their applicability to maximum likelihood estimation is examined. The problems due to singular Hessian or singular information matrix that are common in practice are discussed in detail and methods for their solution are proposed. New results on the calculation of state sensitivity functions via reduced order models are given. Several methods for speeding convergence and reducing computation time are also discussed.

  5. A technical review of urban land use - transportation models as tools for evaluating vehicle travel reduction strategies

    Energy Technology Data Exchange (ETDEWEB)

    Southworth, F.

    1995-07-01

    The continued growth of highway traffic in the United States has led to unwanted urban traffic congestion as well as to noticeable urban air quality problems. These problems include emissions covered by the 1990 Clean Air Act Amendments (CAAA) and the 1991 Intermodal Surface Transportation Efficiency Act (ISTEA), as well as carbon dioxide and related "greenhouse gas" emissions. Urban travel also creates a major demand for imported oil. Therefore, for economic as well as environmental reasons, transportation planning agencies at both the state and metropolitan area level are focusing a good deal of attention on urban travel reduction policies. Much-discussed policy instruments include those that encourage fewer trip starts, shorter trip distances, shifts to higher-occupancy vehicles or to nonvehicular modes, and shifts in the timing of trips from the more to the less congested periods of the day or week. Some analysts have concluded that in order to bring about sustainable reductions in urban traffic volumes, significant changes will be necessary in the way our households and businesses engage in daily travel. Such changes are likely to involve changes in the ways we organize and use traffic-generating and -attracting land within our urban areas. The purpose of this review is to evaluate the ability of current analytic methods and models to support both the evaluation and possibly the design of such vehicle travel reduction strategies, including those strategies involving the reorganization and use of urban land. The review is organized into three sections. Section 1 describes the nature of the problem we are trying to model, Section 2 reviews the state of the art in operational urban land use-transportation simulation models, and Section 3 provides a critical assessment of such models as useful urban transportation planning tools. A number of areas are identified where further model development or testing is required.

  6. A Mobile Clinical Decision Support Tool for Pediatric Cardiovascular Risk-Reduction Clinical Practice Guidelines: Development and Description

    Science.gov (United States)

    2017-01-01

    Background Widespread application of research findings to improve patient outcomes remains inadequate, and failure to routinely translate research findings into daily clinical practice is a major barrier to the implementation of any evidence-based guideline. Strategies to increase guideline uptake in primary care pediatric practices and to facilitate adherence to recommendations are required. Objective Our objective was to operationalize the US National Heart, Lung, and Blood Institute’s Integrated Guidelines for Cardiovascular Health and Risk Reduction in Children and Adolescents into a mobile clinical decision support (CDS) system for healthcare providers, and to describe the development process and outcomes. Methods To overcome the difficulty of translating clinical practice guidelines into a computable form that can be used by a CDS system, we used a multilayer framework to convert the evidence synthesis into executable knowledge. We used an iterative process of design, testing, and revision through each step in the translation of the guidelines for use in a CDS tool to support the development of 4 validated modules: an integrated risk assessment; a blood pressure calculator; a body mass index calculator; and a lipid management instrument. Results The iterative revision process identified several opportunities to improve the CDS tool. Operationalizing the integrated guideline identified numerous areas in which the guideline was vague or incorrect and required more explicit operationalization. Iterative revisions led to workable solutions to problems and an understanding of the limitations of the tool. Conclusions The process and experiences described provide a model for other mobile CDS systems that translate written clinical practice guidelines into actionable, real-time clinical recommendations. PMID:28270384

  7. Comparing Fatigue Life Estimations of Composite Wind Turbine Blades using different Fatigue Analysis Tools

    DEFF Research Database (Denmark)

    Ardila, Oscar Gerardo Castro; Lennie, Matthew; Branner, Kim;

    2015-01-01

    suggested by the IEC 61400-1 standard were studied employing different load time intervals and by using two novel fatigue tools called ALBdeS and BECAS+F. The aeroelastic loads were defined through aeroelastic simulations performed with both FAST and HAWC2 tools. The stress spectra at each layer were

  8. An Error-Reduction Algorithm to Improve Lidar Turbulence Estimates for Wind Energy

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew

    2016-08-01

    Currently, cup anemometers on meteorological (met) towers are used to measure wind speeds and turbulence intensity to make decisions about wind turbine class and site suitability. However, as modern turbine hub heights increase and wind energy expands to complex and remote sites, it becomes more difficult and costly to install met towers at potential sites. As a result, remote sensing devices (e.g., lidars) are now commonly used by wind farm managers and researchers to estimate the flow field at heights spanned by a turbine. While lidars can accurately estimate mean wind speeds and wind directions, there is still a large amount of uncertainty surrounding the measurement of turbulence with lidars. This uncertainty in lidar turbulence measurements is one of the key roadblocks that must be overcome in order to replace met towers with lidars for wind energy applications. In this talk, a model for reducing errors in lidar turbulence estimates is presented. Techniques for reducing errors from instrument noise, volume averaging, and variance contamination are combined in the model to produce a corrected value of the turbulence intensity (TI), a commonly used parameter in wind energy. In the next step of the model, machine learning techniques are used to further decrease the error in lidar TI estimates.
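
    A minimal sketch of the final machine-learning step is shown below, with a random forest standing in for whatever regressor the authors actually used; the features and the synthetic training data are purely illustrative placeholders for a co-located lidar/met-tower campaign.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)

        # Placeholder data: physics-corrected lidar TI plus auxiliary features
        # (mean wind speed, stability proxy), with met-tower TI as the reference target.
        n = 500
        ti_lidar = rng.uniform(0.05, 0.30, n)            # lidar TI after noise/volume corrections
        wind_speed = rng.uniform(4.0, 16.0, n)
        stability = rng.normal(0.0, 1.0, n)
        ti_tower = ti_lidar * 0.9 + 0.01 * stability + rng.normal(0, 0.01, n)   # synthetic "truth"

        X = np.column_stack([ti_lidar, wind_speed, stability])
        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, ti_tower)

        # Apply the learned residual correction to new lidar measurements.
        ti_corrected = model.predict(X[:5])
        print(ti_corrected)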

  9. Iterative PSF Estimation and Its Application to Shift Invariant and Variant Blur Reduction

    Directory of Open Access Journals (Sweden)

    Seung-Won Jung

    2009-01-01

    Among image restoration approaches, image deconvolution has been considered a powerful solution. In image deconvolution, a point spread function (PSF), which describes the blur of the image, needs to be determined. Therefore, in this paper, we propose an iterative PSF estimation algorithm which is able to estimate an accurate PSF. In real-world motion-blurred images, a simple parametric model of the PSF fails when a camera moves in an arbitrary direction with an inconsistent speed during an exposure time. Moreover, the PSF normally changes with spatial location. In order to accurately estimate the complex PSF of a real motion blurred image, we iteratively update the PSF by using a directional spreading operator. The directional spreading is applied to the PSF when it reduces the amount of the blur and the restoration artifacts. Then, to generalize the proposed technique to the linear shift variant (LSV) model, a piecewise invariant approach is adopted by the proposed image segmentation method. Experimental results show that the proposed method effectively estimates the PSF and restores the degraded images.

  10. Multi-sensor integration for on-line tool wear estimation through radial basis function networks and fuzzy neural network.

    Science.gov (United States)

    Kuo, R J.; Cohen, P H.

    1999-03-01

    On-line tool wear estimation plays a very critical role in industrial automation, supporting higher productivity and product quality. In addition, appropriate and timely decisions about tool changes are required in machining systems. Thus, this paper develops an estimation system through the integration of two promising technologies: artificial neural networks (ANN) and fuzzy logic. An on-line estimation system consisting of five components: (1) data collection; (2) feature extraction; (3) pattern recognition; (4) multi-sensor integration; and (5) tool/work distance compensation for tool flank wear, is proposed herein. For each sensor, a radial basis function (RBF) network is employed to recognize the extracted features. Thereafter, the decisions from multiple sensors are integrated through a proposed fuzzy neural network (FNN) model. Such a model is self-organizing and self-adjusting, and is able to learn from experience. Physical experiments on the metal cutting process are implemented to evaluate the proposed system. The results show that the proposed system can significantly increase the accuracy of the product profile.
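
    For illustration, a bare-bones RBF network of the kind that could serve as a per-sensor pattern recognizer is sketched below (k-means centres with a ridge read-out). This is a generic construction under stated assumptions, not the authors' implementation.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import Ridge

        class SimpleRBFNet:
            """Minimal radial basis function network: k-means centres, Gaussian
            activations, linear read-out mapping sensor features to a wear estimate."""
            def __init__(self, n_centres=10, gamma=1.0):
                self.n_centres, self.gamma = n_centres, gamma

            def _phi(self, X):
                # Gaussian activation of each sample against each centre
                d2 = ((X[:, None, :] - self.centres[None, :, :]) ** 2).sum(-1)
                return np.exp(-self.gamma * d2)

            def fit(self, X, y):
                self.centres = KMeans(self.n_centres, n_init=10, random_state=0).fit(X).cluster_centers_
                self.readout = Ridge(alpha=1e-3).fit(self._phi(X), y)
                return self

            def predict(self, X):
                return self.readout.predict(self._phi(X))

    In a multi-sensor setup, one such network per sensor would produce a local estimate, and a downstream fusion model (the paper's FNN) would combine them.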

  11. Gene-based comparative analysis of tools for estimating copy number alterations using whole-exome sequencing data

    Science.gov (United States)

    Kim, Hyung-Yong; Choi, Jin-Woo; Lee, Jeong-Yeon; Kong, Gu

    2017-01-01

    Accurate detection of copy number alterations (CNAs) using next-generation sequencing technology is essential for the development and application of more precise medical treatments for human cancer. Here, we evaluated seven CNA estimation tools (ExomeCNV, CoNIFER, VarScan2, CODEX, ngCGH, saasCNV, and falcon) using whole-exome sequencing data from 419 breast cancer tumor-normal sample pairs from The Cancer Genome Atlas. Estimations generated using each tool were converted into gene-based copy numbers; concordance for gains and losses and the sensitivity and specificity of each tool were compared to validated copy numbers from a single nucleotide polymorphism reference array. The concordance and sensitivity of the tumor-normal pair methods for estimating CNAs (saasCNV, ExomeCNV, and VarScan2) were better than those of the tumor batch methods (CoNIFER and CODEX). SaasCNV had the highest gain and loss concordances (65.0%), sensitivity (69.4%), and specificity (89.1%) for estimating copy number gains or losses. These findings indicate that improved CNA detection algorithms are needed to more accurately interpret whole-exome sequencing results in human cancer. PMID:28460482

  12. Something from nothing: Estimating consumption rates using propensity scores, with application to emissions reduction policies.

    Science.gov (United States)

    Bardsley, Nicholas; Büchs, Milena; Schnepf, Sylke V

    2017-01-01

    Consumption surveys often record zero purchases of a good because of a short observation window. Measures of distribution are then precluded and only mean consumption rates can be inferred. We show that Propensity Score Matching can be applied to recover the distribution of consumption rates. We demonstrate the method using the UK National Travel Survey, in which c.40% of motorist households purchase no fuel. Estimated consumption rates are plausible judging by households' annual mileages, and highly skewed. We apply the same approach to estimate CO2 emissions and outcomes of a carbon cap or tax. Reliance on means apparently distorts analysis of such policies because of skewness of the underlying distributions. The regressiveness of a simple tax or cap is overstated, and redistributive features of a revenue-neutral policy are understated.
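
    The matching idea can be sketched as follows, with synthetic data standing in for the travel survey; the covariates and model choices are illustrative assumptions rather than the authors' specification.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import NearestNeighbors

        rng = np.random.default_rng(1)

        # Synthetic stand-in for survey data: covariates, an indicator of any purchase in
        # the observation window, and the observed quantity (zero for non-purchasers).
        n = 1000
        X = rng.normal(size=(n, 3))                      # e.g. mileage, income, household size
        purchased = rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))
        quantity = np.where(purchased, np.exp(0.5 * X[:, 0] + rng.normal(0, 0.3, n)), 0.0)

        # 1. Estimate the propensity of recording a purchase from the covariates.
        ps = LogisticRegression().fit(X, purchased).predict_proba(X)[:, 1]

        # 2. Match each zero-purchase household to the purchaser with the closest propensity score.
        nn = NearestNeighbors(n_neighbors=1).fit(ps[purchased].reshape(-1, 1))
        _, idx = nn.kneighbors(ps[~purchased].reshape(-1, 1))

        # 3. Impute a consumption rate for non-purchasers from their matched purchaser,
        #    recovering a full distribution rather than just a mean.
        imputed = quantity[purchased][idx.ravel()]
        full_distribution = np.concatenate([quantity[purchased], imputed])
        print(np.percentile(full_distribution, [10, 50, 90]))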

  13. A Novel Coherence Reduction Method in Compressed Sensing for DOA Estimation

    Directory of Open Access Journals (Sweden)

    Jing Liu

    2013-01-01

    A novel method called the coherent column replacement method is proposed to reduce the coherence of a partially deterministic sensing matrix, which is comprised of highly coherent columns and random Gaussian columns. The proposed method is to replace the highly coherent columns with random Gaussian columns to obtain a new sensing matrix. The measurement vector is changed accordingly. It is proved that the original sparse signal could be reconstructed well from the newly changed measurement vector based on the new sensing matrix with large probability. This method is then extended to a more practical condition when highly coherent columns and incoherent columns are considered, for example, the direction of arrival (DOA) estimation problem in a phased array radar system using compressed sensing. Numerical simulations show that the proposed method succeeds in identifying multiple targets in a sparse radar scene, where the compressed sensing method based on the original sensing matrix fails. The proposed method also obtains more precise estimation of DOA using one snapshot compared with traditional estimation methods such as Capon, APES, and GLRT, based on hundreds of snapshots.

  14. Wavelet-based density estimation for noise reduction in plasma simulations using particles

    Science.gov (United States)

    van yen, Romain Nguyen; del-Castillo-Negrete, Diego; Schneider, Kai; Farge, Marie; Chen, Guangye

    2010-04-01

    For given computational resources, the accuracy of plasma simulations using particles is mainly limited by the noise due to limited statistical sampling in the reconstruction of the particle distribution function. A method based on wavelet analysis is proposed and tested to reduce this noise. The method, known as wavelet-based density estimation (WBDE), was previously introduced in the statistical literature to estimate probability densities given a finite number of independent measurements. Its novel application to plasma simulations can be viewed as a natural extension of the finite size particles (FSP) approach, with the advantage of estimating more accurately distribution functions that have localized sharp features. The proposed method preserves the moments of the particle distribution function to a good level of accuracy, has no constraints on the dimensionality of the system, does not require an a priori selection of a global smoothing scale, and is able to adapt locally to the smoothness of the density based on the given discrete particle data. Moreover, the computational cost of the denoising stage is of the same order as one time step of a FSP simulation. The method is compared with a recently proposed proper orthogonal decomposition based method, and it is tested with three particle data sets involving different levels of collisionality and interaction with external and self-consistent fields.
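
    A simplified analogue of the denoising step, using standard wavelet shrinkage on a binned particle distribution (PyWavelets), is sketched below; the wavelet family, decomposition level, and threshold rule are illustrative choices, not those of the WBDE implementation.

        import numpy as np
        import pywt

        rng = np.random.default_rng(2)

        # Particle positions drawn from a distribution with a localized sharp feature.
        x = np.concatenate([rng.normal(0.0, 1.0, 20000), rng.normal(2.0, 0.05, 2000)])

        # Bin the particles (the noisy "raw" density), then wavelet-denoise the histogram.
        counts, edges = np.histogram(x, bins=512, range=(-5, 5), density=True)
        coeffs = pywt.wavedec(counts, "db4", level=5)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745                 # noise scale from finest level
        thr = sigma * np.sqrt(2 * np.log(len(counts)))                 # universal threshold
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        density = pywt.waverec(coeffs, "db4")[: len(counts)]           # denoised density estimate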

  15. Beyond Self-Report: Tools to Compare Estimated and Real-World Smartphone Use

    Science.gov (United States)

    Andrews, Sally; Ellis, David A.; Shaw, Heather; Piwek, Lukasz

    2015-01-01

    Psychologists typically rely on self-report data when quantifying mobile phone usage, despite little evidence of its validity. In this paper we explore the accuracy of using self-reported estimates when compared with actual smartphone use. We also include source code to process and visualise these data. We compared 23 participants’ actual smartphone use over a two-week period with self-reported estimates and the Mobile Phone Problem Use Scale. Our results indicate that estimated time spent using a smartphone may be an adequate measure of use, unless a greater resolution of data is required. Estimates concerning the number of times an individual used their phone across a typical day did not correlate with actual smartphone use. Neither estimated duration nor number of uses correlated with the Mobile Phone Problem Use Scale. We conclude that estimated smartphone use should be interpreted with caution in psychological research. PMID:26509895

  16. The application of an analytical probabilistic model for estimating the rainfall-runoff reductions achieved using a rainwater harvesting system.

    Science.gov (United States)

    Kim, Hyoungjun; Han, Mooyoung; Lee, Ju Young

    2012-05-01

    Rainwater harvesting systems can not only supplement on-site water needs but also reduce water runoff and lessen downstream flooding. In this study, an existing analytical model for estimating runoff in urban areas is modified to provide a more economical and effective model that can be used for describing rainwater harvesting. This model calculates the rainfall-runoff reduction by taking into account the catchment, storage tank, and infiltration facility of a water harvesting system; this calculation is based on the water balance equation and the cumulative distribution, probability density, and average rainfall-runoff functions. This model was applied to a water harvesting system at Seoul National University in order to verify its practicality. The derived model was useful for evaluating runoff reduction and for designing the storage tank capacity.
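
    The water-balance bookkeeping behind such a model can be sketched with a simple daily simulation (illustrative only; the paper's analytical probabilistic model works with rainfall distributions rather than a time-stepping loop, and all parameter names below are assumptions).

        def runoff_reduction(rain_mm, roof_area_m2, tank_m3, daily_demand_m3, runoff_coeff=0.9):
            """Daily water-balance sketch: roof inflow fills the tank, demand is drawn from
            storage, and only tank overflow becomes runoff. Returns the runoff reduction fraction."""
            storage = 0.0
            runoff = 0.0
            total_inflow = 0.0
            for r in rain_mm:
                inflow = runoff_coeff * roof_area_m2 * r / 1000.0   # m3 captured on this day
                total_inflow += inflow
                storage += inflow
                if storage > tank_m3:                               # tank overflow leaves as runoff
                    runoff += storage - tank_m3
                    storage = tank_m3
                storage = max(storage - daily_demand_m3, 0.0)       # harvested water used on site
            return 1.0 - runoff / total_inflow if total_inflow else 0.0

        # Example: one wet month of daily rainfall (mm) for a 200 m2 roof, 5 m3 tank, 0.3 m3/day demand.
        print(runoff_reduction([0, 12, 0, 5, 30, 0, 0, 8, 22, 0] * 3, 200.0, 5.0, 0.3))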

  17. A quantum chemical based toxicity study of estimated reduction potential and hydrophobicity in series of nitroaromatic compounds.

    Science.gov (United States)

    Gooch, A; Sizochenko, N; Sviatenko, L; Gorb, L; Leszczynski, J

    2017-02-01

    Nitroaromatic compounds and the products of their degradation are toxic to bacteria, cells and animals. Various studies have been carried out to better understand the mechanism of toxicity of aromatic nitrocompounds and their relationship to humans and the environment. Recent data relate cytotoxicity of nitroaromatic compounds to their single- or two-electron enzymatic reduction. However, mechanisms of animal toxicity could be more complex. This work investigates the estimated reduction and oxidation potentials of 34 nitroaromatic compounds using quantum chemical approaches. All geometries were optimized with density functional theory (DFT) using the solvation model based on density (SMD) and polarizable continuum model (PCM) solvent model protocols. Quantitative structure-activity/property (QSAR/QSPR) models were developed using descriptors obtained from quantum chemical optimizations as well as the DRAGON software program. The QSAR/QSPR equations developed consist of two to four descriptors. Correlations have been identified between electron affinity (ELUMO) and hydrophobicity (log P).

  18. Time-dependent estimates of recurrence and survival in colon cancer: clinical decision support system tool development for adjuvant therapy and oncological outcome assessment.

    Science.gov (United States)

    Steele, Scott R; Bilchik, Anton; Johnson, Eric K; Nissan, Aviram; Peoples, George E; Eberhardt, John S; Kalina, Philip; Petersen, Benjamin; Brücher, Björn; Protic, Mladjan; Avital, Itzhak; Stojadinovic, Alexander

    2014-05-01

    Unanswered questions remain in determining which high-risk node-negative colon cancer (CC) cohorts benefit from adjuvant therapy and how it may differ in an equal access population. Machine-learned Bayesian Belief Networks (ml-BBNs) accurately estimate outcomes in CC, providing clinicians with Clinical Decision Support System (CDSS) tools to facilitate treatment planning. We evaluated the ability of ml-BBNs to estimate survival and recurrence in CC. We performed a retrospective analysis of registry data of patients with CC to train, test, and crossvalidate ml-BBNs using the Department of Defense Automated Central Tumor Registry (January 1993 to December 2004). Cases with events or follow-up that passed quality control were stratified into 1-, 2-, 3-, and 5-year survival cohorts. ml-BBNs were trained using machine-learning algorithms and k-fold crossvalidation, and receiver operating characteristic curve analysis was used for validation. BBNs comprised 5301 patients, and areas under the curve ranged from 0.85 to 0.90. Positive predictive values for recurrence and mortality ranged from 78 to 84 per cent and negative predictive values from 74 to 90 per cent by survival cohort. In the 12-month model alone, 1,132,462,080 unique rule sets allow physicians to predict individual recurrence/mortality estimates. Patients with Stage II (N0M0) CC benefit from chemotherapy at different rates. At one year, all patients older than 73 years of age with T2-4 tumors and abnormal carcinoembryonic antigen levels benefited, whereas at five years, all had a relative reduction in mortality with the largest benefit amongst elderly, highest T-stage patients. ml-BBNs can readily predict which high-risk patients benefit from adjuvant therapy. CDSS tools yield individualized, clinically relevant estimates of outcomes to assist clinicians in treatment planning.

  19. SU-F-P-19: Fetal Dose Estimate for a High-Dose Fluoroscopy Guided Intervention Using Modern Data Tools

    Energy Technology Data Exchange (ETDEWEB)

    Moirano, J [University of Washington, Seattle, WA (United States)

    2016-06-15

    Purpose: An accurate dose estimate is necessary for effective patient management after a fetal exposure. In the case of a high-dose exposure, it is critical to use all resources available in order to make the most accurate assessment of the fetal dose. This work will demonstrate a methodology for accurate fetal dose estimation using tools that have recently become available in many clinics, and show examples of best practices for collecting data and performing the fetal dose calculation. Methods: A fetal dose estimate calculation was performed using modern data collection tools to determine parameters for the calculation. The reference point air kerma as displayed by the fluoroscopic system was checked for accuracy. A cumulative dose incidence map and DICOM header mining were used to determine the displayed reference point air kerma. Corrections for attenuation caused by the patient table and pad were measured and applied in order to determine the peak skin dose. The position and depth of the fetus were determined by ultrasound imaging and consultation with a radiologist. The data collected were used to determine a normalized uterus dose from Monte Carlo simulation data. Fetal dose values from this process were compared to other accepted calculation methods. Results: An accurate high-dose fetal dose estimate was made. Comparisons to accepted legacy methods were within 35% of estimated values. Conclusion: Modern data collection and reporting methods ease the process of estimating fetal dose from interventional fluoroscopy exposures. Many aspects of the calculation can now be quantified rather than estimated, which should allow for a more accurate estimation of fetal dose.
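
    The arithmetic chain described above can be sketched as follows; every numeric value is a hypothetical placeholder, and the conversion factors would in practice come from measurements and published Monte Carlo tables.

        # Illustrative only; all values below are hypothetical placeholders.
        K_ref = 3.2                       # Gy, cumulative reference-point air kerma from the dose report
        table_pad_transmission = 0.75     # measured transmission through the patient table and pad
        backscatter = 1.35                # assumed backscatter factor for the beam qualities used
        distance_correction = (60.0 / 75.0) ** 2   # inverse-square scaling from reference point to skin plane

        peak_skin_dose_gy = K_ref * distance_correction * table_pad_transmission * backscatter

        # Normalized uterus dose (mGy per Gy of reference-point air kerma), interpolated from
        # Monte Carlo tables for the fetal depth measured by ultrasound (hypothetical value).
        uterus_mgy_per_gy = 40.0
        fetal_dose_mgy = K_ref * uterus_mgy_per_gy
        print(f"peak skin dose ~ {peak_skin_dose_gy:.2f} Gy, fetal dose ~ {fetal_dose_mgy:.0f} mGy")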

  20. Estimating 'lost heart beats' rather than reductions in heart rate during the intubation of critically-ill children.

    Science.gov (United States)

    Jones, Peter; Ovenden, Nick; Dauger, Stéphane; Peters, Mark J

    2014-01-01

    Reductions in heart rate occur frequently in children during critical care intubation and are currently considered the gold standard for haemodynamic instability. Our objective was to estimate the loss of heart beats during intubation and compare this to the reduction in heart rate alone, whilst testing the impact of atropine pre-medication. Data were extracted from a prospective 2-year cohort study of intubation ECGs from critically ill children in PICU/Paediatric Transport. A three-step algorithm was established to exclude variation in pre-intubation heart rate (using a 95% CI limit derived from the pre-intubation heart rate variation of the children included), measure the heart rate over time, and finally estimate the number of lost beats. 333 intubations in children were eligible for inclusion, of which 245 were available for analysis (74%). Intubations where the fall in heart rate was less than 50 bpm were accompanied almost exclusively by less than 25 lost beats (n = 175, median 0 [0-1]). When there was a reduction of >50 bpm there was a poor correlation with the number of lost beats (n = 70, median 42 [15-83]). During intubation the median number of lost beats was 8 [1-32] when atropine was not used compared to 0 [0-0] when atropine was used. When the fall in heart rate during intubation was >50 bpm, the heart rate was poorly predictive of lost beats. A study looking at the relationship between lost beats and cardiac output needs to be performed. Atropine reduces both the fall in heart rate and the loss of beats. Similar area-under-the-curve methodology may be useful for estimating risk when biological parameters deviate outside the normal range.
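
    An area-under-the-curve estimate of lost beats can be sketched as below; the sampling interval, baseline handling, and example numbers are illustrative, not taken from the study.

        import numpy as np

        def lost_beats(times_s, hr_bpm, baseline_bpm):
            """Estimate 'lost beats' as the area between the pre-intubation baseline heart rate
            and the observed heart rate, integrated over the intubation period (trapezoid rule)."""
            deficit_bps = np.clip(baseline_bpm - np.asarray(hr_bpm), 0.0, None) / 60.0  # beats/s lost
            return np.trapz(deficit_bps, np.asarray(times_s))

        # Example: HR sampled every 5 s during a 60 s intubation, baseline 140 bpm.
        t = np.arange(0, 65, 5)
        hr = np.array([140, 138, 130, 110, 90, 80, 85, 100, 120, 130, 135, 138, 140])
        print(lost_beats(t, hr, baseline_bpm=140.0))   # approximate number of lost beats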

  1. Deflation as a Method of Variance Reduction for Estimating the Trace of a Matrix Inverse

    CERN Document Server

    Gambhir, Arjun Singh; Orginos, Kostas

    2016-01-01

    Many fields require computing the trace of the inverse of a large, sparse matrix. The typical method used for such computations is the Hutchinson method which is a Monte Carlo (MC) averaging over matrix quadratures. To improve its convergence, several variance reduction techniques have been proposed. In this paper, we study the effects of deflating the near null singular value space. We make two main contributions. First, we analyze the variance of the Hutchinson method as a function of the deflated singular values and vectors. Although this provides good intuition in general, by assuming additionally that the singular vectors are random unitary matrices, we arrive at concise formulas for the deflated variance that include only the variance and mean of the singular values. We make the remarkable observation that deflation may increase variance for Hermitian matrices but not for non-Hermitian ones. This is a rare, if not unique, property where non-Hermitian matrices outperform Hermitian ones. The theory can b...
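
    A small dense-matrix sketch of the Hutchinson estimator with deflation of the smallest singular triplets is given below. A production code would use a sparse solver and an iterative SVD; the projector construction shown is one possible formulation, not necessarily the authors'.

        import numpy as np

        def hutchinson_trace_inv(A, n_samples=200, k_deflate=0, seed=0):
            """Monte Carlo (Hutchinson) estimate of tr(A^-1); optionally deflates the k smallest
            singular triplets, whose exact contribution is added back deterministically."""
            rng = np.random.default_rng(seed)
            n = A.shape[0]
            P = np.eye(n)
            correction = 0.0
            if k_deflate > 0:
                U, s, Vt = np.linalg.svd(A)              # dense SVD for illustration; iterative in practice
                U_k, s_k, V_k = U[:, -k_deflate:], s[-k_deflate:], Vt[-k_deflate:].T
                correction = np.sum((U_k * V_k).sum(axis=0) / s_k)   # exact trace of the deflated part
                P -= U_k @ U_k.T                         # remove that subspace from the stochastic part
            samples = []
            for _ in range(n_samples):
                z = rng.choice([-1.0, 1.0], size=n)      # Rademacher probe vector
                samples.append(z @ np.linalg.solve(A, P @ z))
            return np.mean(samples) + correction, np.std(samples) / np.sqrt(n_samples)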

  2. Estimation of Power Production Potential from Natural Gas Pressure Reduction Stations in Pakistan Using ASPEN HYSYS

    Directory of Open Access Journals (Sweden)

    Imran Nazir Unar

    2015-07-01

    Pakistan is a gas-rich but power-poor country. It consumes approximately 1,559 billion cubic feet of natural gas annually. Gas is transported around the country in a system of pressurized transmission pipelines under a pressure range of 600-1000 psig, exclusively operated by two state-owned companies, i.e. SNGPL (Sui Northern Gas Pipelines Limited) and SSGCL (Sui Southern Gas Company Limited). The gas is distributed by reducing it from the transmission pressure to a distribution pressure of at most 150 psig at the city gate stations, normally called SMS (Sales Metering Stations). As normal practice, gas pressure reduction at those SMSs is accomplished in pressure regulators (PCVs) or in throttle valves, where isenthalpic expansion takes place without producing any energy. The pressure potential of natural gas is an untapped energy resource which is currently wasted by throttling. This pressure reduction at an SMS (pressure drop through the SMS) may also be achieved by expansion of the natural gas in a TE, which converts its pressure into mechanical energy that can be transmitted to any loading device, for example an electric generator. The aim of the present paper is to explore the expected power production potential of various Sales Metering Stations of the SSGCL company in Pakistan. The model of a sales metering station was developed in the standard flow-sheeting software Aspen HYSYS® 7.1 to calculate power and study other parameters when an expansion turbine is used instead of throttling valves. It was observed from the simulation results that significant power (more than 140 kW) can be produced at pressure reducing stations of the SSGC network with gas flows more than 2.2 MMSCFD and pressure ratios more than 1.3.
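
    The recoverable power can be approximated with a simple isentropic expansion estimate. This is an ideal-gas sketch with assumed gas properties, inlet temperature, and efficiency, not the Aspen HYSYS model used in the paper.

        def expander_power_kw(flow_mmscfd, p_in_psia, p_out_psia, t_in_k=288.0,
                              k=1.30, cp_kj_per_kg_k=2.34, eta=0.75):
            """Rough isentropic turbo-expander power for a natural gas let-down station.
            flow_mmscfd: gas flow in million standard cubic feet per day."""
            # Convert MMSCFD to kg/s, assuming ~0.8 kg per standard cubic metre of natural gas.
            m_dot = flow_mmscfd * 1e6 * 0.0283168 * 0.8 / 86400.0
            # Ideal-gas isentropic temperature drop across the pressure ratio.
            t_drop = t_in_k * (1.0 - (p_out_psia / p_in_psia) ** ((k - 1.0) / k))
            return eta * m_dot * cp_kj_per_kg_k * t_drop          # kW

        # e.g. 2.2 MMSCFD let down from roughly 600 psig to 150 psig (illustrative values)
        print(expander_power_kw(2.2, 615.0, 165.0))

    In practice the result depends strongly on gas composition, inlet temperature (preheating) and machine efficiency, which is why a process simulator is used for the actual study.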

  3. Vibration reduction on a nonlinear flexible structure through resonant control and disturbance estimator

    Science.gov (United States)

    Cazzulani, Gabriele; Resta, Ferruccio; Ripamonti, Francesco

    2012-04-01

    Large mechanical structures are often affected by high level vibrations due to their flexibility. These vibrations can reduce the system performances and lifetime and the use of active vibration control strategies becomes very attractive. In this paper a combination of resonant control and a disturbance estimator is proposed. This solution is able to improve the system performances during the transient motion and also to reject the disturbance forces acting on the system. Both control logics are based on a modal approach, since it allows to describe the structure dynamics considering only few degrees of freedom.

  4. Bias reduction for Satellite Based Precipitation Estimates using statistical transformations in Guiana Shield

    Science.gov (United States)

    Ringard, Justine; Becker, Melanie; Seyler, Frederique; Linguet, Laurent

    2016-04-01

    Currently, satellite-based precipitation estimates exhibit considerable biases, and there have been many efforts to reduce these biases by merging surface gauge measurements with satellite-based estimates. In the Guiana Shield, all products exhibited better performance during the dry season (August-December). All products greatly overestimate very low intensities (50 mm). Moreover, the response of each product differs according to the hydro-climatic regime. The aim of this study is to spatially correct the bias of precipitation estimates and to compare various correction methods in order to identify the best method depending on the rainfall characteristic being corrected (intensity, frequency). Four satellite products are used: the Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis (TMPA) research product (3B42V7) and real-time product (3B42RT), the Precipitation Estimation from Remotely-Sensed Information using Artificial Neural Network (PERSIANN), and the NOAA Climate Prediction Center (CPC) Morphing technique (CMORPH), for six hydro-climatic regimes between 2001 and 2012. Several statistical transformations are used to correct the bias. Statistical transformations attempt to find a function h that maps a simulated variable Ps such that its new distribution equals the distribution of the observed variable Po. The first is a distribution-derived transformation based on a mixture of the Bernoulli and Gamma distributions, where the Bernoulli distribution is used to model the probability of precipitation occurrence and the Gamma distribution is used to model precipitation intensities. The second is a quantile-quantile relation using a parametric transformation, and the last is a common approach using the empirical CDF of observed and modelled values instead of assuming parametric distributions. For each correction, 30% of both the simulated and observed data sets are used for calibration and the remainder for validation. The validation is tested with statistical
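
    The empirical-CDF variant can be sketched as a plain quantile-mapping function (illustrative; it ignores wet-day frequency handling and the Bernoulli-Gamma variant described above, and the calibration split is only noted in a comment).

        import numpy as np

        def empirical_quantile_mapping(sat, obs, sat_new):
            """Empirical CDF quantile mapping: map each new satellite value to the observed
            value holding the same non-exceedance probability in the calibration period."""
            sat_sorted = np.sort(sat)
            obs_sorted = np.sort(obs)
            probs = np.linspace(0.0, 1.0, len(sat_sorted))
            p_new = np.interp(sat_new, sat_sorted, probs)          # probability under the satellite CDF
            return np.interp(p_new, np.linspace(0.0, 1.0, len(obs_sorted)), obs_sorted)

        # Calibrate on ~30% of the paired record and apply to the remainder, as in the study design.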

  5. U-AVLIS feed conversion using continuous metallothermic reduction of UF4: System description and cost estimate

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-01

    The purpose of this document is to present a system description and develop baseline capital and operating cost estimates for commercial facilities that produce U-Fe feedstock for AVLIS enrichment plants using the continuous fluoride reduction (CFR) process. These costs can then be used, together with appropriate economic assumptions, to calculate estimated unit costs to the AVLIS plant owner (or utility customer) for such conversion services. Six cases are examined. All cases assume that the conversion services are performed by a private company at a commercial site which has an existing NRC license to possess source material and which has existing uranium processing operations. The cases differ in terms of annual production capacity and whether the new process system is installed in a new building or in an existing building on the site. The six cases are summarized here.

  6. Effect of large weight reductions on measured and estimated kidney function

    DEFF Research Database (Denmark)

    von Scholten, Bernt Johan; Persson, Frederik; Svane, Maria S

    2017-01-01

    BACKGROUND: When patients experience large weight loss, muscle mass may be affected followed by changes in plasma creatinine (pCr). The MDRD and CKD-EPI equations for estimated GFR (eGFR) include pCr. We hypothesised that a large weight loss reduces muscle mass and pCr causing increase in e......GFR was assessed during four hours plasma (51)Cr-EDTA clearance. GFR was estimated by four equations (MDRD, CKD-EPI-pCr, CKD-EPI-cysC and CKD-EPI-pCr-cysC). DXA-scans were performed at baseline and six months post-surgery to measure changes in lean limb mass, as a surrogate for muscle mass. RESULTS: Patients were....../min (p = 0.024), but corrected for current body surface area (BSA) mGFR was unchanged by 2 (-5; 9) ml/min/1.73 m(2) (p = 0.52). CKD-EPI-pCr increased by 12 (6; 17) and MDRD by 13 (8; 18) (p CKD-EPI-cysC was unchanged by 2 (-8; 4) ml/min/1.73 m(2) (p = 0.51). Lean limb mass...

  7. Use of statistical tools to evaluate the reductive dechlorination of high levels of TCE in microcosm studies.

    Science.gov (United States)

    Harkness, Mark; Fisher, Angela; Lee, Michael D; Mack, E Erin; Payne, Jo Ann; Dworatzek, Sandra; Roberts, Jeff; Acheson, Carolyn; Herrmann, Ronald; Possolo, Antonio

    2012-04-01

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study was designed as a fractional factorial experiment involving 177 bottles distributed between four industrial laboratories and was used to assess the impact of six electron donors, bioaugmentation, addition of supplemental nutrients, and two TCE levels (0.57 and 1.90 mM or 75 and 250 mg/L in the aqueous phase) on TCE dechlorination. Performance was assessed based on the concentration changes of TCE and reductive dechlorination degradation products. The chemical data was evaluated using analysis of variance (ANOVA) and survival analysis techniques to determine both main effects and important interactions for all the experimental variables during the 203-day study. The statistically based design and analysis provided powerful tools that aided decision-making for field application of this technology. The analysis showed that emulsified vegetable oil (EVO), lactate, and methanol were the most effective electron donors, promoting rapid and complete dechlorination of TCE to ethene. Bioaugmentation and nutrient addition also had a statistically significant positive impact on TCE dechlorination. In addition, the microbial community was measured using phospholipid fatty acid analysis (PLFA) for quantification of total biomass and characterization of the community structure and quantitative polymerase chain reaction (qPCR) for enumeration of Dehalococcoides organisms (Dhc) and the vinyl chloride reductase (vcrA) gene. The highest increase in levels of total biomass and Dhc was observed in the EVO microcosms, which correlated well with the dechlorination results.

  8. Children with developmental coordination disorder demonstrate a spatial mismatch when estimating coincident-timing ability with tools.

    Science.gov (United States)

    Caçola, Priscila; Ibana, Melvin; Ricard, Mark; Gabbard, Carl

    2016-01-01

    Coincident timing or interception ability can be defined as the capacity to precisely time sensory input and motor output. This study compared accuracy of typically developing (TD) children and those with Developmental Coordination Disorder (DCD) on a task involving estimation of coincident timing with their arm and various tool lengths. Forty-eight (48) participants performed two experiments where they imagined intercepting a target moving toward (Experiment 1) and target moving away (Experiment 2) from them in 5 conditions with their arm and tool lengths: arm, 10, 20, 30, and 40 cm. In Experiment 1, the DCD group overestimated interception points approximately twice as much as the TD group, and both groups overestimated consistently regardless of the tool used. Results for Experiment 2 revealed that those with DCD underestimated about three times as much as the TD group, with the exception of when no tool was used. Overall, these results indicate that children with DCD are less accurate with estimation of coincident-timing; which might in part explain their difficulties with common motor activities such as catching a ball or striking a baseball pitch.

  9. A Novel 2-D Coherent DOA Estimation Method Based on Dimension Reduction Sparse Reconstruction for Orthogonal Arrays.

    Science.gov (United States)

    Wang, Xiuhong; Mao, Xingpeng; Wang, Yiming; Zhang, Naitong; Li, Bo

    2016-09-15

    Based on sparse representations, the problem of two-dimensional (2-D) direction of arrival (DOA) estimation is addressed in this paper. A novel sparse 2-D DOA estimation method, called Dimension Reduction Sparse Reconstruction (DRSR), is proposed with pairing by Spatial Spectrum Reconstruction of Sub-Dictionary (SSRSD). By utilizing the angle decoupling method, which transforms a 2-D estimation into two independent one-dimensional (1-D) estimations, the high computational complexity induced by a large 2-D redundant dictionary is greatly reduced. Furthermore, a new angle matching scheme, SSRSD, which is less sensitive to the sparse reconstruction error and has a higher pair-matching probability, is introduced. The proposed method can be applied to any type of orthogonal array without requiring a large number of snapshots or a priori knowledge of the number of signals. The theoretical analyses and simulation results show that the DRSR-SSRSD method performs well for coherent signals, with performance approaching the Cramer-Rao bound (CRB), even under a single snapshot and low signal-to-noise ratio (SNR) condition.

  10. A Novel 2-D Coherent DOA Estimation Method Based on Dimension Reduction Sparse Reconstruction for Orthogonal Arrays

    Directory of Open Access Journals (Sweden)

    Xiuhong Wang

    2016-09-01

    Based on sparse representations, the problem of two-dimensional (2-D) direction of arrival (DOA) estimation is addressed in this paper. A novel sparse 2-D DOA estimation method, called Dimension Reduction Sparse Reconstruction (DRSR), is proposed with pairing by Spatial Spectrum Reconstruction of Sub-Dictionary (SSRSD). By utilizing the angle decoupling method, which transforms a 2-D estimation into two independent one-dimensional (1-D) estimations, the high computational complexity induced by a large 2-D redundant dictionary is greatly reduced. Furthermore, a new angle matching scheme, SSRSD, which is less sensitive to the sparse reconstruction error and has a higher pair-matching probability, is introduced. The proposed method can be applied to any type of orthogonal array without requiring a large number of snapshots or a priori knowledge of the number of signals. The theoretical analyses and simulation results show that the DRSR-SSRSD method performs well for coherent signals, with performance approaching the Cramer-Rao bound (CRB), even under a single snapshot and low signal-to-noise ratio (SNR) condition.

  11. Spatial factor analysis: a new tool for estimating joint species distributions and correlations in species range

    DEFF Research Database (Denmark)

    Thorson, James T.; Scheuerell, Mark D.; Shelton, Andrew O.;

    2015-01-01

    1. Predicting and explaining the distribution and density of species is one of the oldest concerns in ecology. Species distributions can be estimated using geostatistical methods, which estimate a latent spatial variable explaining observed variation in densities, but geostatistical methods may...... be imprecise for species with low densities or few observations. Additionally, simple geostatistical methods fail to account for correlations in distribution among species and generally estimate such cross-correlations as a post hoc exercise. 2. We therefore present spatial factor analysis (SFA), a spatial...

  12. Wind turbine noise reduction. An indicative cost estimation; Sanering windturbinegeluid. Een indicatieve raming van kosten

    Energy Technology Data Exchange (ETDEWEB)

    Verheijen, E.N.G.; Jabben, J.

    2011-11-15

    Since 1 January 2011, new rules apply to wind turbine noise. The rules include a different calculation method and different noise limits, intended for new wind turbines. In order to tackle noise annoyance from existing wind turbines, the government is considering setting up an abatement operation, for which a cost estimate is given in this study. At an abatement limit of 47 decibels Lden (day-evening-night level), approximately 450 dwellings would be eligible for noise remediation. The costs of this operation are estimated at 4.9 million euro. However, in many of these cases the wind turbine is probably owned by the residents themselves. It is possible that public funds for noise remediation will not be allocated to the owners of dwellings that directly profit from the turbines. If these cases are excluded, the abatement operation would cover 165 to 275 dwellings, with estimated remediation costs of 1.6 to 2.6 million euro. A tentative cost-benefit analysis suggests that noise remediation will be cost effective in most situations, meaning that the benefits of reduced annoyance or sleep disturbance are in balance with the cost of remediation. Only for the small group of wind turbines that have been in use for over fifteen years will remediation not be cost effective. These wind turbines are nearing the end of their lifespan and are therefore ignored in the above estimates. [Dutch, translated] Since 1 January 2011, new rules on wind turbine noise have been in force. The new regulations involve a different calculation method and different limits, intended for newly installed wind turbines. To address noise annoyance from existing wind turbines, the government is considering setting up a remediation operation, for which this study provides a cost estimate. At a remediation limit of 47 decibels, approximately 450 dwellings would qualify for remediation. The cost of remediating these dwellings is estimated at 4.9 million euro. For a large proportion of these

  13. Rapid Estimation of TPH Reduction in Oil-Contaminated Soils Using the MED Method

    Energy Technology Data Exchange (ETDEWEB)

    Edenborn, H.M.; Zenone, V.A. (US EPA, Philadelphia, PA)

    2007-09-01

    Oil-contaminated soil and sludge generated during federal well plugging activities in northwestern Pennsylvania are currently remediated on small landfarm sites in lieu of more expensive landfill disposal. Bioremediation success at these sites in the past has been gauged by the decrease in total petroleum hydrocarbon (TPH) concentrations to less than 10,000 mg/kg measured using EPA Method 418.1. We tested the “molarity of ethanol droplet” (MED) water repellency test as a rapid indicator of TPH concentration in soil at one landfarm near Bradford, PA. MED was estimated by determining the minimum ethanol concentration (0-6 M) required to penetrate air-dried and sieved soil samples within 10 sec. TPH in soil was analyzed by rapid fluorometric analysis of methanol soil extracts, which correlated well with EPA Method 1664. Uncontaminated landfarm site soil amended with increasing concentrations of waste oil sludge showed a high correlation between MED and TPH. MED values exceeded the upper limit of 6 M as TPH estimates exceeded ca. 25,000 mg/kg. MED and TPH at the landfarm were sampled monthly during summer months over two years in a grid pattern that allowed spatial comparisons of site remediation effectiveness. MED and TPH decreased at a constant rate over time and remained highly correlated. Inexpensive alternatives to reagent-grade ethanol gave comparable results. The simple MED approach served as an inexpensive alternative to routine laboratory analysis of TPH during the monitoring of oily waste bioremediation at this landfarm site.

  14. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
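
    A minimal numerical analogue, estimating mutual information between two uncertain quantities from an ensemble of realisations by simple binning, is sketched below; the binning scheme and the synthetic example are illustrative, not the paper's workflow.

        import numpy as np

        def entropy(p):
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        def mutual_information(a, b, bins=5):
            """Mutual information (bits) between two uncertain model quantities, estimated
            from an ensemble of simulated realisations by discretising each into a few bins."""
            joint, _, _ = np.histogram2d(a, b, bins=bins)
            p_ab = joint / joint.sum()
            p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)
            return entropy(p_a) + entropy(p_b) - entropy(p_ab.ravel())

        # e.g. layer depth sampled at two map locations across 1000 model realisations
        rng = np.random.default_rng(3)
        shared = rng.normal(size=1000)
        depth_x = shared + rng.normal(scale=0.5, size=1000)
        depth_y = shared + rng.normal(scale=0.5, size=1000)
        print(mutual_information(depth_x, depth_y))   # >0: data at one location reduces uncertainty at the other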

  15. Sediment traps as a new tool for estimation of longevity of planktonic foraminifera

    Digital Repository Service at National Institute of Oceanography (India)

    Nigam, R.

    Sediment trap technique provides time series data of sinking particles (faunal and sediment) from surface to bottom of the sea. Besides many other applications, data can also be used to estimate life span of planktonic foraminifera. Based on rearing...

  16. Estimating Storm Discharge and Water Quality Data Uncertainty: A Software Tool for Monitoring and Modeling Applications

    Science.gov (United States)

    Uncertainty inherent in hydrologic and water quality data has numerous economic, societal, and environmental implications; therefore, scientists can no longer ignore measurement uncertainty when collecting and presenting these data. Reporting uncertainty estimates with measured hydrologic and water...

  17. Model reduction and frequency residuals for a robust estimation of nonlinearities in subspace identification

    Science.gov (United States)

    De Filippis, G.; Noël, J. P.; Kerschen, G.; Soria, L.; Stephan, C.

    2017-09-01

    The introduction of the frequency-domain nonlinear subspace identification (FNSI) method in 2013 constitutes one in a series of recent attempts toward developing a realistic, first-generation framework applicable to complex structures. While this method showed promising capabilities when applied to academic structures, it is still confronted with a number of limitations which need to be addressed. In particular, the removal of nonphysical poles in the identified nonlinear models is a distinct challenge. In the present paper, it is proposed as a first contribution to operate directly on the identified state-space matrices to carry out spurious pole removal. A modal-space decomposition of the state and output matrices is examined to discriminate genuine from numerical poles, prior to estimating the extended input and feedthrough matrices. The final state-space model thus contains physical information only and naturally leads to nonlinear coefficients free of spurious variations. Besides spurious variations due to nonphysical poles, vibration modes lying outside the frequency band of interest may also produce drifts of the nonlinear coefficients. The second contribution of the paper is to include residual terms accounting for the existence of these modes. The proposed improved FNSI methodology is validated numerically and experimentally using a full-scale structure, the Morane-Saulnier Paris aircraft.

  18. Side-by-side ANFIS as a useful tool for estimating correlated thermophysical properties

    Science.gov (United States)

    Grieu, Stéphane; Faugeroux, Olivier; Traoré, Adama; Claudet, Bernard; Bodnar, Jean-Luc

    2015-12-01

    In the present paper, an artificial intelligence-based approach dealing with the estimation of correlated thermophysical properties is designed and evaluated. This new and "intelligent" approach makes use of photothermal responses obtained when homogeneous materials are subjected to a light flux. Commonly, gradient-based algorithms are used as parameter estimation techniques. Unfortunately, such algorithms show instabilities leading to non-convergence in case of correlated properties to be estimated from a rebuilt impulse response. So, the main objective of the present work was to simultaneously estimate both the thermal diffusivity and conductivity of homogeneous materials, from front-face or rear-face photothermal responses to pseudo random binary signals. To this end, we used side-by-side neuro-fuzzy systems (adaptive network-based fuzzy inference systems) trained with a hybrid algorithm. We focused on the impact on generalization of both the examples used during training and the fuzzification process. In addition, computation time was a key point to consider. That is why the developed algorithm is computationally tractable and allows both the thermal diffusivity and conductivity of homogeneous materials to be simultaneously estimated with very good accuracy (the generalization error ranges between 4.6% and 6.2%).

  19. Autologous fat transplantation: volumetric tools for estimation of volume survival. A systematic review.

    Science.gov (United States)

    Herold, Christian; Ueberreiter, Klaus; Busche, Marc N; Vogt, Peter M

    2013-04-01

    Autologous fat transplantation has gained great recognition in aesthetic and reconstructive surgery. Two main aspects are of predominant importance for progress control after autologous fat transplantation to the breast: quantitative information about the rate of fat survival in terms of effective volume persistence and qualitative information about the breast tissue to exclude potential complications of autologous fat transplantation. There are several tools available for use in evaluating the rate of volume survival. They are extensively compared in this review. The anthropometric method, thermoplastic casts, and Archimedes' principle of water displacement are now outdated because of major drawbacks, first and foremost their limited reproducibility and accuracy. They have been replaced by more exact and reproducible tools such as MRI volumetry or 3D body surface scans. For qualitative and quantitative progress control, MRI volumetry offers all the necessary information: evaluation of fat survival and diagnostically valuable imaging to exclude possible complications of autologous fat transplantation. For frequent follow-up, e.g., monthly volume analysis, repeated MRI exams would burden the patient and are not cost-effective. In these cases, 3D surface imaging is a good tool and especially helpful in a private practice setting where fast data acquisition is needed. This tool also offers the possibility of simulating the results of autologous fat transplantation.

  20. Tsunami evacuation modelling as a tool for risk reduction: application to the coastal area of El Salvador

    Science.gov (United States)

    González-Riancho, P.; Aguirre-Ayerbe, I.; Aniel-Quiroga, I.; Abad, S.; González, M.; Larreynaga, J.; Gavidia, F.; Gutiérrez, O. Q.; Álvarez-Gómez, J. A.; Medina, R.

    2013-12-01

    Advances in the understanding and prediction of tsunami impacts allow the development of risk reduction strategies for tsunami-prone areas. This paper presents an integral framework for the formulation of tsunami evacuation plans based on tsunami vulnerability assessment and evacuation modelling. This framework considers (i) the hazard aspects (tsunami flooding characteristics and arrival time), (ii) the characteristics of the exposed area (people, shelters and road network), (iii) the current tsunami warning procedures and timing, (iv) the time needed to evacuate the population, and (v) the identification of measures to improve the evacuation process. The proposed methodological framework aims to bridge between risk assessment and risk management in terms of tsunami evacuation, as it allows for an estimation of the degree of evacuation success of specific management options, as well as for the classification and prioritization of the gathered information, in order to formulate an optimal evacuation plan. The framework has been applied to the El Salvador case study, demonstrating its applicability to site-specific response times and population characteristics.

  1. qpure: A tool to estimate tumor cellularity from genome-wide single-nucleotide polymorphism profiles.

    Science.gov (United States)

    Song, Sarah; Nones, Katia; Miller, David; Harliwong, Ivon; Kassahn, Karin S; Pinese, Mark; Pajic, Marina; Gill, Anthony J; Johns, Amber L; Anderson, Matthew; Holmes, Oliver; Leonard, Conrad; Taylor, Darrin; Wood, Scott; Xu, Qinying; Newell, Felicity; Cowley, Mark J; Wu, Jianmin; Wilson, Peter; Fink, Lynn; Biankin, Andrew V; Waddell, Nic; Grimmond, Sean M; Pearson, John V

    2012-01-01

    Tumour cellularity, the relative proportion of tumour and normal cells in a sample, affects the sensitivity of mutation detection, copy number analysis, cancer gene expression and methylation profiling. Tumour cellularity is traditionally estimated by pathological review of sectioned specimens; however this method is both subjective and prone to error due to heterogeneity within lesions and cellularity differences between the sample viewed during pathological review and tissue used for research purposes. In this paper we describe a statistical model to estimate tumour cellularity from SNP array profiles of paired tumour and normal samples using shifts in SNP allele frequency at regions of loss of heterozygosity (LOH) in the tumour. We also provide qpure, a software implementation of the method. Our experiments showed a moderate correlation of 0.42 (P-value = 0.0001) between tumour cellularity estimated by qpure and pathology review. Interestingly, there is a high correlation of 0.87 (P-value < 2.2e-16) between cellularity estimates by qpure and deep Ion Torrent sequencing of known somatic KRAS mutations, and a weaker correlation of 0.32 (P-value = 0.004) between Ion Torrent sequencing and pathology review. This suggests that qpure may be a more accurate predictor of tumour cellularity than pathology review. qpure can be downloaded from https://sourceforge.net/projects/qpure/.
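
    The published model is considerably more elaborate, but the core allele-frequency reasoning can be illustrated with a rough sketch (assuming copy-neutral LOH and ignoring the copy-number states qpure models explicitly): in a mixture with tumour fraction c, the B-allele frequency of a germline-heterozygous SNP inside an LOH region is expected to shift from 0.5 by about c/2.

    ```python
    import numpy as np

    def cellularity_from_loh_baf(baf):
        """Crude tumour-cellularity estimate from B-allele frequencies (BAF) of
        germline-heterozygous SNPs inside a copy-neutral LOH region.

        Pure normal tissue gives BAF ~ 0.5; a pure tumour with copy-neutral LOH
        gives ~0 or ~1.  For tumour fraction c the expected BAF is 0.5 +/- c/2,
        so c = 2 * |BAF - 0.5| (median taken for robustness).  This is only an
        illustration, not qpure's statistical model.
        """
        baf = np.asarray(baf, dtype=float)
        return float(np.clip(2.0 * np.median(np.abs(baf - 0.5)), 0.0, 1.0))

    # BAFs clustering near 0.2 and 0.8 imply roughly 60% tumour cells.
    print(cellularity_from_loh_baf([0.19, 0.21, 0.80, 0.79, 0.22, 0.81]))
    ```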

  2. qpure: A tool to estimate tumor cellularity from genome-wide single-nucleotide polymorphism profiles.

    Directory of Open Access Journals (Sweden)

    Sarah Song

    Full Text Available Tumour cellularity, the relative proportion of tumour and normal cells in a sample, affects the sensitivity of mutation detection, copy number analysis, cancer gene expression and methylation profiling. Tumour cellularity is traditionally estimated by pathological review of sectioned specimens; however this method is both subjective and prone to error due to heterogeneity within lesions and cellularity differences between the sample viewed during pathological review and tissue used for research purposes. In this paper we describe a statistical model to estimate tumour cellularity from SNP array profiles of paired tumour and normal samples using shifts in SNP allele frequency at regions of loss of heterozygosity (LOH) in the tumour. We also provide qpure, a software implementation of the method. Our experiments showed a moderate correlation of 0.42 (P-value = 0.0001) between tumour cellularity estimated by qpure and pathology review. Interestingly, there is a high correlation of 0.87 (P-value < 2.2e-16) between cellularity estimates by qpure and deep Ion Torrent sequencing of known somatic KRAS mutations, and a weaker correlation of 0.32 (P-value = 0.004) between Ion Torrent sequencing and pathology review. This suggests that qpure may be a more accurate predictor of tumour cellularity than pathology review. qpure can be downloaded from https://sourceforge.net/projects/qpure/.

  3. LastQuake app: a tool for risk reduction that focuses on earthquakes that really matter to the public!

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.; Frobert, L.

    2015-12-01

    Many seismic events are only picked up by seismometers, but the only earthquakes that really interest the public (and the authorities) are those which are felt by the population. It is not only a magnitude issue; even a small-magnitude earthquake, if widely felt, can create a public desire for information. In LastQuake, felt events are automatically discriminated through the reactions of the population on the Internet. It uses three different and complementary methods. Twitter Earthquake detection, initially developed by the USGS, detects surges in the number of tweets containing the word "earthquake" in different languages. Flashsourcing, developed by EMSC, detects traffic surges caused by eyewitnesses on its website - one of the top global earthquake information websites. Both detections happen typically within 2 minutes of an event's occurrence. Finally, an earthquake is also confirmed as being felt when at least 3 independent felt reports (questionnaires) are collected. LastQuake automatically merges seismic data, direct (crowdsourced) and indirect eyewitnesses' contributions, damage scenarios and tsunami alerts to provide information on felt earthquakes and their effects in a time ranging from a few tens of seconds to 90 minutes. It is based on visual communication to overcome language hurdles; for instance, it crowdsources felt reports through simple cartoons as well as geo-located pictures. It was massively adopted in Nepal within hours of the Gorkha earthquake and collected thousands of felt reports and more than 100 informative pictures. LastQuake is also a seismic risk reduction tool thanks to the speed of its information. When such information does not exist, people tend to call emergency services, crowds emerge and rumors spread. In its next release, LastQuake will also have "do/don't do" cartoons popping up after an earthquake to encourage appropriate behavior.

  4. SEEKR: Simulation Enabled Estimation of Kinetic Rates, A Computational Tool to Estimate Molecular Kinetics and Its Application to Trypsin-Benzamidine Binding.

    Science.gov (United States)

    Votapka, Lane W; Jagger, Benjamin R; Heyneman, Alexandra L; Amaro, Rommie E

    2017-04-20

    We present the Simulation Enabled Estimation of Kinetic Rates (SEEKR) package, a suite of open-source scripts and tools designed to enable researchers to perform multiscale computation of the kinetics of molecular binding, unbinding, and transport using a combination of molecular dynamics, Brownian dynamics, and milestoning theory. To demonstrate its utility, we compute the k_on, k_off, and ΔG_bind for the protein trypsin with its noncovalent binder, benzamidine, and examine the kinetics and other results generated in the context of the new software, and compare our findings to previous studies performed on the same system. We compute a k_on estimate of (2.1 ± 0.3) × 10⁷ M⁻¹ s⁻¹, a k_off estimate of 83 ± 14 s⁻¹, and a ΔG_bind of -7.4 ± 0.1 kcal·mol⁻¹, all of which compare closely to the experimentally measured values of 2.9 × 10⁷ M⁻¹ s⁻¹, 600 ± 300 s⁻¹, and -6.71 ± 0.05 kcal·mol⁻¹, respectively.

  5. A software tool to estimate the dynamic behaviour of the IP2C samples as sensors for didactic purposes

    Science.gov (United States)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E.

    2010-07-01

    Ionic Polymer Polymer Composites (IP2Cs) are emerging materials used to realize motion actuators and sensors. In the former case, a voltage input causes the membrane to bend, while in the latter case, bending an IP2C membrane produces a voltage output. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP2Cs working in air. In the proposed tool, geometrical quantities that rule the sensing properties of IP2C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to give a useful tool that allows the user to understand the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP2C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers towards this new class of transducers; moreover, it can support the educational activity of students involved in advanced academic courses.

  6. A spatial decision support tool for estimating population catchments to aid rural and remote health service allocation planning.

    Science.gov (United States)

    Schuurman, Nadine; Randall, Ellen; Berube, Myriam

    2011-12-01

    There is mounting pressure on healthcare planners to manage and contain costs. In rural regions, there is a particular need to rationalize health service allocation to ensure the best possible coverage for a dispersed population. Rural health administrators need to be able to quantify the population affected by their allocation decisions and, therefore, need the capacity to incorporate spatial analyses into their decision-making process. Spatial decision support systems (SDSS) can provide this capability. In this article, we combine geographical information systems (GIS) with a web-based graphical user interface (webGUI) in an SDSS tool that enables rural decision-makers charged with service allocation to estimate population catchments around specific health services in rural and remote areas. Using this tool, health-care planners can model multiple scenarios to determine the optimal location for health services, as well as the number of people served in each instance.

  7. A simple tool for estimating city-wide annual electrical energy savings from cooler surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Pomerantz, Melvin; Rosado, Pablo J.; Levinson, Ronnen

    2015-12-01

    We present a simple method to estimate the maximum possible electrical energy saving that might be achieved by increasing the albedo of surfaces in a large city. We restrict this to the "indirect effect", the cooling of outside air that lessens the demand for air conditioning (AC). Given the power demand of the electric utilities and data about the city, we can use a single linear equation to estimate the maximum savings. For example, the result for an albedo change of 0.2 of pavements in a typical warm city in California, such as Sacramento, is that the saving is less than about 2 kWh per m² per year. This may help decision makers choose which heat island mitigation techniques are economical from an energy-saving perspective.
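
    The paper's actual equation is not reproduced in this abstract; as a back-of-envelope illustration only, the quoted figure (at most about 2 kWh per m² per year for an albedo increase of 0.2) implies an upper-bound coefficient of roughly 10 kWh per m² per year per unit albedo, which can be scaled to a city-wide pavement area as follows.

    ```python
    # Assumed upper-bound coefficient derived from the figure quoted above:
    # <= ~2 kWh/m2/yr for an albedo change of 0.2  ->  ~10 kWh/m2/yr per unit albedo.
    SAVINGS_PER_UNIT_ALBEDO = 2.0 / 0.2   # kWh per m^2 per year (illustrative)

    def max_city_savings_gwh(paved_area_m2, albedo_increase):
        """Upper-bound annual electricity saving, in GWh, from cooler pavements."""
        return SAVINGS_PER_UNIT_ALBEDO * albedo_increase * paved_area_m2 / 1e6

    # Example: 50 km^2 of pavement lightened by 0.2 -> at most ~100 GWh per year.
    print(max_city_savings_gwh(50e6, 0.2))
    ```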

  8. ESTIMATED DURATION OF THE SUBSURFACE REDUCTION ENVIRONMENT PRODUCED BY THE SALTSTONE DISPOSAL FACILITY ON THE SAVANNAH RIVER SITE.

    Energy Technology Data Exchange (ETDEWEB)

    Kaplan, D; Thong Hang, T

    2007-01-22

    The formula for Savannah River Site (SRS) saltstone includes ≈25 wt% slag to create a reducing environment for mitigating the subsurface transport of several radionuclides, including Tc-99. Based on laboratory measurements and two-dimensional reactive transport calculations, it was estimated that the SRS saltstone waste form will maintain a reducing environment, and therefore its ability to sequester Tc-99, for well over 10,000 years. For example, it was calculated that ≈16% of the saltstone reduction capacity would be consumed after 213,000 years. For purposes of comparison, a second calculation was presented that was based on entirely different assumptions (direct spectroscopic measurements and diffusion calculations). The results from this latter calculation were nearly identical to those from this study. Obtaining similar conclusions by two extremely different calculations and sets of assumptions provides additional credence to the conclusion that the saltstone will likely maintain a reducing environment in excess of 10,000 years.

  9. Wastewater-based epidemiology as a new tool for estimating population exposure to phthalate plasticizers.

    Science.gov (United States)

    Gonzalez-Marino, Iria; Rodil, Rosario; Barrio, Ivan; Cela, Rafael; Quintana, Jose Benito

    2017-02-27

    This study proposes the monitoring of phthalate metabolites in wastewater as a non-intrusive and economic alternative to urine analysis for estimating human exposure to phthalates. To this end, a solid-phase extraction-liquid chromatography-tandem mass spectrometry method was developed, allowing for the determination of eight phthalate metabolites in wastewater (limits of quantification between 0.5 and 32 ng L⁻¹). The analysis of samples from the NW region of Spain showed that these substances occur in raw wastewater up to ca. 1.6 µg L⁻¹ and in treated wastewater up to ca. 1 µg L⁻¹. Concentrations in raw wastewater were converted into levels of exposure to six phthalate diesters. For three of them, these levels were always below the daily exposure thresholds recommended by the US Environmental Protection Agency and the European Food Safety Authority. For the other three, however, estimates of exposure surpassed those thresholds (especially the toddler threshold) in some cases, highlighting the significance of the exposure to phthalates in children. Finally, concentrations in wastewater were also used to estimate metabolite concentrations in urine, providing a reasonable concordance between our results and the data obtained in two previous biomonitoring studies.

  10. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    Science.gov (United States)

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
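
    REPTool's own geoprocessing code is not reproduced here; the following is a minimal, library-agnostic sketch of the underlying idea, assuming a toy two-raster model and illustrative error magnitudes: Latin Hypercube samples of the input-raster errors and model coefficients are propagated through the model, and the per-cell spread of the outputs is reported as prediction uncertainty.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)

    def lhs_normal(n_samples, mean, sd):
        """Latin Hypercube sample of a normal error term: one draw per stratum."""
        u = (rng.permutation(n_samples) + rng.uniform(size=n_samples)) / n_samples
        return norm.ppf(u, loc=mean, scale=sd)

    # Toy "geospatial model": output = a * raster1 + b * raster2, cell by cell.
    raster1 = rng.uniform(0.0, 10.0, size=(50, 50))
    raster2 = rng.uniform(0.0, 5.0, size=(50, 50))
    a_mean, a_sd = 2.0, 0.1      # model coefficients with assumed uncertainty
    b_mean, b_sd = 1.5, 0.2
    raster_sd = 0.5              # spatially invariant input-raster error (assumed)

    n = 200
    a_draws = lhs_normal(n, a_mean, a_sd)
    b_draws = lhs_normal(n, b_mean, b_sd)
    shift1 = lhs_normal(n, 0.0, raster_sd)   # per-realization raster perturbations
    shift2 = lhs_normal(n, 0.0, raster_sd)

    outputs = np.empty((n,) + raster1.shape)
    for i in range(n):
        outputs[i] = a_draws[i] * (raster1 + shift1[i]) + b_draws[i] * (raster2 + shift2[i])

    cell_mean = outputs.mean(axis=0)   # per-cell expected model output
    cell_sd = outputs.std(axis=0)      # per-cell prediction uncertainty
    print(float(cell_sd.mean()))
    ```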

  11. An Accurate Computational Tool for Performance Estimation of FSO Communication Links over Weak to Strong Atmospheric Turbulent Channels

    Directory of Open Access Journals (Sweden)

    Theodore D. Katsilieris

    2017-03-01

    Full Text Available Terrestrial optical wireless communication links have attracted significant research and commercial interest worldwide over the last few years because they offer very high and secure data-rate transmission with relatively low installation and operational costs, and without the need for licensing. However, since the propagation path of the information signal, i.e., the laser beam, is the atmosphere, their effectiveness is strongly affected by the atmospheric conditions in the specific area. Thus, system performance depends significantly on rain, fog, hail, atmospheric turbulence, etc. Due to the influence of these effects, such a communication system must be studied carefully, both theoretically and numerically, before installation. In this work, we present exact and accurate approximate mathematical expressions for the estimation of the average capacity and the outage probability performance metrics, as functions of the link parameters, the transmitted power, the attenuation due to fog, the ambient noise and the atmospheric turbulence phenomenon. The latter causes the scintillation effect, which results in random and fast fluctuations of the irradiance at the receiver's end. These fluctuations can be studied accurately with statistical methods. Thus, in this work, we use either the lognormal or the gamma–gamma distribution for weak or moderate-to-strong turbulence conditions, respectively. Moreover, using the derived mathematical expressions, we design, implement and present a computational tool for the estimation of these systems' performance, while also taking into account the link parameters and the atmospheric conditions. Furthermore, in order to increase the accuracy of the presented tool, for the cases where the obtained analytical mathematical expressions are complex, the performance results are verified with the numerical estimation of the appropriate integrals. Finally, using
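
    The exact closed-form expressions are not reproduced in this abstract; as a small illustration of the weak-turbulence case only, the outage probability under lognormal irradiance fading can be evaluated as below (the Rytov variance and fade threshold are assumed inputs, and the gamma–gamma case is omitted).

    ```python
    import numpy as np
    from scipy.stats import norm

    def outage_probability_lognormal(threshold_db, rytov_var):
        """Probability that the normalized received irradiance drops below a
        threshold, for weak turbulence modelled as lognormal fading.

        threshold_db : irradiance threshold relative to the mean, in dB (<= 0)
        rytov_var    : log-irradiance (Rytov) variance, assumed known
        The normalized irradiance I has E[I] = 1 and
        ln I ~ Normal(-rytov_var / 2, rytov_var).
        """
        i_th = 10.0 ** (threshold_db / 10.0)
        sigma = np.sqrt(rytov_var)
        return norm.cdf((np.log(i_th) + rytov_var / 2.0) / sigma)

    # Example: a 10 dB fade margin under mild turbulence (Rytov variance 0.3).
    print(outage_probability_lognormal(-10.0, 0.3))
    ```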

  12. Statistical tools for transgene copy number estimation based on real-time PCR.

    Science.gov (United States)

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    Compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, owing to the lack of proper statistical analysis and data quality control needed to render a reliable copy number estimate with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advances into real-time PCR based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. For the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimate. Simple linear regression and two-group t-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data, based on two different approaches to integrating amplification efficiency. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods allow the real-time PCR-based transgene copy number estimation
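
    The four statistical models themselves are not reproduced here; the following sketch illustrates only the core of the second design, with hypothetical dilution data: standard curves for the reference gene and the transgene are fitted by simple linear regression of Ct on log10 template, and the two inverted quantities are compared for an unknown sample.

    ```python
    import numpy as np

    def fit_standard_curve(log10_template, ct):
        """Least-squares fit Ct = slope * log10(template) + intercept."""
        slope, intercept = np.polyfit(log10_template, ct, 1)
        efficiency = 10.0 ** (-1.0 / slope) - 1.0   # amplification efficiency
        return slope, intercept, efficiency

    def quantity_from_ct(ct, slope, intercept):
        """Invert the standard curve to get a relative template quantity."""
        return 10.0 ** ((ct - intercept) / slope)

    # Hypothetical serial-dilution data (log10 template vs. observed Ct).
    dilutions = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    ct_ref = np.array([30.1, 26.8, 23.4, 20.1, 16.8])   # internal reference gene
    ct_tg  = np.array([29.0, 25.7, 22.3, 19.0, 15.7])   # transgene

    ref_fit = fit_standard_curve(dilutions, ct_ref)
    tg_fit  = fit_standard_curve(dilutions, ct_tg)

    # Unknown sample: compare transgene quantity with reference-gene quantity.
    ct_sample_ref, ct_sample_tg = 24.0, 22.6
    ratio = (quantity_from_ct(ct_sample_tg, *tg_fit[:2]) /
             quantity_from_ct(ct_sample_ref, *ref_fit[:2]))
    print(f"estimated transgene copies per reference-gene copy: {ratio:.2f}")
    ```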

  13. ERROR ESTIMATE FOR INFLUENCE OF MODEL REDUCTION OF NONLINEAR DISSIPATIVE AUTONOMOUS DYNAMICAL SYSTEM ON LONG-TERM BEHAVIOURS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jia-zhong; LIU Yan; CHEN Dang-min

    2005-01-01

    From the viewpoint of nonlinear dynamics, the model reduction and its influence on the long-term behaviours of a class of higher-dimensional nonlinear dissipative autonomous dynamical systems are investigated theoretically under some assumptions. The system is analyzed in the state space with the introduction of a distance definition which can be used to describe the distance between the full system and the reduced system, and the solution of the full system is then projected onto the complete space spanned by the eigenvectors of the linear operator of the governing equations. As a result, the influence of mode series truncation on the long-term behaviours and the error estimate are derived, showing that the error is dependent on the first products of frequencies and damping ratios in the subspace spanned by the eigenvectors with higher modal damping. Furthermore, the fundamental understanding of the topological change of the solution due to the application of different model reductions is interpreted in a mathematically precise way, using the qualitative theory of nonlinear dynamics.

  14. Structural correlation method for model reduction and practical estimation of patient specific parameters illustrated on heart rate regulation.

    Science.gov (United States)

    Ottesen, Johnny T; Mehlsen, Jesper; Olufsen, Mette S

    2014-11-01

    We consider the inverse and patient specific problem of short term (seconds to minutes) heart rate regulation specified by a system of nonlinear ODEs and corresponding data. We show how a recent method termed the structural correlation method (SCM) can be used for model reduction and for obtaining a set of practically identifiable parameters. The structural correlation method includes two steps: sensitivity and correlation analysis. When combined with an optimization step, it is possible to estimate model parameters, enabling the model to fit dynamics observed in data. This method is illustrated in detail on a model predicting baroreflex regulation of heart rate and applied to the analysis of data from a rat and healthy humans. Numerous mathematical models have been proposed for prediction of baroreflex regulation of heart rate, yet most of these have been designed to provide qualitative predictions of the phenomena, though some recent models have been developed to fit observed data. In this study we show that the model put forward by Bugenhagen et al. can be simplified without loss of its ability to predict measured data and to be interpreted physiologically. Moreover, we show that with minimal changes in nominal parameter values the simplified model can be adapted to predict observations from both rats and humans. The use of these methods makes the model suitable for estimation of parameters from individuals, allowing it to be adopted for diagnostic procedures.
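
    As a minimal illustration of the two SCM steps named above (sensitivity followed by correlation analysis), and not the authors' implementation, the sketch below computes finite-difference sensitivities of a toy model's output with respect to its parameters and flags parameter pairs whose sensitivity columns are nearly collinear, i.e. not practically identifiable together.

    ```python
    import numpy as np

    def sensitivity_matrix(model, params, t, rel_step=1e-3):
        """Finite-difference sensitivities d(output)/d(param), one column per parameter."""
        base = model(params, t)
        S = np.zeros((base.size, len(params)))
        for j, p in enumerate(params):
            perturbed = params.copy()
            perturbed[j] = p * (1.0 + rel_step)
            S[:, j] = (model(perturbed, t) - base) / (p * rel_step)
        return S

    def correlated_pairs(S, threshold=0.95):
        """Parameter pairs whose sensitivity columns are nearly collinear."""
        R = np.corrcoef(S, rowvar=False)
        n = R.shape[0]
        return [(i, j, R[i, j]) for i in range(n) for j in range(i + 1, n)
                if abs(R[i, j]) > threshold]

    # Toy model with deliberately redundant parameters a and b.
    def toy_model(p, t):
        a, b, tau = p
        return (a + b) * np.exp(-t / tau)

    t = np.linspace(0.0, 10.0, 200)
    S = sensitivity_matrix(toy_model, np.array([1.0, 0.5, 2.0]), t)
    print(correlated_pairs(S))   # a and b appear perfectly correlated -> fix one of them
    ```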

  15. The use of activity-based cost estimation as a management tool for cultural change

    Science.gov (United States)

    Mandell, Humboldt; Bilby, Curt

    1991-01-01

    It will be shown that the greatest barrier to American exploration of the planet Mars is not the development of the technology needed to deliver humans and return them safely to earth. Neither is it the cost of such an undertaking, as has been previously suggested, although certainly, such a venture may not be inexpensive by some measures. The predicted costs of exploration have discouraged serious political dialog on the subject. And, in fact, even optimistic projections of the NASA budget do not contain the resources required, under the existing development and management paradigm, for human space exploration programs. It will be demonstrated that the perception of the costs of such a venture, and the cultural responses to the perceptions are factors inhibiting American exploration of the moon and the planet Mars. Cost models employed in the aerospace industry today correctly mirror the history of past space programs, and as such, are representative of the existing management and development paradigms. However, if, under this current paradigm no major exploration programs are feasible, then cost analysis methods based in the past may not have great utility in exploring the needed cultural changes. This paper explores the use of a new type of model, the activity based cost model, which will treat management style as an input variable, in a sense providing a tool whereby a complete, affordable program might be designed, including both the technological and management aspects.

  16. Community duplicate diet methodology: a new tool for estimating dietary exposures to pesticides.

    Science.gov (United States)

    Melnyk, Lisa Jo; McCombs, Michelle; Brown, G Gordon; Raymer, James; Nishioka, Marcia; Buehler, Stephanie; Freeman, Natalie; Michael, Larry C

    2012-01-01

    An observational field study was conducted to assess the feasibility of a community duplicate diet collection method; a dietary monitoring tool that is population-based. The purpose was to establish an alternative procedure to duplicate diet sampling that would be more efficient for a large, defined population, e.g., in the National Children's Study (NCS). Questionnaire data and food samples were collected in a residence so as not to lose the important component of storage, preparation, and handling in a contaminated microenvironment. The participants included nine Hispanic women of child bearing age living in Apopka, FL, USA. Foods highly consumed by Hispanic women were identified based on national food frequency questionnaires and prioritized by permethrin residue concentrations as measured for the Pesticide Data Program. Participants filled out questionnaires to determine if highly consumed foods were commonly eaten by them and to assess the collection protocol for the food samples. Measurable levels of permethrin were found in 54% of the samples. Questionnaire responses indicated that the collection of the community duplicate diet was feasible for a defined population.

  17. RNAseqPS: A Web Tool for Estimating Sample Size and Power for RNAseq Experiment.

    Science.gov (United States)

    Guo, Yan; Zhao, Shilin; Li, Chung-I; Sheng, Quanhu; Shyr, Yu

    2014-01-01

    Sample size and power determination is the first step in the experimental design of a successful study, and such calculations are required for National Institutes of Health (NIH) funding applications. Sample size and power calculation is well established for traditional biological studies such as mouse models, genome-wide association studies (GWAS), and microarray studies. Recent developments in high-throughput sequencing technology have allowed RNAseq to replace microarray as the technology of choice for high-throughput gene expression profiling. However, sample size and power analysis for RNAseq technology is an underdeveloped area. Here, we present RNAseqPS, an advanced online RNAseq power and sample size calculation tool based on the Poisson and negative binomial distributions. RNAseqPS was built using the Shiny package in R. It provides an interactive graphical user interface that allows users to easily conduct sample size and power analysis for RNAseq experimental design. RNAseqPS can be accessed directly at http://cqs.mc.vanderbilt.edu/shiny/RNAseqPS/.
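
    RNAseqPS itself is a Shiny application and its code is not reproduced here; the sketch below shows the general style of negative-binomial power approximation such tools rely on, using a Wald-type normal approximation on the log fold change with assumed mean count and dispersion values.

    ```python
    import numpy as np
    from scipy.stats import norm

    def rnaseq_power(n_per_group, mean_count, fold_change, dispersion, alpha=0.05):
        """Approximate per-gene power to detect a fold change between two groups,
        using a Wald-type normal approximation on the log fold change for
        negative-binomial counts (variance = mu + dispersion * mu^2).
        A rough sketch of this style of calculation, not RNAseqPS's own code.
        """
        mu1, mu2 = mean_count, mean_count * fold_change
        var_log_fc = ((1.0 / mu1 + dispersion) + (1.0 / mu2 + dispersion)) / n_per_group
        z_alpha = norm.ppf(1.0 - alpha / 2.0)
        effect = abs(np.log(fold_change)) / np.sqrt(var_log_fc)
        return norm.cdf(effect - z_alpha)

    # Example: 6 samples per group, mean count 20, 2-fold change, dispersion 0.1.
    print(rnaseq_power(6, 20.0, 2.0, 0.1))
    ```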

  18. The use of activity-based cost estimation as a management tool for cultural change

    Science.gov (United States)

    Mandell, Humboldt; Bilby, Curt

    1991-01-01

    It will be shown that the greatest barrier to American exploration of the planet Mars is not the development of the technology needed to deliver humans and return them safely to earth. Neither is it the cost of such an undertaking, as has been previously suggested, although certainly, such a venture may not be inexpensive by some measures. The predicted costs of exploration have discouraged serious political dialog on the subject. And, in fact, even optimistic projections of the NASA budget do not contain the resources required, under the existing development and management paradigm, for human space exploration programs. It will be demonstrated that the perception of the costs of such a venture, and the cultural responses to the perceptions are factors inhibiting American exploration of the moon and the planet Mars. Cost models employed in the aerospace industry today correctly mirror the history of past space programs, and as such, are representative of the existing management and development paradigms. However, if, under this current paradigm no major exploration programs are feasible, then cost analysis methods based in the past may not have great utility in exploring the needed cultural changes. This paper explores the use of a new type of model, the activity based cost model, which will treat management style as an input variable, in a sense providing a tool whereby a complete, affordable program might be designed, including both the technological and management aspects.

  19. A framework and a set of tools called Nutting models to estimate retention capacities and loads of nitrogen and phosphorus in rivers at catchment and national level (France)

    Science.gov (United States)

    Legeay, Pierre-Louis; Moatar, Florentina; Dupas, Rémi; Gascuel-Odoux, Chantal

    2016-04-01

    The Nutting-N and Nutting-P models (Dupas et al., 2013, 2015) have been developed to estimate nitrogen and phosphorus nonpoint-source emissions to surface water using readily available data. These models were inspired by the US model SPARROW (Smith et al., 1997) and the European model GREEN (Grizzetti et al., 2008), i.e. statistical approaches that link nitrogen and phosphorus surpluses to catchment land and river characteristics in order to estimate the relative retention capacities of catchments. The nutrient load (L) at the outlet of each catchment is expressed as: L = R*(B*DS + PS) [1], where DS is diffuse sources (i.e. surplus in kg ha⁻¹ yr⁻¹ for N, P storage in soil for P), PS is point sources of domestic and industrial origin (kg ha⁻¹ yr⁻¹), and R and B are the river-system and basin reduction factors, respectively; both combine observed variables and calibrated parameters. The model was calibrated on independent catchments for the 2005-2009 and 2008-2012 periods. Variables were selected according to the Bayesian Information Criterion (BIC) in order to optimize the predictive performance of the models. From these basic models, several improvements have been made to build a framework and a set of tools: 1) a routing module has been added in order to improve estimates for fourth- and fifth-order streams, i.e. upscaling the basic Nutting approach; 2) a territorial module, in order to test the models at local scale (from 500 to 5000 km²); 3) a seasonal estimation has been investigated. The basic approach as well as the territorial application will be illustrated. These tools allow water managers to identify areas at risk where high nutrient loads are estimated, as well as areas where retention is potentially high and can buffer high nutrient sources. References Dupas R., Curie F., Gascuel-Odoux C., Moatar F., Delmas M., Parnaudeau V., Durand P., 2013. Assessing N emissions in surface water at the national level: Comparison of country-wide vs. regionalized models. Science of the Total Environment
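
    Equation [1] above is simple enough to state directly in code; the reduction factors used below are purely illustrative values, not calibrated Nutting parameters.

    ```python
    def nutting_load(diffuse_kg_ha_yr, point_kg_ha_yr, basin_factor, river_factor):
        """Equation [1]: L = R * (B * DS + PS).
        B and R (basin and river-system reduction factors) lie between 0 and 1;
        the values used in the example are illustrative, not calibrated ones."""
        return river_factor * (basin_factor * diffuse_kg_ha_yr + point_kg_ha_yr)

    # Example: N surplus of 40 kg/ha/yr, point sources of 2 kg/ha/yr, B = 0.3, R = 0.8.
    print(nutting_load(40.0, 2.0, 0.3, 0.8), "kg N per ha per year at the outlet")
    ```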

  20. Exfoliative cytology: A possible tool in age estimation in forensic odontology

    Directory of Open Access Journals (Sweden)

    Devi Charan Shetty

    2015-01-01

    Full Text Available Introduction: Age determination of unknown human bodies is important in the setting of a crime investigation or a mass disaster because the age at death, birth date, and year of death as well as gender can guide investigators to the correct identity among a large number of possible matches. Objective: The study was undertaken with an aim to estimate the age of an individual from their buccal smears by comparing the average cell size using image analysis morphometric software. Materials and Methods: Buccal smears were collected from 100 apparently healthy individuals. After fixation in 95% alcohol, the smears were stained using the standard Papanicolaou laboratory procedure. The average cell size was measured using Dewinter's image analysis software version 4.3. Statistical analysis of the data was done using one-way ANOVA and Bonferroni procedures. Results: The results showed a significant decrease in the average cell size of individuals with increasing age. The difference was highly significant in the age group above 60 years. Conclusion: Age-related alterations are observed in buccal smears.

  1. Exfoliative cytology: A possible tool in age estimation in forensic odontology.

    Science.gov (United States)

    Shetty, Devi Charan; Wadhwan, Vijay; Khanna, Kaveri Surya; Jain, Anshi; Gupta, Amit

    2015-01-01

    Age determination of unknown human bodies is important in the setting of a crime investigation or a mass disaster because the age at death, birth date, and year of death as well as gender can guide investigators to the correct identity among a large number of possible matches. The study was undertaken with an aim to estimate the age of an individual from their buccal smears by comparing the average cell size using image analysis morphometric software. Buccal smears were collected from 100 apparently healthy individuals. After fixation in 95% alcohol, the smears were stained using the standard Papanicolaou laboratory procedure. The average cell size was measured using Dewinter's image analysis software version 4.3. Statistical analysis of the data was done using one-way ANOVA and Bonferroni procedures. The results showed a significant decrease in the average cell size of individuals with increasing age. The difference was highly significant in the age group above 60 years. Age-related alterations are observed in buccal smears.

  2. Anisotropy of Anhysteretic Remanent Magnetization: A Tool To Estimate Trm Deviations In Volcanic Rocks

    Science.gov (United States)

    Gattacceca, J.; Rochette, P.

    In order to assess the paleomagnetic direction deviations due to anisotropy in volcanic rocks, we studied the anisotropies of magnetic susceptibility (AMS), of anhysteretic remanent magnetization (AARM) and of thermoremanent magnetization (ATRM) of a set of Miocene pyroclastic rocks from Sardinia (Italy). The main magnetic carrier is pseudo-single domain titanomagnetite. AARM and ATRM were determined with a 3-position measurement scheme. The measurements show that there is no general relation between the degrees of AMS and ATRM (as this relation depends on the titanomagnetite grain size spectrum), while the degrees of AARM and ATRM are almost identical. Measuring the AMS is thus nearly irrelevant to quantitatively estimate TRM deviations due to anisotropy in volcanic rocks. Instead, measuring the AARM provides a reliable and relatively fast method to correct paleomagnetic direction deviations in volcanic rocks (inclination shallowing due to horizontal planar fabric in most cases). This is confirmed by a case study on a succession of four welded pyroclastic flows: an apparent paleosecular variation pattern is almost entirely explained by the effect of ATRM.

  3. Geospatial tools effectively estimate nonexceedance probabilities of daily streamflow at ungauged and intermittently gauged locations in Ohio

    Science.gov (United States)

    Farmer, William H.; Koltun, Greg

    2017-01-01

    Study region: The state of Ohio in the United States, a humid, continental climate. Study focus: The estimation of nonexceedance probabilities of daily streamflows as an alternative means of establishing the relative magnitudes of streamflows associated with hydrologic and water-quality observations. New hydrological insights for the region: Several methods for estimating nonexceedance probabilities of daily mean streamflows are explored, including single-index methodologies (nearest-neighboring index) and geospatial tools (kriging and topological kriging). These methods were evaluated by conducting leave-one-out cross-validations based on analyses of nearly 7 years of daily streamflow data from 79 unregulated streamgages in Ohio and neighboring states. The pooled, ordinary kriging model, with a median Nash–Sutcliffe performance of 0.87, was superior to the single-site index methods, though there was some bias in the tails of the probability distribution. Incorporating network structure through topological kriging did not improve performance. The pooled, ordinary kriging model was applied to 118 locations without systematic streamgaging across Ohio where instantaneous streamflow measurements had been made concurrent with water-quality sampling on at least 3 separate days. Spearman rank correlations between estimated nonexceedance probabilities and measured streamflows were high, with a median value of 0.76. In consideration of application, the degree of regulation in a set of sample sites helped to specify the streamgages required to implement kriging approaches successfully.
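
    Neither the kriging models nor the underlying USGS code are reproduced here; the sketch below illustrates only the simpler single-index idea mentioned above: nonexceedance probabilities are computed from Weibull plotting positions at a gauged site, and the probability observed on a given day at the nearest index streamgage is assigned to the ungauged location. All names and values are illustrative.

    ```python
    import numpy as np
    from scipy.stats import rankdata

    def nonexceedance_probabilities(daily_flows):
        """Empirical nonexceedance probabilities from Weibull plotting positions:
        P(Q <= q_i) is estimated as rank(q_i) / (n + 1)."""
        flows = np.asarray(daily_flows, dtype=float)
        return rankdata(flows) / (len(flows) + 1.0)

    def transfer_probability(index_gauge_record, index_flow_today):
        """Single-index transfer: the nonexceedance probability of today's flow at
        the nearest (index) streamgage is assigned to the ungauged location."""
        record = np.asarray(index_gauge_record, dtype=float)
        return np.sum(record <= index_flow_today) / (len(record) + 1.0)

    # Example with a synthetic 2-year daily record at the index gauge.
    rng = np.random.default_rng(1)
    record = rng.lognormal(mean=2.0, sigma=0.8, size=730)   # daily mean flows
    print(transfer_probability(record, index_flow_today=15.0))
    ```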

  4. Aerial Survey as a Tool to Estimate Abundance and Describe Distribution of a Carcharhinid Species, the Lemon Shark, Negaprion brevirostris

    Directory of Open Access Journals (Sweden)

    S. T. Kessel

    2013-01-01

    Full Text Available Aerial survey provides an important tool to assess the abundance of both terrestrial and marine vertebrates. To date, limited work has tested the effectiveness of this technique to estimate the abundance of smaller shark species. In Bimini, Bahamas, the lemon shark (Negaprion brevirostris) shows high site fidelity to a shallow sandy lagoon, providing an ideal test species to determine the effectiveness of localised aerial survey techniques for a Carcharhinid species in shallow subtropical waters. Between September 2007 and September 2008, visual surveys were conducted from light aircraft following defined transects ranging in length between 8.8 and 4.4 km. Count results were corrected for "availability", "perception", and "survey intensity" to provide unbiased abundance estimates. The abundance of lemon sharks was greatest in the central area of the lagoon during high tide, with a change in abundance distribution to the east and western regions of the lagoon with low tide. Mean abundance of sharks was estimated at 49 (±8.6) individuals, and monthly abundance was significantly positively correlated with mean water temperature. The successful implementation of the aerial survey technique highlighted the potential of further employment for shark abundance assessments in shallow coastal marine environments.
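
    The study's actual correction factors are not given in this abstract; as a generic illustration of the kind of adjustment described (correcting raw counts for availability, perception and survey intensity), a Horvitz-Thompson style calculation might look like the following, with all values hypothetical.

    ```python
    def corrected_abundance(raw_count, availability, perception, survey_fraction):
        """Adjust a raw aerial count for sharks that were submerged or missed and
        for incomplete coverage of the study area.

        availability    : probability a shark is near enough to the surface to be seen
        perception      : probability an available shark is actually detected
        survey_fraction : fraction of the study area covered by the transects
        All three factors in the example are purely illustrative values."""
        return raw_count / (availability * perception * survey_fraction)

    # Example: 12 sharks counted, 60% available, 80% detected, 40% of area surveyed.
    print(corrected_abundance(12, 0.6, 0.8, 0.4))   # 62.5 individuals
    ```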

  5. Volcano-tectonic earthquakes: A new tool for estimating intrusive volumes and forecasting eruptions

    Science.gov (United States)

    White, Randall A.; McCausland, Wendy

    2016-01-01

    We present data on 136 high-frequency earthquakes and swarms, termed volcano-tectonic (VT) seismicity, which preceded 111 eruptions at 83 volcanoes, plus data on VT swarms that preceded intrusions at 21 other volcanoes. We find that VT seismicity is usually the earliest reported seismic precursor for eruptions at volcanoes that have been dormant for decades or more, and precedes eruptions of all magma types from basaltic to rhyolitic and all explosivities from VEI 0 to ultraplinian VEI 6 at such previously long-dormant volcanoes. Because large eruptions occur most commonly during resumption of activity at long-dormant volcanoes, VT seismicity is an important precursor for the Earth's most dangerous eruptions. VT seismicity precedes all explosive eruptions of VEI ≥ 5 and most if not all VEI 4 eruptions in our data set. Surprisingly we find that the VT seismicity originates at distal locations on tectonic fault structures at distances of one or two to tens of kilometers laterally from the site of the eventual eruption, and rarely if ever starts beneath the eruption site itself. The distal VT swarms generally occur at depths almost equal to the horizontal distance of the swarm from the summit out to about 15 km distance, beyond which hypocenter depths level out. We summarize several important characteristics of this distal VT seismicity including: swarm-like nature, onset days to years prior to the beginning of magmatic eruptions, peaking of activity at the time of the initial eruption whether phreatic or magmatic, and large non-double couple component to focal mechanisms. Most importantly we show that the intruded magma volume can be simply estimated from the cumulative seismic moment of the VT seismicity from:

  6. Quantifying spatial correlations of fluorescent markers using enhanced background reduction with protein proximity index and correlation coefficient estimations.

    Science.gov (United States)

    Zinchuk, Vadim; Wu, Yong; Grossenbacher-Zinchuk, Olga; Stefani, Enrico

    2011-09-15

    Interactions of proteins are examined by detecting their overlap using fluorescent markers. The observed overlap is then quantified to serve as a measure of spatial correlation. A major drawback of this approach is that it can produce false values because of the properties of the image background. To remedy this, we provide a protocol to reduce the contribution of image background and then apply a protein proximity index (PPI) and correlation coefficient to estimate colocalization. Background heterogeneity is reduced by the median filtering procedure, comprising two steps, to reduce random noise and background, respectively. Alternatively, background can be reduced by advanced thresholding. PPI provides separate values for each channel to characterize the contribution of each protein, whereas correlation coefficient determines the overall colocalization. The protocol is demonstrated using computer-simulated and real biological images. It minimizes human bias and can be universally applied to various cell types in which there is a need to understand protein-protein interactions. Background reductions require 3-5 min per image. Quantifications take <1 min. The entire procedure takes approximately 15-30 min.
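
    The protein proximity index itself is not reproduced here; the sketch below illustrates only the background-reduction and overall-correlation steps, using a two-step median filter (kernel sizes are illustrative, not the protocol's prescribed values) followed by a Pearson correlation of the two channels.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter

    def reduce_background(channel, noise_kernel=3, background_kernel=31):
        """Two-step median filtering: a small kernel suppresses random noise and a
        large-kernel median estimate of the background is then subtracted."""
        denoised = median_filter(channel, size=noise_kernel)
        background = median_filter(denoised, size=background_kernel)
        return np.clip(denoised - background, 0, None)

    def pearson_colocalization(red, green):
        """Overall colocalization as the Pearson correlation of the two channels."""
        return np.corrcoef(red.ravel(), green.ravel())[0, 1]

    # Synthetic example: two channels sharing one structure plus independent noise.
    rng = np.random.default_rng(0)
    shared = np.zeros((128, 128))
    shared[40:60, 40:60] = 100.0
    red = shared + rng.normal(10.0, 2.0, shared.shape)
    green = 0.8 * shared + rng.normal(12.0, 2.0, shared.shape)
    print(pearson_colocalization(reduce_background(red), reduce_background(green)))
    ```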

  7. The variogram and the simple kriging estimator: Useful tools to complement lithologic correlation in a complex fluvial depositional environment

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, J. [IT Corp., Martinez, CA (United States)

    1995-12-31

    Three dimensional grid estimation has been combined with an interpretive model of fluvial deposition for correlating low permeability zones in the shallow subsurface. Improvement in correlation reliability was realized by combining hand drawn interpretive cross-sections (spotting local trends in grain size, CPT log signature, etc.) with cross-section maps of the geostatistical grid model. The site is a military installation where soil contamination is being mapped and quantified using three dimensional modeling techniques. The subsurface is a complex fluvial depositional environment with intermittent bedrock highs and more frequent calcite- and calcium/iron-related cementation. Hence, the problem of lithologic correlation occurred where the drillhole spacing became wider than the channel belt width or cemented materials prevented detailed sampling. The goals of the sampling and analysis plan called for sampling within the first continuous silt or clay unit in order to quantify the zone of greatest contaminant retention on its downward migratory path. This paper will describe a three dimensional correlation technique which employs geostatistical analysis of CPT hole data specifically coded by permeability indicator thresholds. The process yielded variogram ranges applied to a simple kriging estimator on a 3-dimensional grid block. Estimates of clay probability are then provided as output and overlaid with the geologist's cross-section interpretation. The marriage of these two tools was invaluable in that geostatistical estimates sometimes behaved contrary to the channel depositional process, while on the other hand, the geologist's interpretation often failed to recognize data in the third dimension (i.e., off-section CPT data).

  8. Performance Analysis of a Fluidic Axial Oscillation Tool for Friction Reduction with the Absence of a Throttling Plate

    Directory of Open Access Journals (Sweden)

    Xinxin Zhang

    2017-04-01

    Full Text Available An axial oscillation tool has proved effective in solving problems associated with high friction and torque in the sliding drilling of a complex well. The fluidic axial oscillation tool, based on an output-fed bistable fluidic oscillator, is a type of axial oscillation tool which has become increasingly popular in recent years. The aim of this paper is to analyze the dynamic flow behavior of a fluidic axial oscillation tool with the absence of a throttling plate in order to evaluate its overall performance. In particular, the differences between the original design with a throttling plate and the current default design are analyzed in depth, and an improvement is expected for the latter. A commercial computational fluid dynamics code, Fluent, was used to predict the pressure drop and oscillation frequency of a fluidic axial oscillation tool. The results of the numerical simulations agree well with corresponding experimental results. A sufficient pressure pulse amplitude with a low pressure drop is desired in this study. Therefore, relative pulse amplitudes of pressure drop and displacement are introduced. A comparison analysis between the two designs with and without a throttling plate indicates that when the supply flow rate is relatively low or higher than a certain value, the fluidic axial oscillation tool with a throttling plate exhibits a better performance; otherwise, the fluidic axial oscillation tool without a throttling plate seems to be a preferred alternative. In most of the operating circumstances in terms of the supply flow rate and pressure drop, the fluidic axial oscillation tool performs better than the original design.

  9. A theoretical model to estimate the oil burial depth on sandy beaches: A new oil spill management tool.

    Science.gov (United States)

    Bernabeu, Ana M; Fernández-Fernández, Sandra; Rey, Daniel

    2016-08-15

    In oiled sandy beaches, unrecovered fuel can be buried up to several metres. This study proposes a theoretical approach to oil burial estimation along the intertidal area. First, our results revealed the existence of two main patterns in seasonal beach profile behaviour. Type A is characterized by intertidal slopes of time-constant steepness which advance/recede parallel to themselves in response to changing wave conditions. Type B is characterized by slopes of time-varying steepness which intersect at a given point in the intertidal area. This finding has a direct influence on the definition of oil depth. Type A pattern exhibits oil burial along the entire intertidal area following decreasing wave energy, while the type B pattern combines burial in high intertidal and exhumation in mid and/or low intertidal zones, depending on the position of the intersection point. These outcomes should be incorporated as key tools in future oil spill management programs.

  10. Effect of Using Different Vehicle Weight Groups on the Estimated Relationship Between Mass Reduction and U.S. Societal Fatality Risk per Vehicle Miles of Travel

    Energy Technology Data Exchange (ETDEWEB)

    Wenzel, Tom P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Technologies Area. Building Technology and Urban Systems Division

    2016-08-22

    This report recalculates the estimated relationship between vehicle mass and societal fatality risk, using alternative groupings by vehicle weight, to test whether the trend of decreasing fatality risk from mass reduction as case vehicle mass increases holds over smaller increments of the range in case vehicle masses. The NHTSA baseline regression model estimates the relationship using two weight groups for cars and light trucks; we re-estimated the mass reduction coefficients using four, six, and eight bins of vehicle mass. The estimated effect of mass reduction on societal fatality risk was not consistent over the range in vehicle masses in these weight bins. These results suggest that the relationship indicated by the NHTSA baseline model is a result of other, unmeasured attributes of the mix of vehicles in the lighter vs. heavier weight bins, and not necessarily the result of a correlation between mass reduction and societal fatality risk. An analysis of the average vehicle, driver, and crash characteristics across the various weight groupings did not reveal any strong trends that might explain the lack of a consistent trend of decreasing fatality risk from mass reduction in heavier vehicles.

  11. Estimates of associated outdoor particulate matter health risk and cost reductions from alternative building, ventilation and filtration scenarios.

    Science.gov (United States)

    Sultan, Zuraimi M

    2007-05-01

    Although many studies have reported calculations of outdoor particulate matter (PM) associated externalities using ambient data, there is little information on the role buildings, their ventilation and filtration play. This study provides a framework to evaluate the health risk and cost reductions of building, ventilation and filtration strategies against outdoor PM pollution on a nationwide level and applies it to a case study in Singapore. Combining Indoor Air Quality (IAQ) and time-weighted exposure models with established concentration-response functions and monetary valuation methods, the mortality and morbidity effects of outdoor PM on the population of Singapore under different building, ventilation and filtration strategies were estimated. Different interventions were made to compare the effects against the current building conditions. The findings demonstrate that the protection offered by buildings reduced the attributable health cases due to PM pollution by approximately half, amounting to US$17.7 billion, when compared to levels computed using outdoor data alone. For residential buildings, nationwide adoption of natural ventilation from the current state is associated with 28% more cases of mortality and 13 to 38% more cases of the different morbidities, amounting to US$6.7 billion. The incurred cost is negligible compared to energy costs of air-conditioning. However, nationwide adoption of closed residences and air-conditioning is associated with fewer deaths (10% and 6%, respectively), fewer morbidity cases (8% and 4%, respectively) and economic savings of US$1.5 and 0.9 billion, respectively. The related savings were about 9 times the energy cost of air-conditioning. Nationwide adoption of mechanical ventilation and filtration from the current natural ventilation in schools is associated with fewer asthma hospital admissions and exacerbations, although the economic impact is not substantial. Enhanced workplace filtration reduces the mortality and morbidity

  12. Estimated reductions in cardiovascular and gastric cancer disease burden through salt policies in England: an IMPACTNCD microsimulation study

    Science.gov (United States)

    Guzman-Castillo, Maria; Hyseni, Lirije; Hickey, Graeme L; Bandosz, Piotr; Buchan, Iain; Capewell, Simon; O'Flaherty, Martin

    2017-01-01

    Objective: To estimate the impact and equity of existing and potential UK salt reduction policies on primary prevention of cardiovascular disease (CVD) and gastric cancer (GCa) in England. Design: A microsimulation study of a close-to-reality synthetic population. In the first period, 2003–2015, we compared the impact of current policy against a counterfactual ‘no intervention’ scenario, which assumed salt consumption persisted at 2003 levels. For 2016–2030, we assumed additional legislative policies could achieve a steeper salt decline and we compared this against the counterfactual scenario that the downward trend in salt consumption observed between 2001 and 2011 would continue up to 2030. Setting: Synthetic population with similar characteristics to the non-institutionalised population of England. Participants: Synthetic individuals with traits informed by the Health Survey for England. Main measure: CVD and GCa cases and deaths prevented or postponed, stratified by fifths of socioeconomic status using the Index of Multiple Deprivation. Results: Since 2003, current salt policies have prevented or postponed ∼52 000 CVD cases (IQR: 34 000–76 000) and 10 000 CVD deaths (IQR: 3000–17 000). In addition, the current policies have prevented ∼5000 new cases of GCa (IQR: 2000–7000) resulting in about 2000 fewer deaths (IQR: 0–4000). This policy did not reduce socioeconomic inequalities in CVD, and likely increased inequalities in GCa. Additional legislative policies from 2016 could further prevent or postpone ∼19 000 CVD cases (IQR: 8000–30 000) and 3600 deaths by 2030 (IQR: −400–8100) and may reduce inequalities. Similarly for GCa, 1200 cases (IQR: −200–3000) and 700 deaths (IQR: −900–2300) could be prevented or postponed with a neutral impact on inequalities. Conclusions: Current salt reduction policies are powerfully effective in reducing the CVD and GCa burdens overall but fail to reduce the inequalities involved

  13. Estimated reductions in cardiovascular and gastric cancer disease burden through salt policies in England: an IMPACTNCD microsimulation study.

    Science.gov (United States)

    Kypridemos, Chris; Guzman-Castillo, Maria; Hyseni, Lirije; Hickey, Graeme L; Bandosz, Piotr; Buchan, Iain; Capewell, Simon; O'Flaherty, Martin

    2017-01-24

    To estimate the impact and equity of existing and potential UK salt reduction policies on primary prevention of cardiovascular disease (CVD) and gastric cancer (GCa) in England. A microsimulation study of a close-to-reality synthetic population. In the first period, 2003-2015, we compared the impact of current policy against a counterfactual 'no intervention' scenario, which assumed salt consumption persisted at 2003 levels. For 2016-2030, we assumed additional legislative policies could achieve a steeper salt decline and we compared this against the counterfactual scenario that the downward trend in salt consumption observed between 2001 and 2011 would continue up to 2030. Synthetic population with similar characteristics to the non-institutionalised population of England. Synthetic individuals with traits informed by the Health Survey for England. CVD and GCa cases and deaths prevented or postponed, stratified by fifths of socioeconomic status using the Index of Multiple Deprivation. Since 2003, current salt policies have prevented or postponed ∼52 000 CVD cases (IQR: 34 000-76 000) and 10 000 CVD deaths (IQR: 3000-17 000). In addition, the current policies have prevented ∼5000 new cases of GCa (IQR: 2000-7000) resulting in about 2000 fewer deaths (IQR: 0-4000). This policy did not reduce socioeconomic inequalities in CVD, and likely increased inequalities in GCa. Additional legislative policies from 2016 could further prevent or postpone ∼19 000 CVD cases (IQR: 8000-30 000) and 3600 deaths by 2030 (IQR: -400-8100) and may reduce inequalities. Similarly for GCa, 1200 cases (IQR: -200-3000) and 700 deaths (IQR: -900-2300) could be prevented or postponed with a neutral impact on inequalities. Current salt reduction policies are powerfully effective in reducing the CVD and GCa burdens overall but fail to reduce the inequalities involved. Additional structural policies could achieve further, more equitable health benefits. Published by the BMJ

  14. OligoHeatMap (OHM): an online tool to estimate and display hybridizations of oligonucleotides onto DNA sequences.

    Science.gov (United States)

    Croce, Olivier; Chevenet, François; Christen, Richard

    2008-07-01

    The efficiency of molecular methods involving DNA/DNA hybridizations depends on the accurate prediction of the melting temperature (T(m)) of the duplex. Many software tools are available for T(m) calculations, but difficulties arise when one wishes to check whether a given oligomer (PCR primer or probe) hybridizes well to more than a single sequence. Moreover, the presence of mismatches within the duplex is not sufficient to estimate specificity, as it does not always significantly decrease the T(m). OHM (OligoHeatMap) is an online tool able to provide estimates of T(m) for a set of oligomers and a set of aligned sequences, not only as text files of complete results but also graphically: T(m) values are translated into colors and displayed as a heat map image, either stand-alone or for use by software such as TreeDyn to be included in a phylogenetic tree. OHM is freely available at http://bioinfo.unice.fr/ohm/, with links to the full source code and online help.
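
    As a rough illustration of the kind of screening OHM automates, the sketch below (Python) computes Wallace-rule T(m) estimates for a set of oligomers against several target sequences; the sequences, the crude mismatch handling and the Wallace rule itself are simplifications chosen for this example, not the algorithm used by OHM.

```python
# Minimal sketch of Tm screening across several targets (not the OHM algorithm).
# Uses the simple Wallace rule for short oligos; real tools use nearest-neighbour models.
def wallace_tm(oligo):
    """Tm (degrees C) by the Wallace rule: 2*(A+T) + 4*(G+C)."""
    o = oligo.upper()
    return 2 * (o.count("A") + o.count("T")) + 4 * (o.count("G") + o.count("C"))

def tm_matrix(oligos, sequences):
    """Return a Tm estimate for each oligo against each aligned target region.

    For illustration, a mismatched position simply loses its contribution,
    a crude stand-in for the Tm drop that OHM models more carefully.
    """
    matrix = []
    for oligo in oligos:
        row = []
        for seq in sequences:
            target = seq[: len(oligo)].upper()
            matched = "".join(b for b, t in zip(oligo.upper(), target) if b == t)
            row.append(wallace_tm(matched))
        matrix.append(row)
    return matrix

if __name__ == "__main__":
    probes = ["ACGTGCAT", "GGCCATTA"]        # hypothetical oligomers
    targets = ["ACGTGCATTTTT", "ACGAGCATTTTT"]  # hypothetical aligned targets
    for probe, row in zip(probes, tm_matrix(probes, targets)):
        print(probe, row)
```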

  15. ErpICASSO: a tool for reliability estimates of independent components in EEG event-related analysis.

    Science.gov (United States)

    Artoni, Fiorenzo; Gemignani, Angelo; Sebastiani, Laura; Bedini, Remo; Landi, Alberto; Menicucci, Danilo

    2012-01-01

    Independent component analysis and blind source separation methods are steadily gaining popularity for separating individual brain and non-brain source signals mixed by volume conduction in electroencephalographic data. Despite advances in these techniques, determining the number of embedded sources and their reliability are still open issues. In particular, to date no method takes trial-to-trial variability into account in order to provide a reliability measure of the independent components extracted in Event Related Potential (ERP) studies. In this work we present ErpICASSO, a new method which modifies a data-driven approach named ICASSO for the analysis of trials (epochs). In addition to ICASSO, the method enables the user to estimate the number of embedded sources, and provides a quality index of each extracted ERP component by combining trial-to-trial bootstrapping and CCA projection. We applied ErpICASSO to ERPs recorded from 14 subjects presented with unpleasant and neutral pictures. We separated potentials putatively related to different systems and identified the four primary ERP independent sources. Based on the confidence intervals estimated by ErpICASSO, we were able to compare the components between neutral and unpleasant conditions. ErpICASSO yielded encouraging results, thus providing the scientific community with a useful tool for ICA signal processing whenever dealing with trials recorded in different conditions.

  16. A novel GIS-based tool for estimating present-day ocean reference depth using automatically processed gridded bathymetry data

    Science.gov (United States)

    Jurecka, Mirosława; Niedzielski, Tomasz; Migoń, Piotr

    2016-05-01

    This paper presents a new method for computing the present-day value of the reference depth (dr), which is essential input information for the assessment of past sea-level changes. The method applies a novel automatic geoprocessing tool developed using a Python script and ArcGIS, and uses recent data on ocean floor depth, sediment thickness, and the age of the oceanic crust. The procedure is multi-step and involves creation of a bathymetric dataset corrected for sediment loading and isostasy, delineation of subduction zones, computation of perpendicular sea-floor profiles, and statistical analysis of these profiles versus crust age. The analysis of site-specific situations near subduction zones all around the world shows a number of instances where the depth of the oceanic crust stabilizes at a certain level before reaching the subduction zone, and this occurs at depths much lower than proposed in previous approaches to the reference depth issue. An analysis of Jurassic and Cretaceous oceanic lithosphere shows that the most probable interval at which the reference depth occurs is 5300-5800 m. This interval is broadly consistent with dr estimates determined using the Global Depth-Heatflow model (GDH1), but is significantly lower than dr estimates calculated on the basis of the Parsons-Sclater Model (PSM).

  17. PHACCS, an online tool for estimating the structure and diversity of uncultured viral communities using metagenomic information

    Directory of Open Access Journals (Sweden)

    Salamon Peter

    2005-03-01

    Full Text Available Abstract Background Phages, viruses that infect prokaryotes, are the most abundant microbes in the world. A major limitation to studying these viruses is the difficulty of cultivating the appropriate prokaryotic hosts. One way around this limitation is to directly clone and sequence shotgun libraries of uncultured viral communities (i.e., metagenomic analyses). PHACCS (http://phage.sdsu.edu/phaccs, Phage Communities from Contig Spectrum) is an online bioinformatic tool to assess the biodiversity of uncultured viral communities. PHACCS uses the contig spectrum from shotgun DNA sequence assemblies to mathematically model the structure of viral communities and make predictions about diversity. Results PHACCS builds models of possible community structure using a modified Lander-Waterman algorithm to predict the underlying contig spectrum. PHACCS finds the most appropriate structure model by optimizing the model parameters until the predicted contig spectrum is as close as possible to the experimental one. This model is the basis for making estimates of uncultured viral community richness, evenness, diversity index and abundance of the most abundant genotype. Conclusion PHACCS analysis of four different environmental phage communities suggests that the power law is an important rank-abundance form to describe uncultured viral community structure. The estimates support the fact that the four phage communities were extremely diverse and that phage community biodiversity and structure may be correlated with that of their hosts.
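
    For illustration only, the following Python sketch shows the descriptive quantities PHACCS reports (richness, evenness, Shannon index, most abundant genotype) for an assumed power-law rank-abundance community; it does not implement the modified Lander-Waterman fitting of the contig spectrum that PHACCS actually performs, and all parameter values are hypothetical.

```python
import math

# Illustrative only: summary statistics for a hypothetical power-law community.
def power_law_abundances(richness, alpha):
    """Relative abundance of the genotype at rank r, proportional to r**(-alpha)."""
    raw = [r ** (-alpha) for r in range(1, richness + 1)]
    total = sum(raw)
    return [x / total for x in raw]

def shannon_index(p):
    """Shannon diversity index H' of a relative-abundance distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

abund = power_law_abundances(richness=1000, alpha=1.2)   # hypothetical parameters
H = shannon_index(abund)
print(f"richness={len(abund)}, Shannon H'={H:.2f}, "
      f"evenness={H / math.log(len(abund)):.2f}, "
      f"most abundant genotype={abund[0]:.3%}")
```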

  18. Using the soil and water assessment tool to estimate dissolved inorganic nitrogen water pollution abatement cost functions in central portugal.

    Science.gov (United States)

    Roebeling, P C; Rocha, J; Nunes, J P; Fidélis, T; Alves, H; Fonseca, S

    2014-01-01

    Coastal aquatic ecosystems are increasingly affected by diffuse source nutrient water pollution from agricultural activities in coastal catchments, even though these ecosystems are important from a social, environmental and economic perspective. To warrant sustainable economic development of coastal regions, we need to balance marginal costs from coastal catchment water pollution abatement and associated marginal benefits from coastal resource appreciation. Diffuse-source water pollution abatement costs across agricultural sectors are not easily determined given the spatial heterogeneity in biophysical and agro-ecological conditions as well as the available range of best agricultural practices (BAPs) for water quality improvement. We demonstrate how the Soil and Water Assessment Tool (SWAT) can be used to estimate diffuse-source water pollution abatement cost functions across agricultural land use categories based on a stepwise adoption of identified BAPs for water quality improvement and corresponding SWAT-based estimates for agricultural production, agricultural incomes, and water pollution deliveries. Results for the case of dissolved inorganic nitrogen (DIN) surface water pollution by the key agricultural land use categories ("annual crops," "vineyards," and "mixed annual crops & vineyards") in the Vouga catchment in central Portugal show that no win-win agricultural practices are available within the assessed BAPs for DIN water quality improvement. Estimated abatement costs increase quadratically in the rate of water pollution abatement, with the largest abatement costs for the "mixed annual crops & vineyards" land use category (between 41,900 and 51,900 € tDIN(-1) yr(-1)) and fairly similar abatement costs across the "vineyards" and "annual crops" land use categories (between 7,300 and 15,200 € tDIN(-1) yr(-1)).
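
    A minimal sketch of the kind of quadratic abatement-cost curve reported above, fitted in Python to hypothetical SWAT-style points (abatement fraction versus forgone income); the numbers are invented for illustration and do not come from the Vouga catchment study.

```python
import numpy as np

# Hypothetical points: fraction of the DIN load abated vs. abatement cost (€ per tonne DIN per year).
abatement = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])
cost = np.array([0.0, 900.0, 3400.0, 7600.0, 13800.0, 21500.0])

# Fit cost(a) ≈ c2*a^2 + c1*a + c0 and interpolate a new abatement level.
coeffs = np.polyfit(abatement, cost, deg=2)
print("fitted quadratic coefficients (c2, c1, c0):", coeffs)
print("predicted cost at 35% abatement:", np.polyval(coeffs, 0.35))
```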

  19. Construction of estimated flow- and load-duration curves for Kentucky using the Water Availability Tool for Environmental Resources (WATER)

    Science.gov (United States)

    Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.

    2012-01-01

    Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that calibration criteria, established by previous WATER application reports, were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow*Water-quality criterion) at each flow interval.
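
    The construction described above reduces to two simple operations, sketched below in Python with synthetic flows and a hypothetical water-quality criterion: ranking modeled daily flows to obtain exceedance probabilities, and multiplying each flow interval by the criterion to obtain the load-duration curve.

```python
import numpy as np

# Stand-in for ~60 years of modeled daily flows (synthetic, lognormal for illustration).
daily_flow = np.random.lognormal(mean=2.0, sigma=0.8, size=365 * 60)

flows_sorted = np.sort(daily_flow)[::-1]                  # highest discharge first
n = flows_sorted.size
exceedance = 100.0 * np.arange(1, n + 1) / (n + 1)        # percent of time each flow is exceeded

criterion = 0.35                                          # hypothetical water-quality criterion
load_duration = flows_sorted * criterion                  # Load = Flow * criterion at each interval

for pct in (10, 50, 90):
    idx = np.searchsorted(exceedance, pct)
    print(f"{pct}% exceedance: flow={flows_sorted[idx]:.1f}, allowable load={load_duration[idx]:.1f}")
```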

  1. Distance Learning as a Tool for Poverty Reduction and Economic Development: A Focus on China and Mexico

    Science.gov (United States)

    Larson, Richard C.; Murray, M. Elizabeth

    2008-01-01

    This paper uses case studies to focus on distance learning in developing countries as an enabler for economic development and poverty reduction. To provide perspective, we first review the history of telecottages, local technology-equipped facilities to foster community-based learning, which have evolved into "telecenters" or…

  2. Review of Tools for Energy Use and Greenhouse Gas Reduction Applicable to Small- and Medium-Sized Communities

    Science.gov (United States)

    Considerable attention has already been given to ways of assessing and reducing transportation energy use and greenhouse gas emissions in large cities. This presentation provides a review of what tools, if any, may be used to assess and reduce transportation energy use and green...

  3. Reduction of potassium content of green bean pods and chard by culinary processing. Tools for chronic kidney disease

    Directory of Open Access Journals (Sweden)

    Montserrat Martínez-Pineda

    2016-07-01

    Conclusion: The results shown in this study are very positive because they provide tools for professionals who deal with this kind of patient, allowing them to adapt more easily to the needs and preferences of their patients and to increase dietary variety.

  4. Biomechanical Evaluation of the Vertebral Jack Tool and the Inflatable Bone Tamp for Reduction of Osteoporotic Spine Fractures

    NARCIS (Netherlands)

    Sietsma, Maurits S.; Hosman, Allard J. F.; Verdonschot, N. J. J.; Aalsma, Arthur M. M.; Veldhuizen, Albert G.

    2009-01-01

    Study Design. Controlled in vitro study. Objective. To compare two kyphoplasty techniques in cadaveric fractured vertebrae: an experimental vertebral jack tool (VJT) and an inflatable bone tamp (IBT). Summary of Background Data. A previous biomechanical study showed restored strength and stiffness a

  5. Cost reduction in the production process using the ABC and Lean tools: Case Study in the refrigeration components industry

    Directory of Open Access Journals (Sweden)

    Levi da Silva Guimarães

    2015-03-01

    Full Text Available This paper focuses on production management with respect to operating costs that relate directly to the value of the product. For this study, three methods were used: ABC - Activity Based Costing, which provides accurate information about the real costs; VSM - Value Stream Mapping; and Lean Manufacturing. The method adopted for this research was the case study. The study was conducted at a refrigeration components company in the Industrial Center of Manaus. The analyses and observations initially went through the process of mapping the value stream, measuring the current state of activities (cycle time, setup, etc.). After analysis it was possible to map the cost for each activity and finally calculate the cost of the product before and after the improvements resulting from the lean methodology. The results obtained in this study showed a 20% reduction in product costs resulting from operational improvements. The activity-based cost led to a discovery of the real costs of waste. The steps for this study include process mapping through the value stream, measuring the current state of activities (cycle time, setup, etc.), establishing the cost driver for each activity, and finally calculating the cost of the product before and after the application of lean improvements. The paper was conducted through literature and descriptive review, and used a case study method. It describes the model that has been tested in a production line for a refrigeration components company from the Manaus Industrial Center, achieving a 20% reduction in product cost.

  6. Tourism as a Poverty Reduction Tool: The Case of Mukuni Village in the Southern Province of Zambia

    Directory of Open Access Journals (Sweden)

    Miroslav Horák

    2014-01-01

    Full Text Available Globally, tourism is becoming one of the cornerstones of national economic growth and a means of poverty alleviation, especially in tourist attractions in rural areas. This article assesses the levels of utilization of tourism potentials in Zambia in general, and in the Mukuni village in the Southern province of Zambia in particular, with reference to poverty reduction. The world-famous Victoria Falls is situated in the Southern province, and therefore this area is one of the most visited places in Zambia and attracts many tourists throughout the whole year. The main income of the local people, who include the Tonga tribe, comes from tourism. Even though tourism has brought positive results, including the realization of some local development projects and prosperity to the people, it has also brought some negative effects such as sociocultural change, pollution and waste in the tourist destination areas in Zambia. For the Mukuni people and Zambia as a whole to fully exploit tourism potentials, stricter laws preventing the destruction of the environment and preserving the culture of the indigenous people should be enforced in the tourist destination areas. The government should use the levy from tourism to provide better infrastructure, create job opportunities and create wealth within the tourist areas for sustainable tourism development and poverty reduction.

  7. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

    of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps on a Google map are produced for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or as a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, the automatic discrimination between local earthquakes that occurred within the network and regional/teleseismic events that occurred outside the network is performed. Finally, for the largest events, if a consistent number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with the event location and stations, as well as a table listing all the events with the associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded into the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level analysis of seismic data, and it may represent a relevant tool not only for seismologists, but also for non

  8. Accounting for density reduction and structural loss in standing dead trees: Implications for forest biomass and carbon stock estimates in the United States

    Directory of Open Access Journals (Sweden)

    Domke Grant M

    2011-11-01

    Full Text Available Abstract Background Standing dead trees are one component of forest ecosystem dead wood carbon (C) pools, whose national stock is estimated by the U.S. as required by the United Nations Framework Convention on Climate Change. Historically, standing dead tree C has been estimated as a function of live tree growing stock volume in the U.S.'s National Greenhouse Gas Inventory. Initiated in 1998, the USDA Forest Service's Forest Inventory and Analysis program (responsible for compiling the Nation's forest C estimates) began consistent nationwide sampling of standing dead trees, which may now supplant previous purely model-based approaches to standing dead biomass and C stock estimation. A substantial hurdle to estimating standing dead tree biomass and C attributes is that traditional estimation procedures are based on merchantability paradigms that may not reflect density reductions or structural loss due to decomposition common in standing dead trees. The goal of this study was to incorporate standing dead tree adjustments into the current estimation procedures and assess how biomass and C stocks change at multiple spatial scales. Results Accounting for decay and structural loss in standing dead trees significantly decreased tree- and plot-level C stock estimates (and subsequent C stocks by decay class and tree component). At a regional scale, incorporating adjustment factors decreased standing dead quaking aspen biomass estimates by almost 50 percent in the Lake States and Douglas-fir estimates by more than 36 percent in the Pacific Northwest. Conclusions Substantial overestimates of standing dead tree biomass and C stocks occur when one does not account for density reductions or structural loss. Forest inventory estimation procedures that are descended from merchantability standards may need to be revised toward a more holistic approach to determining standing dead tree biomass and C attributes (i.e., attributes of tree biomass outside of sawlog
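
    A minimal sketch of the adjustment idea discussed above: a merchantability-based biomass estimate is scaled by decay-class density-reduction and structural-loss factors. The factor values below are placeholders for illustration, not the study's published adjustments.

```python
# Hypothetical adjustment factors by decay class (1 = least decayed, 5 = most decayed).
DENSITY_REDUCTION = {1: 0.97, 2: 0.90, 3: 0.75, 4: 0.60, 5: 0.45}
STRUCTURAL_LOSS = {1: 0.99, 2: 0.95, 3: 0.85, 4: 0.70, 5: 0.55}   # tops and branches lost

def standing_dead_biomass(live_equivalent_kg, decay_class):
    """Adjusted biomass (kg) for a standing dead tree of a given decay class."""
    return (live_equivalent_kg
            * DENSITY_REDUCTION[decay_class]
            * STRUCTURAL_LOSS[decay_class])

tree_kg = 850.0   # hypothetical merchantability-based estimate for one stem
for dc in range(1, 6):
    print(f"decay class {dc}: {standing_dead_biomass(tree_kg, dc):.0f} kg")
```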

  9. Distance Learning as a Tool for Poverty Reduction and Economic Development: A Focus on China and Mexico

    Science.gov (United States)

    Larson, Richard C.; Murray, M. Elizabeth

    2008-04-01

    This paper uses case studies to focus on distance learning in developing countries as an enabler for economic development and poverty reduction. To provide perspective, we first review the history of telecottages, local technology-equipped facilities to foster community-based learning, which have evolved into "telecenters" or "Community Learning Centers" (CLCs). Second, we describe extensive site visits to CLCs in impoverished portions of China and Mexico, the centers operated by premier universities in each respective country. These CLCs constitute the core of new emerging systems of distance education, and their newness poses challenges and opportunities, which are discussed. Finally, we offer 12 points to develop further the concept and reality of distance learning in support of economic development.

  10. Normalized Difference Vegetation Index as a Tool for Wheat Yield Estimation: A Case Study from Faisalabad, Pakistan

    Directory of Open Access Journals (Sweden)

    Syeda Refat Sultana

    2014-01-01

    Full Text Available For estimation of grain yield in wheat, the Normalized Difference Vegetation Index (NDVI) is considered a potential screening tool. Field experiments were conducted to scrutinize the response of NDVI to the yield behavior of different wheat cultivars and nitrogen fertilization at the agronomic research area, University of Agriculture Faisalabad (UAF), during the two years 2008-09 and 2009-10. For recording the value of NDVI, a GreenSeeker (Handheld-505) was used. A split plot design was used as the experimental model, keeping four nitrogen rates (N1 = 0 kg ha−1, N2 = 55 kg ha−1, N3 = 110 kg ha−1, and N4 = 220 kg ha−1) in main plots and ten wheat cultivars (Bakkhar-2001, Chakwal-50, Chakwal-97, Faisalabad-2008, GA-2002, Inqlab-91, Lasani-2008, Miraj-2008, Sahar-2006, and Shafaq-2006) in subplots, with four replications. The impact of nitrogen and the differences between cultivars were forecast through NDVI. The results suggested that nitrogen treatment N4 (220 kg ha−1) and cultivar Faisalabad-2008 gave the maximum NDVI value (0.85) at the grain filling stage among all treatments. The correlation of NDVI at the booting, grain filling, and maturity stages with grain yield was positive (R2 = 0.90; R2 = 0.90; R2 = 0.95, respectively). So, booting, grain filling, and maturity can be good depictive stages during the mid and later growth stages of the wheat crop under the agroclimatic conditions of Faisalabad and under similar other wheat growing environments in the country.
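
    The two steps behind the study, computing NDVI from red and near-infrared reflectance and regressing grain yield on NDVI at a given growth stage, can be sketched as follows; the reflectance and yield values are invented for illustration, whereas the paper's R2 values came from the field data.

```python
import numpy as np

# Hypothetical plot-level reflectances and yields (not the Faisalabad data).
red = np.array([0.08, 0.10, 0.12, 0.09, 0.11])
nir = np.array([0.52, 0.46, 0.40, 0.55, 0.43])
ndvi = (nir - red) / (nir + red)                       # NDVI = (NIR - Red) / (NIR + Red)

yield_t_ha = np.array([4.8, 4.1, 3.5, 5.0, 3.9])       # hypothetical grain yields (t/ha)

slope, intercept = np.polyfit(ndvi, yield_t_ha, deg=1) # simple linear yield model
pred = slope * ndvi + intercept
r2 = 1 - np.sum((yield_t_ha - pred) ** 2) / np.sum((yield_t_ha - yield_t_ha.mean()) ** 2)
print(f"NDVI range {ndvi.min():.2f}-{ndvi.max():.2f}, fitted R^2 = {r2:.2f}")
```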

  11. Migration modelling from food-contact plastics into foodstuffs as a new tool for consumer exposure estimation.

    Science.gov (United States)

    Franz, R

    2005-10-01

    One important aspect within the European Union's public healthcare is the exposure of consumers to undesirable chemicals in the diet. Food-contact materials (FCM) are one potential contamination source and therefore of particular interest for food exposure assessment. On the other hand, scientific investigations concerning the migration potential and behaviour of food-packaging materials have demonstrated that diffusion in and migration from FCM are foreseeable physical and, in principle, mathematically describable processes. Because of this situation and the current state-of-the-art in migration science, a research project was initiated within the 5th Framework Programme of the European Commission. This project, with the acronym 'FOODMIGROSURE' (European Union Contract No. 'QLK1-CT2002-2390'), started on 1 March 2003, was due to last 3 years and had the participation of nine European project partners (see the project website: www.foodmigrosure.org). The aim of the project was to extend currently existing migration models (which have been demonstrated to be applicable for less complex matrices such as food simulants) to foodstuffs themselves. In this way, the project aims to provide a novel and economic tool for estimation of consumer exposure to chemicals migrating from food-contact plastic materials under any actual contact conditions. In addition, the project aims to increase knowledge of the mechanisms of diffusion of organic compounds in foodstuffs and provide data on the partitioning effects between FCM and foods. Today the latter aspect is increasingly regarded as a fundamental influence parameter for migration into foods. Based on the project achievements, a much better scientific basis is available to allow scientifically appropriate amendments of European Union Directive 85/572/EEC as well as to support further developments with the so-called Plastics Directive 2002/72/EC. The paper introduces the project and presents an overview of the project work

  12. A GIS-based tool for estimating tree canopy cover on fixed-radius plots using high-resolution aerial imagery

    Science.gov (United States)

    Sara A. Goeking; Greg C. Liknes; Erik Lindblom; John Chase; Dennis M. Jacobs; Robert. Benton

    2012-01-01

    Recent changes to the Forest Inventory and Analysis (FIA) Program's definition of forest land precipitated the development of a geographic information system (GIS)-based tool for efficiently estimating tree canopy cover for all FIA plots. The FIA definition of forest land has shifted from a density-related criterion based on stocking to a 10 percent tree canopy...

  13. Computational tools for mechanistic discrimination in the reductive and metathesis coupling reactions mediated by titanium(IV) isopropoxide

    Indian Academy of Sciences (India)

    Akshai Kumar; Ashoka G Samuelson

    2012-11-01

    A theoretical study has been carried out at the B3LYP/LANL2DZ level to compare the reactivity of phenyl isocyanate and phenyl isothiocyanate towards titanium(IV) alkoxides. Isocyanates are shown to favour both mono insertion and double insertion reactions. Double insertion in a head-to-tail fashion is shown to be more exothermic than double insertion in a head-to-head fashion. The head-to-head double insertion leads to the metathesis product, a carbodiimide, after the extrusion of carbon dioxide. In the case of phenyl isothiocyanate, calculations favour the formation of only mono insertion products. Formation of a double insertion product is highly unfavourable. Further, these studies indicate that the reverse reaction involving the metathesis of N,N'-diphenyl carbodiimide with carbon dioxide is likely to proceed more efficiently than the metathesis reaction with carbon disulphide. This is in excellent agreement with experimental results as metathesis with carbon disulphide fails to occur. In a second study, multilayer MM/QM calculations are carried out on intermediates generated from reduction of titanium(IV) alkoxides to investigate the effect of alkoxy bridging on the reactivity of multinuclear Ti species. Bimolecular coupling of imines initiated by Ti(III) species leads to a mixture of diastereomers and not diastereoselective coupling of the imine. However if the reaction is carried out by a trimeric biradical species, diastereoselective coupling of the imine is predicted. The presence of alkoxy bridges greatly favours the formation of the d,l (±) isomer, whereas the intermediate without alkoxy bridges favours the more stable meso isomer. As a bridged trimeric species, stabilized by bridging alkoxy groups, correctly explains the diastereoselective reaction, it is the most likely intermediate in the reaction.

  14. Application of watershed deposition tool to estimate from CMAQ simulations the atmospheric deposition of nitrogen to Tampa Bay and its watershed.

    Science.gov (United States)

    Poor, Noreen D; Pribble, J Raymond; Schwede, Donna B

    2013-01-01

    The U.S. Environmental Protection Agency (EPA) has developed the Watershed Deposition Tool (WDT) to calculate from the Community Multiscale Air Quality (CMAQ) model output the nitrogen, sulfur and mercury deposition rates to watersheds and their sub-basins. The CMAQ model simulates from first principles the transport, transformation, and removal of atmospheric pollutants. We applied WDT to estimate the atmospheric deposition of reactive nitrogen (N) to Tampa Bay and its watershed. For 2002 and within the boundaries of Tampa Bay's watershed, modeled atmospheric deposition rates averaged 13.3 kg N ha(-1) yr(-1) and ranged from 6.24 kg N ha(-1) yr(-1) at the bay's boundary with the Gulf of Mexico to 21.4 kg N ha(-1) yr(-1) near Tampa's urban core, based on a 12-km x 12-km grid cell size. CMAQ-predicted loading rates were 1,080 metric tons N yr(-1) to Tampa Bay and 8,280 metric tons N yr(-1) to the land portion of its watershed. If we assume a watershed-to-bay transfer rate of 18% for indirect loading, our estimates of the 2002 direct and indirect loading rates to Tampa Bay were 1,080 metric tons N and 1,490 metric tons N, respectively, for an atmospheric loading of 2,570 metric tons N or 71% of the total N loading to Tampa Bay. To evaluate the potential impact of the U.S. EPA Clean Air Interstate Rule (CAIR, replaced with the Cross-State Air Pollution Rule), Tier 2 Vehicle and Gasoline Sulfur Rules, Heavy Duty Highway Rule, and Non-Road Diesel Rule, we compared CMAQ outputs between 2020 and 2002 simulations, with only the emissions inventories changed. The CMAQ-projected change in atmospheric loading rates between these emissions inventories was 857 metric tons N to Tampa Bay, or about 24% of the 2002 loading of 3,640 metric tons N to Tampa Bay from all sources. Air quality modeling reveals that atmospheric deposition of reactive nitrogen (N) contributes a significant fraction to Tampa Bay's total N loading from external sources. Regulatory drivers that lower nitrogen oxide
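
    The loading arithmetic reported above can be reproduced in a few lines; the only assumption carried over is the 18% watershed-to-bay transfer rate stated in the abstract.

```python
# Back-of-envelope reproduction of the loading figures quoted above.
direct_to_bay = 1080.0        # metric tons N/yr deposited directly on the bay
to_watershed = 8280.0         # metric tons N/yr deposited on the land portion of the watershed
transfer_rate = 0.18          # assumed fraction of watershed deposition reaching the bay

indirect = to_watershed * transfer_rate
atmospheric_total = direct_to_bay + indirect
print(f"indirect load ≈ {indirect:.0f} t N/yr, atmospheric total ≈ {atmospheric_total:.0f} t N/yr")

# With a total external load of ~3,640 t N/yr (2002), the atmospheric share is:
print(f"atmospheric share ≈ {atmospheric_total / 3640.0:.0%}")
```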

  15. Energy Saving Melting and Revert Reduction Technology (E-SMARRT): Predicting Pattern Tooling and Casting Dimension for Investment Casting

    Energy Technology Data Exchange (ETDEWEB)

    Nick Cannell; Dr. Mark Samonds; Adi Sholapurwalla; Sam Scott

    2008-11-21

    The investment casting process is an expendable mold process where wax patterns of the part and rigging are molded, assembled, shelled and melted to produce a ceramic mold matching the shape of the component to be cast. Investment casting is an important manufacturing method for critical parts because of the ability to maintain dimensional shape and tolerances. However, these tolerances can be easily exceeded if the molding components do not maintain their individual shapes well. In the investment casting process there are several opportunities for the final casting to not maintain the intended size and shape, such as shrinkage of the wax in the injection tool, modification of the shape during shell heating, and thermal shrink and distortion in the casting process. Studies have been completed to look at the casting and shell distortions through the process in earlier phases of this project. Dr. Adrian Sabau at Oak Ridge National Labs performed characterizations and validations of 17-4 PH stainless steel in primarily fused silica shell systems with good agreement between analysis results and experimental data. Further tasks provided material property measurements of wax and a methodology for employing a viscoelastic definition of wax materials in software. The final set of tasks involved the implementation of the findings into the commercial casting analysis software ProCAST, owned and maintained by ESI Group. This included: (1) the transfer of the wax material property data from its raw form into separate temperature-dependent thermophysical and mechanical property datasets; (2) adding this wax material property data into an easily viewable and modifiable user interface within the pre-processing application of the ProCAST suite, namely PreCAST; and (3) validating the data and viscoelastic wax model with respect to experimental results

  16. Maximum likelihood estimate of life expectancy in the prehistoric Jomon: Canine pulp volume reduction suggests a longer life expectancy than previously thought.

    Science.gov (United States)

    Sasaki, Tomohiko; Kondo, Osamu

    2016-09-01

    Recent theoretical progress potentially refutes past claims that paleodemographic estimations are flawed by statistical problems, including age mimicry and sample bias due to differential preservation. The life expectancy at age 15 of the Jomon period prehistoric populace in Japan was initially estimated to have been ∼16 years, while a more recent analysis suggested 31.5 years. In this study, we provide alternative results based on a new methodology. The material comprises 234 mandibular canines from Jomon period skeletal remains and a reference sample of 363 mandibular canines of recent-modern Japanese. Dental pulp reduction is used as the age indicator, which, because of tooth durability, is presumed to minimize the effect of differential preservation. Maximum likelihood estimation, which theoretically avoids age mimicry, was applied. Our methods also adjusted for the known pulp volume reduction rate among recent-modern Japanese to provide a better fit for observations in the Jomon period sample. Without adjustment for the known rate of pulp volume reduction, estimates of Jomon life expectancy at age 15 were dubiously long. However, when the rate was adjusted, the estimate results in a value that falls within the range of modern hunter-gatherers, with significantly better fit to the observations. The rate-adjusted result of 32.2 years more likely represents the true life expectancy of the Jomon people at age 15 than the result without adjustment. Considering the ∼7% rate of antemortem loss of the mandibular canine observed in our Jomon period sample, actual life expectancy at age 15 may have been as high as ∼35.3 years. © 2016 Wiley Periodicals, Inc.

  17. A Modelling Approach to Estimate the Impact of Sodium Reduction in Soups on Cardiovascular Health in the Netherlands

    Directory of Open Access Journals (Sweden)

    Maaike J. Bruins

    2015-09-01

    Full Text Available Hypertension is a major modifiable risk factor for cardiovascular disease and mortality, which could be lowered by reducing dietary sodium. The potential health impact of a product reformulation in the Netherlands was modelled, selecting packaged soups containing on average 25% less sodium as an example of an achievable product reformulation when implemented gradually. First, the blood pressure lowering resulting from the sodium intake reduction was modelled. Second, the predicted blood pressure lowering was translated into potentially preventable incidence and mortality cases from stroke, acute myocardial infarction (AMI), angina pectoris, and heart failure (HF), implementing one year of salt reduction. Finally, the potentially preventable subsequent lifetime Disability-Adjusted Life Years (DALYs) were calculated. The sodium reduction in soups might potentially reduce the incidence and mortality of stroke by approximately 0.5%, AMI and angina by 0.3%, and HF by 0.2%. The related burden of disease could be reduced by approximately 800 lifetime DALYs. This modelling approach can be used to provide insight into the potential public health impact of sodium reduction in specific food products. The data demonstrate that an achievable food product reformulation to reduce sodium can potentially benefit public health, albeit modestly. When implemented across multiple product categories and countries, a significant health impact could be achieved.
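
    A highly simplified sketch of the modelling chain described above (sodium reduction, blood-pressure change, relative-risk change, preventable cases); every input value below is an illustrative assumption, not one of the paper's parameters.

```python
# Illustrative modelling chain: salt reduction -> SBP drop -> relative risk -> prevented cases.
salt_reduction_g_day = 0.25          # assumed average salt reduction from reformulated soups (g/day)
sbp_drop_per_g = 1.0                 # assumed mmHg systolic drop per g/day of salt reduction
rr_stroke_per_mmhg = 0.97            # assumed relative risk of stroke per 1 mmHg lower SBP
baseline_stroke_cases = 40000        # hypothetical annual stroke incidence in the population

sbp_drop = salt_reduction_g_day * sbp_drop_per_g
relative_risk = rr_stroke_per_mmhg ** sbp_drop
prevented = baseline_stroke_cases * (1 - relative_risk)
print(f"SBP drop ≈ {sbp_drop:.2f} mmHg, prevented stroke cases ≈ {prevented:.0f} "
      f"({100 * (1 - relative_risk):.2f}% reduction)")
```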

  18. Structural correlation method for model reduction and practical estimation of patient specific parameters illustrated on heart rate regulation

    DEFF Research Database (Denmark)

    Ottesen, Johnny T.; Mehlsen, Jesper; Olufsen, Mette

    2014-01-01

    We consider the inverse and patient specific problem of short term (seconds to minutes) heart rate regulation specified by a system of nonlinear ODEs and corresponding data. We show how a recent method termed the structural correlation method (SCM) can be used for model reduction and for obtaining...

  19. Reduction in patient burdens with graphical computerized adaptive testing on the ADL scale: tool development and simulation

    Directory of Open Access Journals (Sweden)

    Wang Weng-Chung

    2009-05-01

    Full Text Available Abstract Background The aim of this study was to verify the effectiveness and efficacy of saving time and reducing burden for patients, nurses, and even occupational therapists through computer adaptive testing (CAT). Methods Based on an item bank of the Barthel Index (BI) and the Frenchay Activities Index (FAI) for assessing comprehensive activities of daily living (ADL) function in stroke patients, we developed a visual basic application (VBA-Excel CAT module), and (1) investigated through simulation whether the average test length via CAT is shorter than that of the traditional all-item-answered non-adaptive testing (NAT) approach, (2) illustrated the CAT multimedia on a tablet PC showing data collection and response errors of ADL clinical functional measures in stroke patients, and (3) demonstrated the quality control of the endorsing scale with fit statistics to detect responding errors, which will be immediately reconfirmed by technicians once the patient ends the CAT assessment. Results The results show that the number of endorsed items could be smaller on CAT (M = 13.42) than on NAT (M = 23), a 41.64% reduction in test length. However, the averaged ability estimations reveal insignificant differences between CAT and NAT. Conclusion This study found that mobile nursing services, placed at the bedsides of patients, could, through the programmed VBA-Excel CAT module, reduce the burden on patients and save time, more so than traditional NAT paper-and-pencil testing appraisals.
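
    A minimal sketch of a Rasch-model CAT loop of the kind compared with NAT above; the item difficulties, stopping rule and ability-update step are invented for illustration and do not reproduce the BI/FAI item bank or the VBA-Excel module.

```python
import math
import random

def prob_correct(theta, b):
    """Rasch probability of endorsing an item of difficulty b at ability theta."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def run_cat(true_theta, item_bank, max_items=15, se_target=0.4):
    """Adaptive loop: pick the most informative unused item, update the ability estimate."""
    theta, administered, responses = 0.0, [], []
    while len(administered) < max_items:
        # choose the unused item whose difficulty is closest to the current ability estimate
        b = min((i for i in item_bank if i not in administered),
                key=lambda i: abs(i - theta))
        administered.append(b)
        responses.append(random.random() < prob_correct(true_theta, b))
        # one Newton-Raphson step per item for the maximum-likelihood ability estimate
        info = sum(prob_correct(theta, i) * (1 - prob_correct(theta, i)) for i in administered)
        score = sum(r - prob_correct(theta, i) for r, i in zip(responses, administered))
        theta += score / info
        if 1.0 / math.sqrt(info) < se_target:   # stop once the standard error is small enough
            break
    return theta, len(administered)

random.seed(1)
bank = [x / 4.0 for x in range(-12, 13)]        # 25 items with difficulties from -3 to +3
est, used = run_cat(true_theta=0.8, item_bank=bank)
print(f"estimated ability {est:.2f} using {used} of {len(bank)} items")
```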

  20. User’s Guide for T.E.S.T. (version 4.2) (Toxicity Estimation Software Tool) A Program to Estimate Toxicity from Molecular Structure

    Science.gov (United States)

    The user's guide describes the methods used by TEST to predict toxicity and physical properties (including the new mode of action based method used to predict acute aquatic toxicity). It describes all of the experimental data sets included in the tool. It gives the prediction res...

  1. Twitter as a Potential Disaster Risk Reduction Tool. Part II: Descriptive Analysis of Identified Twitter Activity during the 2013 Hattiesburg F4 Tornado.

    Science.gov (United States)

    Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M; Subbarao, Italo

    2015-06-29

    This article describes a novel triangulation methodological approach for identifying the Twitter activity of regionally active Twitter users during the 2013 Hattiesburg EF-4 Tornado. A data extraction and geographically centered filtration approach was utilized to generate Twitter data for 48 hours pre- and post-tornado. The data were further validated using a six sigma approach utilizing GPS data. The regional analysis revealed a total of 81,441 tweets, 10,646 Twitter users, 27,309 retweets and 2,637 tweets with GPS coordinates. Tweet activity increased 5-fold during the response to the Hattiesburg Tornado. Retweeting activity increased 2.2-fold. Tweets with a hashtag increased 1.4-fold. Twitter was an effective disaster risk reduction tool for the 2013 Hattiesburg EF-4 Tornado.

  2. EucaTool®, a cloud computing application for estimating the growth and production of Eucalyptus globulus Labill. plantations in Galicia (NW Spain)

    Directory of Open Access Journals (Sweden)

    Alberto Rojo-Alboreca

    2015-12-01

    Full Text Available Aim of study: To present the software utilities and explain how to use EucaTool®, a free cloud computing application developed to estimate the growth and production of seedling and clonal blue gum (Eucalyptus globulus Labill.) plantations in Galicia (NW Spain). Area of study: Galicia (NW Spain). Material and methods: EucaTool® implements a dynamic growth and production model that is valid for clonal and non-clonal blue gum plantations in the region. The model integrates transition functions for dominant height (site index curves), number of stems per hectare (mortality function) and basal area, as well as output functions for tree and stand volume, biomass and carbon content. Main results: EucaTool® can be freely accessed from any device with an Internet connection, from http://app.eucatool.com. In addition, useful information about the application is published on a related website: http://www.eucatool.com. Research highlights: The application has been designed to enable forest stakeholders to estimate the volume, biomass and carbon content of forest plantations from individual trees, diameter classes or stand data, as well as to estimate growth and future production (indicating the optimal rotation age for maximum income) by measurement of only four stand variables: age, number of trees per hectare, dominant height and basal area. Keywords: forest management; biomass; seedling; clones; blue gum; forest tool.

  3. Testing the reliability of software tools in sex and ancestry estimation in a multi-ancestral Brazilian sample.

    Science.gov (United States)

    Urbanová, Petra; Ross, Ann H; Jurda, Mikoláš; Nogueira, Maria-Ines

    2014-09-01

    In the framework of forensic anthropology, osteometric techniques are generally preferred over visual examinations due to a higher level of reproducibility and repeatability; qualities that are crucial within a legal context. The use of osteometric methods has been further reinforced by incorporating statistically based algorithms and large reference samples in a variety of user-friendly software applications. However, the continued increase in admixture of human populations has made the use of osteometric methods for estimation of ancestry much more complex, which confounds one of the major requirements of ancestry assessment - intra-population homogeneity. The present paper tests the accuracy of ancestry and sex assessment using four identification software tools, specifically FORDISC 2.0, FORDISC 3.1.293, COLIPR 1.5.2 and 3D-ID 1.0. Software accuracy was tested in a sample of 174 documented human crania of Brazilian origin composed of different ancestral groups (i.e., European Brazilians, Afro-Brazilians, and Japanese Brazilians and of admixed ancestry). The results show that regardless of the software algorithm employed and the composition of the reference database, all methods were able to allocate approximately 50% of Brazilian specimens to an appropriate major reference group. Of the three ancestral groups, Afro-Brazilians were especially prone to misclassification. Japanese Brazilians, by contrast, were shown to be relatively easily recognizable as being of Asian descent but at the same time showed a strong affinity towards Hispanic crania, in particular when the classification based on the FDB was carried out in FORDISC. For crania of admixed origin, all of the algorithms showed a considerably higher rate of inconsistency, with a tendency for misclassification into Asian and American Hispanic groups. Sex assessments revealed an overall modest to poor reliability (60-71% of correctly classified specimens) using the tested software programs with unbalanced individual

  4. Using an ecosystem service decision support tool to support ridge to reef management: An example of sediment reduction in west Maui, Hawaii

    Science.gov (United States)

    Falinski, K. A.; Oleson, K.; Htun, H.; Kappel, C.; Lecky, J.; Rowe, C.; Selkoe, K.; White, C.

    2016-12-01

    Faced with anthropogenic stressors and declining coral reef states, managers concerned with the restoration and resilience of coral reefs are increasingly recognizing the need to take a ridge-to-reef, ecosystem-based approach. An ecosystem services framing can help managers move towards these goals, helping to illustrate the trade-offs and opportunities of management actions in terms of their impacts on society. We describe a research program that is building a spatial ecosystem-services-based decision-support tool and applying it to guide ridge-to-reef management in a NOAA priority site in West Maui. We use multiple modeling methods to link biophysical processes to ecosystem services and their spatial flows and social values in an integrating platform. Modeled services include water availability, sediment retention, nutrient retention and carbon sequestration on land. A coral reef ecosystem service model is under development to capture the linkages between terrestrial and coastal ecosystem services. Valuation studies are underway to quantify the implications for human well-being. The tool integrates techniques from decision science to facilitate decision making. We use the sediment retention model to illustrate the types of analyses the tool can support. The case study explores the trade-offs between road rehabilitation costs and sediment export avoided. We couple the sediment and cost models with trade-off analysis to identify optimal distributed solutions that are most cost-effective in reducing erosion, and then use those models to estimate sediment exposure to coral reefs. We find that cooperation between landowners reveals opportunities for maximizing the benefits of fixing roads and minimizing costs. This research forms the building blocks of an ecosystem service decision support tool that we intend to continue to test and apply in other Pacific Island settings.

  5. Analysis of Reduction in Area in MIMO Receivers Using SQRD Method and Unitary Transformation with Maximum Likelihood Estimation (MLE) and Minimum Mean Square Error Estimation (MMSE) Techniques

    Directory of Open Access Journals (Sweden)

    Sabitha Gauni

    2014-03-01

    Full Text Available In the field of wireless communication, there is always a demand for reliability, improved range and speed. Many wireless networks such as OFDM, CDMA2000, WCDMA etc. provide a solution to this problem when incorporated with Multiple Input Multiple Output (MIMO) technology. Due to the complexity of its signal processing, MIMO is highly expensive in terms of area consumption. In this paper, a method of MIMO receiver design is proposed to reduce the area consumed by the processing elements involved in complex signal processing, and a solution for area reduction in the MIMO Maximum Likelihood Estimation (MLE) receiver using the Sorted QR Decomposition (SQRD) and unitary transformation methods is analyzed. It provides a unified approach, reduces ISI and gives better performance at low cost. The receiver pre-processor architecture based on Minimum Mean Square Error (MMSE) estimation is compared while using iterative SQRD and the unitary transformation method for vectoring. Unitary transformations are transformations of matrices which maintain the Hermitian nature of the matrix, and the multiplication and addition relationships between the operators. This helps to reduce the computational complexity significantly. The dynamic range of all variables is tightly bound and the algorithm is well suited for fixed-point arithmetic.
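
    The sorted QR decomposition referred to above can be sketched in a few lines of NumPy: columns of the channel matrix are orthogonalized in an order chosen by their residual norms, which improves error propagation in successive interference cancellation. This is a plain textbook SQRD on a random channel, not the paper's fixed-point hardware architecture.

```python
import numpy as np

def sorted_qrd(H):
    """Return Q, R and the detection order (permutation) for channel matrix H."""
    W = H.astype(complex).copy()
    m, n = W.shape
    order = list(range(n))
    Q, R = np.zeros((m, n), complex), np.zeros((n, n), complex)
    for k in range(n):
        # choose the remaining column with the smallest residual norm;
        # it is orthogonalized first and therefore detected last in back-substitution
        norms = [np.linalg.norm(W[:, j]) for j in range(k, n)]
        j = k + int(np.argmin(norms))
        W[:, [k, j]] = W[:, [j, k]]
        order[k], order[j] = order[j], order[k]
        R[:k, [k, j]] = R[:k, [j, k]]
        # modified Gram-Schmidt step
        R[k, k] = np.linalg.norm(W[:, k])
        Q[:, k] = W[:, k] / R[k, k]
        for j2 in range(k + 1, n):
            R[k, j2] = Q[:, k].conj() @ W[:, j2]
            W[:, j2] -= R[k, j2] * Q[:, k]
    return Q, R, order

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))   # random 4x4 MIMO channel
Q, R, order = sorted_qrd(H)
print("detection order:", order)
print("reconstruction error:", np.linalg.norm(Q @ R - H[:, order]))
```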

  6. The estimated effect of mass or footprint reduction in recent light-duty vehicles on U.S. societal fatality risk per vehicle mile traveled.

    Science.gov (United States)

    Wenzel, Tom

    2013-10-01

    The National Highway Traffic Safety Administration (NHTSA) recently updated its 2003 and 2010 logistic regression analyses of the effect of a reduction in light-duty vehicle mass on US societal fatality risk per vehicle mile traveled (VMT; Kahane, 2012). Societal fatality risk includes the risk to the occupants of the case vehicle as well as to any crash partner or pedestrians. The current analysis is the most thorough investigation of this issue to date. This paper replicates the Kahane analysis and extends it by testing the sensitivity of his results to changes in the definition of risk, and the data and control variables used in the regression models. An assessment by Lawrence Berkeley National Laboratory (LBNL) indicates that the estimated effect of mass reduction on risk is smaller than in Kahane's previous studies, and is statistically non-significant for all but the lightest cars (Wenzel, 2012a). The estimated effects of a reduction in mass or footprint (i.e. wheelbase times track width) are small relative to other vehicle, driver, and crash variables used in the regression models. The recent historical correlation between mass and footprint is not so large as to prohibit including both variables in the same regression model; excluding footprint from the model, i.e. allowing footprint to decrease with mass, increases the estimated detrimental effect of mass reduction on risk in cars and crossover utility vehicles (CUVs)/minivans, but has virtually no effect on light trucks. Analysis by footprint deciles indicates that risk does not consistently increase with reduced mass for vehicles of similar footprint. Finally, the estimated effects of mass and footprint reduction are sensitive to the measure of exposure used (fatalities per induced exposure crash, rather than per VMT), as well as other changes in the data or control variables used. It appears that the safety penalty from lower mass can be mitigated with careful vehicle design, and that manufacturers can
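
    For readers unfamiliar with the setup, the following sketch fits a logistic regression of fatality outcome on mass reduction, footprint and a control variable using synthetic data; the NHTSA/LBNL analyses use FARS crash records, induced-exposure data and many more control variables, so nothing below reproduces their coefficients.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic crash sample: curb weight (kg), footprint (m^2, correlated with mass), driver age.
rng = np.random.default_rng(42)
n = 20000
mass = rng.normal(1500, 250, n)
footprint = 3.0 + 0.0015 * mass + rng.normal(0, 0.2, n)
driver_age = rng.uniform(18, 80, n)

# Synthetic "true" risk: fatality odds rise slightly as mass falls and with driver age.
logit = -6.0 - 0.0008 * (mass - 1500) + 0.02 * (driver_age - 40)
fatal = rng.random(n) < 1 / (1 + np.exp(-logit))

# Express mass reduction in units of 100 lb (45.4 kg) below the fleet mean, as in such analyses.
X = np.column_stack([(1500 - mass) / 45.4, footprint, driver_age])
model = LogisticRegression(max_iter=1000).fit(X, fatal)
print("coefficients for 100-lb mass reduction, footprint, driver age:", model.coef_[0])
```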

  7. Estimating the Horizon of articles to decide when to stop searching in systematic reviews: an example using a systematic review of RCTs evaluating osteoporosis clinical decision support tools.

    Science.gov (United States)

    Kastner, Monika; Straus, Sharon; Goldsmith, Charlie H

    2007-10-11

    Researchers conducting systematic reviews need to search multiple bibliographic databases such as MEDLINE and EMBASE. However, researchers have no rational stopping rule for deciding when to stop searching for potentially relevant articles. We empirically tested a stopping rule based on the concept of capture-mark-recapture (CMR), which was pioneered in ecology. The principles of CMR can be adapted to systematic reviews and meta-analyses to estimate the Horizon of articles in the literature, with its confidence interval. We retrospectively tested this Horizon Estimation using a systematic review of randomized controlled trials (RCTs) that evaluated clinical decision support tools for osteoporosis disease management. The Horizon Estimation was calculated based on 4 bibliographic databases that were included as the main data sources for the review, in the following order: MEDLINE, EMBASE, CINAHL, and EBM Reviews. The systematic review captured 68% of known articles from the 4 data sources, which meant that an estimated 592 articles from the Horizon were missed.
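
    The capture-mark-recapture idea behind the Horizon Estimation can be illustrated with the Chapman-corrected Lincoln-Petersen estimator, treating two database searches as two capture occasions; the counts below are hypothetical, and the paper's actual procedure uses four databases.

```python
# Two-occasion capture-mark-recapture sketch (Chapman correction of Lincoln-Petersen).
def chapman_estimate(n1, n2, m):
    """Estimated total ('Horizon') of relevant articles.

    n1 = relevant articles found by search 1, n2 = by search 2,
    m  = articles found by both (the 'recaptured' ones).
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

found_db1, found_db2, overlap = 310, 280, 205   # hypothetical counts
horizon = chapman_estimate(found_db1, found_db2, overlap)
unique_found = found_db1 + found_db2 - overlap
print(f"estimated Horizon ≈ {horizon:.0f} articles; "
      f"captured ≈ {unique_found / horizon:.0%} of the Horizon")
```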

  8. Dynamic modelling and humus balances as tools for estimating and upscaling soil carbon stock changes in temperate cropland

    Science.gov (United States)

    Oberholzer, Hans-Rudolf; Holenstein, Hildegard; Mayer, Jochen; Leifeld, Jens

    2010-05-01

    Humus balances are simple mathematical tools used by farmers for assessing the overall performance of their management in terms of soil organic matter changes. They are based on humus reproduction factors, which themselves mainly depend on crop rotation, residue management, and the amount and type of organic fertilization. Dynamic models, on the other hand, are typically complex, need more detailed input data and are designed to calculate the time course of soil carbon content. In both cases, thorough validation is needed to utilize their potential for estimating carbon stock changes. We compared the results of three humus balance methods, SALCA-SQ (Neyroud 1997), the VDLUFA method (VDLUFA 2004) and Humod (Brock et al. 2008), and of the RothC model with measured soil carbon stocks in a long-term experiment in Switzerland for the period 1977-2005 (Fliessbach et al. 2007). The field trial comprises various minerally and organically fertilized treatments, the latter differing in the amount and composition of the organics applied. All methods were able to distinguish systematic management effects on soil organic carbon (SOC). However, only those SOC trajectories calculated with the dynamic model RothC matched measured stocks quantitatively. For both humus balances and dynamic modelling, the result strongly depended on the parameterization of organic fertilizers, i.e. their stability and organic matter content. Therefore, incomplete information on the amount and composition of organic fertilizer and lack of knowledge about its potential for humus reproduction is regarded as an uncertainty in both dynamic modelling and humus balance calculation, and seems to be a major drawback for the reliable application of these approaches at the regional scale. Our results stress the need for more detailed and harmonized databases of organic fertilizer composition and application rates. References: Brock C., Hoyer U., Leithold G., Hülsbergen K.-J., 2008. Entwicklung einer praxisanwendbaren Methode der

  9. Calliphora vicina (Diptera: Calliphoridae) pupae: a timeline of external morphological development and a new age and PMI estimation tool.

    Science.gov (United States)

    Brown, Katherine; Thorne, Alan; Harvey, Michelle

    2015-07-01

    The minimum postmortem interval (PMI(min)) is commonly estimated using calliphorid larvae, for which there are established age estimation methods based on morphological and development data. Despite the increased duration and sedentary nature of the pupal stage of the blowfly, morphological age estimation methods are poorly documented and infrequently used for PMI determination. The aim of this study was to develop a timeline of metamorphosis, focusing on the development of external morphology (within the puparium), to provide a means of age and PMI estimation for Calliphora vicina (Rob-Desvoidy) pupae. Under controlled conditions, 1,494 pupae were reared and sampled at regular time intervals. After puparium removal, observations of 23 external metamorphic developments were correlated to age in accumulated degree hours (ADH). Two age estimation methods were developed based on (1) the combination of possible age ranges observed for each characteristic and (2) regression analyses to generate age estimation equations employing all 23 characteristics observed and a subset of ten characteristics most significantly correlated with age. Blind sample analysis indicated that, using the combination of both methods, pupal age could be estimated to within ±500 ADH with 95% reliability.
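
    Method (2) above amounts to regressing age in accumulated degree hours on the scored characteristics; the sketch below does this with synthetic scores and ages, so the residual spread is only a stand-in for the paper's ±500 ADH blind-test result.

```python
import numpy as np

# Synthetic pupae: each external characteristic "switches on" progressively with age (ADH).
rng = np.random.default_rng(7)
n_pupae, n_chars = 120, 10
true_age = rng.uniform(0, 6000, n_pupae)                       # age in accumulated degree hours
thresholds = np.linspace(500, 5500, n_chars)                   # hypothetical onset ages per character
scores = (true_age[:, None] > thresholds[None, :]).astype(float)
scores += rng.normal(0, 0.1, scores.shape)                     # scoring noise

# Ordinary least-squares age-estimation equation: age ~ intercept + scored characters.
X = np.column_stack([np.ones(n_pupae), scores])
coef, *_ = np.linalg.lstsq(X, true_age, rcond=None)
pred = X @ coef
print(f"residual SD ≈ {np.std(true_age - pred):.0f} ADH "
      f"(the paper reports ±500 ADH at 95% reliability on real data)")
```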

  10. Property Estimation of Functionally Graded Materials Between M2 Tool Steel and Cu Fabricated by Powder Metallurgy

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jong-Seol; Shin, Ki-Hoon [Seoul National University of Science and Technology, Seoul (Korea, Republic of)

    2014-09-15

    The use of functionally graded materials (FGMs) may enhance thermal conductivity without reducing the desired strength in many applications such as injection molds embedding conformal cooling channels and cutting tools with heat sinks (or cooling devices). As a fundamental study for cutting tools having FGM heat sinks between M2 tool steel and Cu, six FGM specimens (M2 and Cu powders were premixed such that the relative compositions of M2 and Cu were 100:0, 80:20, 60:40, 40:60, 20:80, and 0:100 wt%) were fabricated by powder metallurgy in this study. The cross sections of these specimens were observed by optical microscopy, and then the material properties (such as thermal conductivity, specific heat, and coefficient of thermal expansion) related to heat transfer were measured and analyzed.

  11. New Applications of Gamma Spectroscopy: Characterization Tools for D&D Process Development, Inventory Reduction Planning & Shipping, Safety Analysis & Facility Management During the Heavy Element Facility Risk Reduction Program

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, M; Anderson, B; Gray, L; Vellinger, R; West, M; Gaylord, R; Larson, J; Jones, G; Shingleton, J; Harris, L; Harward, N

    2006-01-23

    Novel applications of gamma ray spectroscopy for D&D process development, inventory reduction, safety analysis and facility management are discussed in this paper. These applications of gamma spectroscopy were developed and implemented during the Risk Reduction Program (RPP) to successfully downgrade the Heavy Element Facility (B251) at Lawrence Livermore National Laboratory (LLNL) from a Category II Nuclear Facility to a Radiological Facility. Non-destructive assay in general, and gamma spectroscopy in particular, was found to be an important tool in project management, work planning and work control ("Expect the unexpected and confirm the expected"), and in minimizing worker dose, and resulted in significant safety improvements and operational efficiencies. Inventory reduction activities utilized gamma spectroscopy to identify and confirm isotopics of legacy inventory, ingrowth of daughter products and the presence of process impurities; quantify inventory; prioritize work activities for project management; and supply information to satisfy shipper/receiver documentation requirements. D&D activities utilized in-situ gamma spectroscopy to identify and confirm isotopics of legacy contamination; quantify contamination levels and monitor the progress of decontamination efforts; and determine the point of diminishing returns in decontaminating enclosures and glove boxes containing high specific activity isotopes such as {sup 244}Cm and {sup 238}Pu. In-situ gamma spectroscopy provided quantitative comparisons of several decontamination techniques (e.g. TLC-free Stripcoat{trademark}, Radiac{trademark} wash, acid wash, scrubbing) and was used as part of an iterative process to determine the appropriate level of decontamination and the optimal cost-to-benefit ratio. Facility management followed a formal, rigorous process utilizing an independent, state certified, peer-reviewed gamma spectroscopy program, in conjunction with other characterization techniques

  12. Uncertainty reduction and parameters estimation of a distributed hydrological model with ground and remote sensing data

    Science.gov (United States)

    Silvestro, F.; Gabellani, S.; Rudari, R.; Delogu, F.; Laiolo, P.; Boni, G.

    2014-06-01

    During the last decade the opportunity and usefulness of using remote sensing data in hydrology, hydrometeorology and geomorphology has become increasingly evident. Satellite-based products often provide the advantage of observing hydrologic variables in a distributed way while offering a different view that can help to understand and model the hydrological cycle. Moreover, remote sensing data are fundamental in scarce data environments. The use of satellite-derived DTMs, which are globally available (e.g., from SRTM, as used in this work), has become standard practice in hydrologic model implementation, but other types of satellite-derived data are still underutilized. In this work, Meteosat Second Generation Land Surface Temperature (LST) estimates and Surface Soil Moisture (SSM) available from EUMETSAT H-SAF are used to calibrate the Continuum hydrological model that computes such state variables in a prognostic mode. This work aims at proving that satellite observations dramatically reduce uncertainties in parameter calibration by reducing their equifinality. Two parameter estimation strategies are implemented and tested: a multi-objective approach that includes ground observations and one solely based on remotely sensed data. Two Italian catchments are used as the test bed to verify the model capability in reproducing long-term (multi-year) simulations.

  13. The Massachusetts Sustainable-Yield Estimator: A decision-support tool to assess water availability at ungaged stream locations in Massachusetts

    Science.gov (United States)

    Archfield, Stacey A.; Vogel, Richard M.; Steeves, Peter A.; Brandt, Sara L.; Weiskel, Peter K.; Garabedian, Stephen P.

    2010-01-01

    Federal, State and local water-resource managers require a variety of data and modeling tools to better understand water resources. The U.S. Geological Survey, in cooperation with the Massachusetts Department of Environmental Protection, has developed a statewide, interactive decision-support tool to meet this need. The decision-support tool, referred to as the Massachusetts Sustainable-Yield Estimator (MA SYE) provides screening-level estimates of the sustainable yield of a basin, defined as the difference between the unregulated streamflow and some user-specified quantity of water that must remain in the stream to support such functions as recreational activities or aquatic habitat. The MA SYE tool was designed, in part, because the quantity of surface water available in a basin is a time-varying quantity subject to competing demands for water. To compute sustainable yield, the MA SYE tool estimates a daily time series of unregulated, daily mean streamflow for a 44-year period of record spanning October 1, 1960, through September 30, 2004. Selected streamflow quantiles from an unregulated, daily flow-duration curve are estimated by solving six regression equations that are a function of physical and climate basin characteristics at an ungaged site on a stream of interest. Streamflow is then interpolated between the estimated quantiles to obtain a continuous daily flow-duration curve. A time series of unregulated daily streamflow subsequently is created by transferring the timing of the daily streamflow at a reference streamgage to the ungaged site by equating exceedence probabilities of contemporaneous flow at the two locations. One of 66 reference streamgages is selected by kriging, a geostatistical method, which is used to map the spatial relation among correlations between the time series of the logarithm of daily streamflows at each reference streamgage and the ungaged site. Estimated unregulated, daily mean streamflows show good agreement with observed
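
    The transfer step described above can be sketched as follows: quantiles estimated from the regression equations define a flow-duration curve at the ungaged site, and the timing of a reference streamgage is imposed by equating exceedance probabilities. The quantile values and plotting-position choice below are placeholders, not the MA SYE regression output.

```python
# Sketch: interpolate a flow-duration curve (FDC) from regression quantiles,
# then transfer the timing of a reference gage by equating exceedance
# probabilities. Quantile values are illustrative only.
import numpy as np
from scipy.interpolate import interp1d
from scipy.stats import rankdata

exceed_p = np.array([0.02, 0.10, 0.30, 0.50, 0.70, 0.90, 0.98])
q_ungaged = np.array([850., 320., 140., 80., 45., 20., 8.])   # ft^3/s, placeholder

# Continuous FDC via log-linear interpolation between the estimated quantiles
fdc = interp1d(exceed_p, np.log(q_ungaged), fill_value="extrapolate")

def transfer_daily_flows(ref_daily_flows):
    """Map each day's exceedance probability at the reference gage onto the
    ungaged-site FDC, preserving the reference record's timing."""
    n = len(ref_daily_flows)
    p_ref = 1.0 - (rankdata(ref_daily_flows) / (n + 1.0))   # Weibull plotting position
    return np.exp(fdc(p_ref))

# daily_unregulated = transfer_daily_flows(reference_gage_record)
```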

  14. Towards substrate-independent age estimation of blood stains based on dimensionality reduction and k-nearest neighbor classification of absorbance spectroscopic data.

    Science.gov (United States)

    Bergmann, Tommy; Heinke, Florian; Labudde, Dirk

    2017-09-01

    The age determination of blood traces provides important hints for the chronological assessment of criminal events and their reconstruction. Current methods are often expensive, involve significant experimental complexity and often fail to perform when being applied to aged blood samples taken from different substrates. In this work an absorption spectroscopy-based blood stain age estimation method is presented, which utilizes 400-640nm absorption spectra in computation. Spectral data from 72 differently aged pig blood stains (2h to three weeks) dried on three different substrate surfaces (cotton, polyester and glass) were acquired and the turnover-time correlations were utilized to develop a straightforward age estimation scheme. More precisely, data processing includes data dimensionality reduction, upon which classic k-nearest neighbor classifiers are employed. This strategy shows good agreement between observed and predicted blood stain age (r>0.9) in cross-validation. The presented estimation strategy utilizes spectral data from dissolved blood samples to bypass spectral artifacts which are well known to interfere with other spectral methods such as reflection spectroscopy. Results indicate that age estimations can be drawn from such absorbance spectroscopic data independent from substrate the blood dried on. Since data in this study was acquired under laboratory conditions, future work has to consider perturbing environmental conditions in order to assess real-life applicability. Copyright © 2017 Elsevier B.V. All rights reserved.
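
    A minimal sketch of the kind of pipeline described above, assuming PCA for the dimensionality reduction and a k-nearest-neighbour classifier over discrete age classes; the spectra, labels and hyperparameters below are random placeholders, and the study's exact reduction method and neighbour settings may differ.

```python
# Dimensionality reduction + kNN age classification of absorbance spectra.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((72, 120))          # 72 spectra sampled on a 400-640 nm grid (placeholder)
y = rng.integers(0, 4, size=72)    # placeholder age classes

model = make_pipeline(StandardScaler(),
                      PCA(n_components=5),
                      KNeighborsClassifier(n_neighbors=3))
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=3).mean())
```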

  15. Tidal Mixing Box Submodel for Tampa Bay: Calibration of Tidal Exchange Flows with the Parameter Estimation Tool (PEST)

    Science.gov (United States)

    In the mid-1990s the Tampa Bay Estuary Program proposed a nutrient reduction strategy focused on improving water clarity to promote seagrass expansion within Tampa Bay. A System Dynamics Model is being developed to evaluate spatially and temporally explicit impacts of nutrient r...

  16. STREET: Swedish Tool for Risk/Resource Estimation at EvenTs. Part one, risk assessment - face validity and inter-rater reliability

    Institute of Scientific and Technical Information of China (English)

    Andreas Berner; Tariq Saleem Alharbi; Eric Carlström; Amir Khorram-Manesh

    2015-01-01

    Objective: To develop a validated and generalized collaborative tool for high reliability organizations in order to conduct common assessments and information sharing of potential risks during mass gatherings. Methods: The Swedish resource and risk estimation guide was used as the foundation for the development of the generalized collaborative tool by three different expert groups, and then analyzed. Inter-rater reliability was analyzed through simulated cases and reported as weighted and unweighted κ-statistics. Results: The mean unweighted κ-value across the three cases was 0.37 and the mean accuracy of the tool was 62%. Conclusions: The collaboration tool, "STREET", showed acceptable reliability and validity to be used as a foundation for high reliability organization collaboration in a simulated environment. However, the lack of reliability in one of the cases highlights the challenges of creating measurable values from simulated cases. A study on real events could provide higher reliability but needs, on the other hand, an already developed tool.
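
    For reference, the unweighted Cohen's kappa reported above can be computed as in the sketch below; the two rater vectors are hypothetical risk ratings of the same simulated cases.

```python
# Unweighted Cohen's kappa: (observed agreement - chance agreement) / (1 - chance).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(freq_a) | set(freq_b)
    chance = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - chance) / (1.0 - chance)

# Hypothetical ratings of four simulated cases by two assessors
print(cohens_kappa(["low", "high", "medium", "high"],
                   ["low", "medium", "medium", "high"]))
```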

  17. Estimating Potential Reductions in Premature Mortality in New York City From Raising the Minimum Wage to $15.

    Science.gov (United States)

    Tsao, Tsu-Yu; Konty, Kevin J; Van Wye, Gretchen; Barbot, Oxiris; Hadler, James L; Linos, Natalia; Bassett, Mary T

    2016-06-01

    To assess potential reductions in premature mortality that could have been achieved in 2008 to 2012 if the minimum wage had been $15 per hour in New York City. Using the 2008 to 2012 American Community Survey, we performed simulations to assess how the proportion of low-income residents in each neighborhood might change with a hypothetical $15 minimum wage under alternative assumptions of labor market dynamics. We developed an ecological model of premature death to determine the differences between the levels of premature mortality as predicted by the actual proportions of low-income residents in 2008 to 2012 and the levels predicted by the proportions of low-income residents under a hypothetical $15 minimum wage. A $15 minimum wage could have averted 2800 to 5500 premature deaths between 2008 and 2012 in New York City, representing 4% to 8% of total premature deaths in that period. Most of these avertable deaths would be realized in lower-income communities, in which residents are predominantly people of color. A higher minimum wage may have substantial positive effects on health and should be considered as an instrument to address health disparities.

  18. Estimation of CO2 reduction by parallel hard-type power hybridization for gasoline and diesel vehicles.

    Science.gov (United States)

    Oh, Yunjung; Park, Junhong; Lee, Jong Tae; Seo, Jigu; Park, Sungwook

    2017-10-01

    The purpose of this study is to investigate possible improvements in ICEVs by implementing fuzzy logic-based parallel hard-type power hybrid systems. Two types of conventional ICEVs (gasoline and diesel) and two types of HEVs (gasoline-electric, diesel-electric) were generated using vehicle and powertrain simulation tools and a Matlab-Simulink application programming interface. For gasoline and gasoline-electric HEV vehicles, the prediction accuracy for four types of LDV models was validated by comparative analysis with chassis dynamometer and OBD test data. The predicted results show strong correlation with the test data. The operating points of the internal combustion engines and electric motors are well controlled in the high-efficiency region, and battery SOC was well controlled within ±1.6%. However, for diesel vehicles, we generated a virtual diesel-electric HEV because no available vehicle had engine and vehicle specifications similar to the ICE vehicle. Using a fuzzy logic-based parallel hybrid system in conventional ICEVs demonstrated that HEVs showed superior performance in terms of fuel consumption and CO2 emissions in most driving modes. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Numerical tools to estimate the flux of a gas across the air-water interface and assess the heterogeny of its forcing functions

    Directory of Open Access Journals (Sweden)

    V. M. N. de C. da S. Vieira

    2012-03-01

    A numerical tool was developed for the estimation of gas fluxes across the air-water interface. The primary objective is to use it to estimate CO2 fluxes. Nevertheless, application to other gases is easily accomplished by changing the values of the parameters related to the physical properties of the gases. User-friendly software was developed that allows a custom-made gas flux model with the preferred parameterizations to be built upon a standard kernel. These include single or double layer models; several numerical schemes for the effects of wind on the air-side and water-side transfer velocities; the effect of turbulence from current drag with the bottom; and the effects on solubility of water temperature, salinity, air temperature and pressure. An analysis was also developed which decomposes the difference between the fluxes in a reference situation and in alternative situations into its several forcing functions. This analysis relies on the Taylor expansion of the gas flux model, requiring the numerical estimation of partial derivatives by a multivariate version of the collocation polynomial. Both the flux model and the difference decomposition analysis were tested with data taken from surveys done in the lagoon system of Ria Formosa, southern Portugal, in which the CO2 fluxes were estimated using the IRGA and floating chamber method, whereas the CO2 concentrations were estimated using the IRGA and degasification chamber. Observations and estimations show a remarkable fit.

  20. Numerical tools to estimate the flux of a gas across the air–water interface and assess the heterogeneity of its forcing functions

    Directory of Open Access Journals (Sweden)

    V. M. N. C. S. Vieira

    2013-03-01

    A numerical tool was developed for the estimation of gas fluxes across the air–water interface. The primary objective is to use it to estimate CO2 fluxes. Nevertheless, application to other gases is easily accomplished by changing the values of the parameters related to the physical properties of the gases. A user-friendly software was developed that allows a custom-made gas flux model with the preferred parameterizations to be built upon a standard kernel. These include single or double layer models; several numerical schemes for the effects of wind in the air-side and water-side transfer velocities; the effects of atmospheric stability, surface roughness and turbulence from current drag with the bottom; and the effects on solubility of water temperature, salinity, air temperature and pressure. An analysis was also developed which decomposes the difference between the fluxes in a reference situation and in alternative situations into its several forcing functions. This analysis relies on the Taylor expansion of the gas flux model, requiring the numerical estimation of partial derivatives by a multivariate version of the collocation polynomial. Both the flux model and the difference decomposition analysis were tested with data taken from surveys done in the lagoon system of Ria Formosa, south Portugal, in which the CO2 fluxes were estimated using the infrared gas analyzer (IRGA) and floating chamber method, whereas the CO2 concentrations were estimated using the IRGA and degasification chamber. Observations and estimations show a remarkable fit.
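
    A minimal single-layer flux calculation of the kind such a tool implements might look like the sketch below, using the Wanninkhof (1992) quadratic wind-speed parameterization as one example of the many transfer-velocity schemes on offer; solubility handling is simplified and all numbers are illustrative.

```python
# Single-layer air-water CO2 flux sketch: wind-based transfer velocity times
# the air-water concentration difference. Positive flux = outgassing.

def k660_wanninkhof(u10):
    """Gas transfer velocity (cm/h) normalized to a Schmidt number of 660."""
    return 0.31 * u10 ** 2

def co2_flux(u10, schmidt, pco2_water_uatm, pco2_air_uatm, solubility_mol_l_atm):
    """Flux in mol m^-2 h^-1."""
    k_cm_h = k660_wanninkhof(u10) * (schmidt / 660.0) ** -0.5
    k_m_h = k_cm_h / 100.0
    delta_co2 = solubility_mol_l_atm * (pco2_water_uatm - pco2_air_uatm) * 1e-6  # mol/L
    return k_m_h * delta_co2 * 1000.0   # mol/L -> mol/m^3

# Example: moderately windy day over supersaturated lagoon water (illustrative values)
print(co2_flux(u10=6.0, schmidt=900.0, pco2_water_uatm=650.0,
               pco2_air_uatm=400.0, solubility_mol_l_atm=0.035))
```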

  1. Estimating energy intensity and CO{sub 2} emission reduction potentials in the manufacturing sectors in Thailand

    Energy Technology Data Exchange (ETDEWEB)

    Wangskarn, P.; Khummongkol, P.; Schrattenholzer, L. [and others]

    1996-12-31

    The final energy consumption in Thailand increased at about ten percent annually over the last 10 years. To slow the energy demand growth rate while maintaining the country's economic advance and environmental sustainability, the Energy Conservation Promotion Act (ECPA) was adopted in 1992. With this Act, a comprehensive Energy Conservation Program (ENCON) was initiated. ENCON commits the government to promoting energy conservation, to developing appropriate regulations, and to providing financial and organizational resources for program implementation. Through the existing ENCON program, substantial benefits are expected not only in reducing energy consumption but also in decreasing GHG emissions. This study is a part of the ENCON research program, which was supported by the German Federal Government under the program called Prompt-Start Measures to Implement the U.N. Framework Convention on Climate Change (FCCC). The basic activities carried out during the project included (1) an assessment of Thailand's total and specific energy consumption in the industrial sectors and commercial buildings; (2) identification of existing and candidate technologies for GHG emission reduction and energy efficiency improvements in specific factories and commercial buildings; and (3) identification of individual factories and commercial buildings as candidates for detailed further study. Although the energy assessment was also carried out for commercial buildings, this paper covers only the work on the manufacturing sector. On the basis of these steps, 14 factories were visited by the project team and preliminary energy audits were performed. As a result, concrete measures and investments were proposed and classified into two groups according to their economic characteristics. Those investments with a payback time of less than four years were considered together in a Moderate scenario, and those with longer payback times in an Intensive scenario.

  2. Uncertainty reduction and parameter estimation of a distributed hydrological model with ground and remote-sensing data

    Science.gov (United States)

    Silvestro, F.; Gabellani, S.; Rudari, R.; Delogu, F.; Laiolo, P.; Boni, G.

    2015-04-01

    During the last decade the opportunity and usefulness of using remote-sensing data in hydrology, hydrometeorology and geomorphology has become even more evident and clear. Satellite-based products often allow for the advantage of observing hydrologic variables in a distributed way, offering a different view with respect to traditional observations that can help with understanding and modeling the hydrological cycle. Moreover, remote-sensing data are fundamental in scarce data environments. The use of satellite-derived digital elevation models (DEMs), which are now globally available at 30 m resolution (e.g., from Shuttle Radar Topographic Mission, SRTM), have become standard practice in hydrologic model implementation, but other types of satellite-derived data are still underutilized. As a consequence there is the need for developing and testing techniques that allow the opportunities given by remote-sensing data to be exploited, parameterizing hydrological models and improving their calibration. In this work, Meteosat Second Generation land-surface temperature (LST) estimates and surface soil moisture (SSM), available from European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) H-SAF, are used together with streamflow observations (S. N.) to calibrate the Continuum hydrological model that computes such state variables in a prognostic mode. The first part of the work aims at proving that satellite observations can be exploited to reduce uncertainties in parameter calibration by reducing the parameter equifinality that can become an issue in forecast mode. In the second part, four parameter estimation strategies are implemented and tested in a comparative mode: (i) a multi-objective approach that includes both satellite and ground observations which is an attempt to use different sources of data to add constraints to the parameters; (ii and iii) two approaches solely based on remotely sensed data that reproduce the case of a scarce data

  3. Reduction of uncertainty for estimating runoff with the NRCS CN model by the adaptation to local climatic conditions

    Science.gov (United States)

    Durán-Barroso, Pablo; González, Javier; Valdés, Juan B.

    2016-04-01

    Rainfall-runoff quantification is one of the most important tasks in both engineering and watershed management, as it allows watershed response to be identified, forecast and explained. For that purpose, the Natural Resources Conservation Service Curve Number method (NRCS CN) is the most widely recognized conceptual lumped model in the field of rainfall-runoff estimation. Furthermore, there is still an ongoing discussion about the procedure to determine the portion of rainfall retained in the watershed before runoff is generated, called the initial abstraction. This quantity is computed as a ratio (λ) of the potential maximum soil retention S of the watershed. Initially, this ratio was assumed to be 0.2, but it has later been proposed to modify it to 0.05. However, the existing procedures to convert NRCS CN model parameters obtained under a different hypothesis about λ do not incorporate any adaptation to the climatic conditions of each watershed. For this reason, we propose a new, simple method for computing model parameters that is adapted to local conditions by taking regional climate patterns into account. After checking the performance of this procedure against the existing ones in 34 different watersheds located in Ohio and Texas (United States), we concluded that this novel methodology represents the most accurate and efficient alternative for refitting the initial abstraction ratio.
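
    For context, the NRCS CN runoff equation with an explicit initial abstraction ratio λ can be sketched as below. Note that, as the abstract stresses, moving from λ = 0.20 to λ = 0.05 also requires refitting the retention parameter (or CN); using the same CN for both ratios here is only for illustration.

```python
# NRCS CN runoff equation with Ia = lambda * S; depths in mm.

def potential_retention(cn):
    """Maximum potential retention S (mm) from the curve number."""
    return 25400.0 / cn - 254.0

def runoff(p_mm, cn, lam=0.20):
    """Direct runoff Q (mm) for storm depth P: Q = (P - Ia)^2 / (P - Ia + S)."""
    s = potential_retention(cn)
    ia = lam * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Same 60 mm storm on a CN = 75 watershed under the two initial-abstraction ratios
print(runoff(60.0, 75, lam=0.20), runoff(60.0, 75, lam=0.05))
```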

  4. A joint resonance frequency estimation and in-band noise reduction method for enhancing the detectability of bearing fault signals

    Science.gov (United States)

    Bozchalooi, I. Soltani; Liang, Ming

    2008-05-01

    The vibration signal measured from a bearing contains vital information for the prognostic and health assessment purposes. However, when bearings are installed as part of a complex mechanical system, the measured signal is often heavily clouded by various noises due to the compounded effect of interferences of other machine elements and background noises present in the measuring device. As such, reliable condition monitoring would not be possible without proper de-noising. This is particularly true for incipient bearing faults with very weak signature signals. A new de-noising scheme is proposed in this paper to enhance the vibration signals acquired from faulty bearings. This de-noising scheme features a spectral subtraction to trim down the in-band noise prior to wavelet filtering. The Gabor wavelet is used in the wavelet transform and its parameters, i.e., scale and shape factor are selected in separate steps. The proper scale is found based on a novel resonance estimation algorithm. This algorithm makes use of the information derived from the variable shaft rotational speed though such variation is highly undesirable in fault detection since it complicates the process substantially. The shape factor value is then selected by minimizing a smoothness index. This index is defined as the ratio of the geometric mean to the arithmetic mean of the wavelet coefficient moduli. De-noising results are presented for simulated signals and experimental data acquired from both normal and faulty bearings with defective outer race, inner race, and rolling element.
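
    The smoothness index used for shape-factor selection, defined above as the ratio of the geometric mean to the arithmetic mean of the wavelet coefficient moduli, can be sketched as follows; the wavelet-filtering routine and the candidate shape factors are placeholders supplied by the user.

```python
# Smoothness index and shape-factor selection sketch: a smaller index indicates
# a more impulsive (fault-like) filtered signal, so the minimizing shape factor
# is preferred.
import numpy as np

def smoothness_index(coefficient_moduli):
    x = np.asarray(coefficient_moduli, dtype=float)
    x = x[x > 0]                                  # geometric mean needs positive values
    geometric_mean = np.exp(np.mean(np.log(x)))
    arithmetic_mean = np.mean(x)
    return geometric_mean / arithmetic_mean

def select_shape_factor(signal, scale, candidate_shape_factors, gabor_filter):
    """Pick the shape factor whose filtered signal has the smallest index.
    `gabor_filter(signal, scale, shape)` must return the complex wavelet
    coefficients at the chosen scale (implementation not shown here)."""
    indices = {s: smoothness_index(np.abs(gabor_filter(signal, scale, s)))
               for s in candidate_shape_factors}
    return min(indices, key=indices.get)
```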

  5. Comparison of tobacco control scenarios: quantifying estimates of long-term health impact using the DYNAMO-HIA modeling tool.

    Science.gov (United States)

    Kulik, Margarete C; Nusselder, Wilma J; Boshuizen, Hendriek C; Lhachimi, Stefan K; Fernández, Esteve; Baili, Paolo; Bennett, Kathleen; Mackenbach, Johan P; Smit, H A

    2012-01-01

    There are several types of tobacco control interventions/policies which can change future smoking exposure. The most basic intervention types are: 1) smoking cessation interventions, 2) prevention of smoking initiation, and 3) implementation of a nationwide policy affecting quitters and starters simultaneously. The possibility for dynamic quantification of such different interventions is key for comparing the timing and size of their effects. We developed a software tool, DYNAMO-HIA, which allows for a quantitative comparison of the health impact of different policy scenarios. We illustrate the outcomes of the tool for the three typical types of tobacco control interventions if these were applied in the Netherlands. The tool was used to model the effects of different types of smoking interventions on future smoking prevalence and on health outcomes, comparing these three scenarios with the business-as-usual scenario. The necessary data input was obtained from the DYNAMO-HIA database which was assembled as part of this project. All smoking interventions will be effective in the long run. The population-wide strategy will be most effective in both the short and long term. The smoking cessation scenario will be second-most effective in the short run, though in the long run the smoking initiation scenario will become almost as effective. Interventions aimed at preventing the initiation of smoking need a long time horizon to become manifest in terms of health effects. The outcomes strongly depend on the groups targeted by the intervention. We calculated how much more effective the population-wide strategy is, in both the short and long term, compared to quit smoking interventions and measures aimed at preventing the initiation of smoking. By allowing a great variety of user-specified choices, the DYNAMO-HIA tool is a powerful instrument by which the consequences of different tobacco control policies and interventions can be assessed.

  6. HydrogeoSieveXL: an Excel-based tool to estimate hydraulic conductivity from grain-size analysis

    Science.gov (United States)

    Devlin, J. F.

    2015-06-01

    For over a century, hydrogeologists have estimated hydraulic conductivity (K) from grain-size distribution curves. The benefits of the practice are simplicity, cost, and a means of identifying spatial variations in K. Many techniques have been developed over the years, but all suffer from similar shortcomings: no accounting of heterogeneity within samples (i.e., aquifer structure is lost), loss of grain packing characteristics, and failure to account for the effects of overburden pressure on K. In addition, K estimates can vary by an order of magnitude between the various methods, and it is not generally possible to identify the best method for a given sample. The drawbacks are serious, but the advantages have seen the use of grain-size distribution curves for K estimation continue, often using a single selected method to estimate K in a given project. In most cases, this restriction results from convenience. It is proposed here that extending the analysis to include several methods would be beneficial since it would provide a better indication of the range of K that might apply. To overcome the convenience limitation, an Excel-based spreadsheet program, HydrogeoSieveXL, is introduced here. HydrogeoSieveXL is a freely available program that calculates K from grain-size distribution curves using 15 different methods. HydrogeoSieveXL was found to calculate K values essentially identical to those reported in the literature, using the published grain-size distribution curves.
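
    As one example of the grain-size methods such a spreadsheet bundles, the Hazen approximation estimates K from the d10 grain size; the empirical coefficient, unit handling and validity limits are simplified in the sketch below, and the sieve data are placeholders.

```python
# Hazen-style estimate of hydraulic conductivity from a sieve curve.
import numpy as np

def d10_from_sieve(diameters_mm, percent_finer):
    """Interpolate the d10 grain size (mm); `percent_finer` must increase with diameter."""
    return float(np.interp(10.0, percent_finer, diameters_mm))

def hazen_k(d10_mm, c=100.0):
    """Hydraulic conductivity in cm/s from K ~ C * d10^2 (d10 in cm)."""
    d10_cm = d10_mm / 10.0
    return c * d10_cm ** 2

# Example sieve curve of a medium sand (placeholder data)
diam = [0.075, 0.15, 0.30, 0.60, 1.18, 2.36]      # mm
finer = [3.0, 12.0, 35.0, 68.0, 90.0, 99.0]       # % passing
print(hazen_k(d10_from_sieve(diam, finer)), "cm/s")
```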

  7. Commissioning the neutron production of a Linac: Development of a simple tool for second cancer risk estimation

    Energy Technology Data Exchange (ETDEWEB)

    Romero-Expósito, M., E-mail: mariateresa.romero@uab.cat [Departamento de Fisiología Médica y Biofísica, Universidad de Sevilla, Sevilla 41009, Spain and Departament de Física, Universitat Autònoma de Barcelona, Bellaterra 08193 (Spain); Sánchez-Nieto, B. [Instituto de Física, Pontificia Universidad Católica de Chile, Santiago 4880 (Chile); Terrón, J. A. [Servicio de Radiofísica, Hospital Universitario Virgen Macarena, Sevilla 41009 (Spain); Lopes, M. C. [Serviço de Física Médica, Instituto Português de Oncologia, Coimbra 3000-075 (Portugal); Ferreira, B. C. [i3N, Department of Physics, University of Aveiro, Aveiro 3810-193 (Portugal); Grishchuk, D. [Radiotherapy Service, Russian Research Center for Radiology and Surgical Technology, Saint Petersburg 197758 (Russian Federation); Sandín, C. [Elekta, Ltd., Crawley RH10 9RR (United Kingdom); Moral-Sánchez, S. [Servicio de Radiofísica, Instituto Onkologikoa, San Sebastián 20014 (Spain); Melchor, M. [Servicio de Radiofísica, Hospital Universitario de la Ribera, Alzira 46600, Valencia (Spain); Domingo, C. [Departament de Física, Universitat Autònoma de Barcelona, Bellaterra 08193 (Spain); and others

    2015-01-15

    Purpose: Knowing the contribution of neutrons to collateral effects in treatments is both a complex and a mandatory task. This work aims to present an operative procedure for neutron estimates in any facility using a neutron digital detector. Methods: The authors’ previous work established a linear relationship between the total second cancer risk due to neutrons (TR{sup n}) and the number of MU of the treatment. Given that the digital detector also presents linearity with MU, its response can be used to determine the TR{sup n} per unit MU, denoted as m, normally associated with a generic Linac model and radiotherapy facility. Thus, from the number of MU of each patient treatment, the associated risk can be estimated. The feasibility of the procedure was tested by applying it in eight facilities; patients were evaluated as well. Results: From the reading of the detector under selected irradiation conditions, m values were obtained for different machines, ranging from 0.25 × 10{sup −4}% per MU for an Elekta Axesse at 10 MV to 6.5 × 10{sup −4}% per MU for a Varian Clinac at 18 MV. Using these values, TR{sup n} of patients was estimated in each facility and compared to that from the individual evaluation. Differences were within the range of uncertainty of the authors’ methodology of equivalent dose and risk estimations. Conclusions: The procedure presented here allows an easy estimation of the second cancer risk due to neutrons for any patient, given the number of MU of the treatment. It will enable the consideration of this information when selecting the optimal treatment for a patient by its implementation in the treatment planning system.
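
    The resulting bookkeeping is deliberately simple: once the facility-specific factor m has been derived from the detector reading, a patient's neutron-induced second cancer risk follows from the delivered monitor units. The sketch below reuses the orders of magnitude quoted in the abstract purely as placeholders, and the MU count is hypothetical.

```python
# Second cancer risk from neutrons: facility-specific m (% per MU) times MU.
M_PER_MU = {
    "elekta_axesse_10MV": 0.25e-4,   # % per MU (placeholder, from the abstract's range)
    "varian_clinac_18MV": 6.5e-4,    # % per MU (placeholder, from the abstract's range)
}

def neutron_second_cancer_risk(machine, monitor_units):
    """Total neutron-induced second cancer risk (%) for one treatment course."""
    return M_PER_MU[machine] * monitor_units

# Hypothetical 5,000 MU treatment course on each machine
for name in M_PER_MU:
    print(name, neutron_second_cancer_risk(name, 5_000), "%")
```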

  8. STREET: Swedish Tool for Risk/Resource Estimation at EvenTs. Part two, resource assessment - face validity and inter-rater reliability

    Institute of Scientific and Technical Information of China (English)

    Andreas Berner; Tariq Saleem Alharbi; Eric Carlström; Amir Khorram-Manesh

    2015-01-01

    Objective: To develop a validated and generalized collaborative tool to be utilized by high reliability organizations in order to conduct common resource assessment before major events and mass gatherings. Methods: The Swedish resource and risk estimation guide was used as the foundation for the development of the generalized collaborative tool by three different expert groups, and then analyzed. Inter-rater reliability was analyzed through simulated cases and reported as weighted and unweighted κ-statistics. Results: The mean unweighted κ-value across the three cases was 0.44 and the mean accuracy of the tool was 61%. Conclusions: The tool showed improved collaboration ability and more accurate resource assessment, with acceptable reliability and validity, to be used as a foundation for resource assessment before major events and mass gatherings in a simulated environment. However, the results also indicate the challenges of creating measurable values from simulated cases. A study on real events could provide higher reliability but needs, on the other hand, an already developed tool.

  9. A software tool to estimate the dynamic behaviour of the IP{sup 2}C samples as sensors for didactic purposes

    Energy Technology Data Exchange (ETDEWEB)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E., E-mail: nicola.pitrone@diees.unict.i [Dipartimento di Ingegneria Elettrica Elettronica e dei Sistemi -University of Catania V.le A. Doria 6, 95125, Catania (Italy)

    2010-07-01

    Ionic Polymer Polymer Composites (IP{sup 2}Cs) are emerging materials used to realize motion actuators and sensors. In the former case a voltage input causes the membrane to bend, while in the latter case bending an IP{sup 2}C membrane produces a voltage output. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP{sup 2}Cs working in air. In the proposed tool, the geometrical quantities that rule the sensing properties of IP{sup 2}C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to give a useful tool that allows the user to understand the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP{sup 2}C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers in this new class of transducers; moreover, it can support the educational activity of students involved in advanced academic courses.

  10. Costing support and cost control in manufacturing. A cost estimation tool applied in the sheet metal domain.

    OpenAIRE

    ten Brinke, E.

    2002-01-01

    In the product development cycle several engineering tasks like design, process planning and production planning have to be executed. The execution of these tasks mainly involves information processing and decision-making. Because cost is an important factor in manufacturing, adequate information about costs is extremely valuable for all engineering tasks. Therefore, a cost estimation system for the generation of cost information and for cost control, integrated in the product development cy...

  11. Comparative Benchmark Dose Modeling as a Tool to Make the First Estimate of Safe Human Exposure Levels to Lunar Dust

    Science.gov (United States)

    James, John T.; Lam, Chiu-wing; Scully, Robert R.

    2013-01-01

    Brief exposures of Apollo astronauts to lunar dust occasionally elicited upper respiratory irritation; however, no limits were ever set for prolonged exposure to lunar dust. Habitats for exploration, whether mobile or fixed, must be designed to limit human exposure to lunar dust to safe levels. We have used a new technique we call Comparative Benchmark Dose Modeling to estimate safe exposure limits for lunar dust collected during the Apollo 14 mission.

  12. The construction of a decision tool to analyse local demand and local supply for GP care using a synthetic estimation model.

    Science.gov (United States)

    de Graaf-Ruizendaal, Willemijn A; de Bakker, Dinny H

    2013-10-27

    This study addresses the growing academic and policy interest in the appropriate provision of local healthcare services to the healthcare needs of local populations to increase health status and decrease healthcare costs. However, for most local areas information on the demand for primary care and supply is missing. The research goal is to examine the construction of a decision tool which enables healthcare planners to analyse local supply and demand in order to arrive at a better match. National sample-based medical record data of general practitioners (GPs) were used to predict the local demand for GP care based on local populations using a synthetic estimation technique. Next, the surplus or deficit in local GP supply was calculated using the national GP registry. Subsequently, a dynamic internet tool was built to present demand, supply and the confrontation between supply and demand regarding GP care for local areas and their surroundings in the Netherlands. Regression analysis showed a significant relationship between sociodemographic predictors of postcode areas and GP consultation time (F [14, 269,467] = 2,852.24; P < 0.001), and demand for GP care was estimated for all postcode areas with >1,000 inhabitants in the Netherlands, covering 97% of the total population. Confronting these estimated demand figures with the actual GP supply yielded the average GP workload and the number of full-time equivalent (FTE) GPs in surplus or deficit for local areas to cover the demand for GP care. An estimated shortage of one FTE GP or more was prevalent in about 19% of the postcode areas with >1,000 inhabitants if the surrounding postcode areas were taken into consideration. Underserved areas were mainly found in rural regions. The constructed decision tool is freely accessible on the Internet and can be used as a starting point in the discussion on primary care service provision in local communities and it can make a considerable contribution to a primary care system which provides care when and where people need it.

  13. Lingual palpation for porcine cysticercosis: a rapid epidemiological tool for estimating prevalence and community risk in Africa.

    Science.gov (United States)

    Guyatt, Helen L; Fèvre, Eric M

    2016-10-01

    To assess the association between the prevalence of tongue cyst-positive and antigen-positive pigs across different settings in Africa, and to evaluate whether examining pigs for cysts could be used as a rapid surveillance tool for identifying geographical areas with a higher probability of high transmission of cysticercosis. Published data were collated from 26 study sites across Africa that reported the prevalence of porcine cysticercosis by both lingual and serological examinations. The study sites were located in 10 countries across Africa. Seroprevalence rates ranged from 4% to 41%. Despite the varied study sites, the relationship between the two variables was highly consistent and suggests identification of tongue cysts may be useful for cysticercosis surveillance. We found that all areas with more than 10% of pigs having cysts in their tongues had at least 30% seroprevalence (PPV of 100%), although this cut-off is less reliable at predicting that an area is of low transmission (NPV of 84%). Assessing the prevalence of tongue cyst-positive pigs is a potential rapid epidemiological tool for identifying areas at high risk of cysticercosis, although further refinement and validation are required using standardised data sets. © 2016 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  14. Culvert Analysis Program Graphical User Interface 1.0--A preprocessing and postprocessing tool for estimating flow through culvert

    Science.gov (United States)

    Bradley, D. Nathan

    2013-01-01

    The peak discharge of a flood can be estimated from the elevation of high-water marks near the inlet and outlet of a culvert after the flood has occurred. This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks on trees or buildings. When combined with the cross-sectional geometry of the channel upstream from the culvert and the culvert size, shape, roughness, and orientation, the high-water marks define a water-surface profile that can be used to estimate the peak discharge by using the methods described by Bodhaine (1968). This type of measurement is in contrast to a “direct” measurement of discharge made during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a streamgage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the streamgage, resulting in more accurate computation of high flows. The Culvert Analysis Program (CAP) (Fulford, 1998) is a command-line program written in Fortran for computing peak discharges and culvert rating surfaces or curves. CAP reads input data from a formatted text file and prints results to another formatted text file. Preparing and correctly formatting the input file may be time-consuming and prone to errors. This document describes the CAP graphical user interface (GUI)—a modern, cross-platform, menu-driven application that prepares the CAP input file, executes the program, and helps the user interpret the output

  15. Development of tools for evaluating rainfall estimation models in real- time using the Integrated Meteorological Observation Network in Castilla y León (Spain)

    Science.gov (United States)

    Merino, Andres; Guerrero-Higueras, Angel Manuel; López, Laura; Gascón, Estibaliz; Sánchez, José Luis; Lorente, José Manuel; Marcos, José Luis; Matía, Pedro; Ortiz de Galisteo, José Pablo; Nafría, David; Fernández-González, Sergio; Weigand, Roberto; Hermida, Lucía; García-Ortega, Eduardo

    2014-05-01

    The integration of various public and private observation networks into the Observation Network of Castile-León (ONet_CyL), Spain, allows us to monitor risks in real time. One of the most frequent risks in this region is severe precipitation. Thus, the data from the network allow us to determine the area where precipitation was registered and also to know the areas with precipitation in real time. The observation network is managed with a Linux system. The observation platform makes it possible to consult the observation data at a specific point in the region, or otherwise to see the spatial distribution of the precipitation in a user-defined area and time interval. In this study, we compared several rainfall estimation models, based on satellite data for Castile-León, with precipitation data from the meteorological observation network. The rainfall estimation models obtained from the meteorological satellite data provide us with a precipitation field covering a wide area, although their operational use requires a prior evaluation using ground truth data. The aim is to develop a real-time evaluation tool for rainfall estimation models that allows us to monitor the accuracy of their forecasts. This tool makes it possible to visualise different Skill Scores (Probability of Detection, False Alarm Ratio and others) of each rainfall estimation model in real time, thereby not only allowing us to know the areas where the rainfall models indicate precipitation, but also enabling the validation of the models in real time for each specific meteorological situation. Acknowledgements The authors would like to thank the Regional Government of Castile-León for its financial support through the project LE220A11-2. This study was supported by the following grants: GRANIMETRO (CGL2010-15930); MICROMETEO (IPT-310000-2010-22).
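
    The categorical skill scores mentioned above (Probability of Detection and False Alarm Ratio) are computed from a yes/no contingency table between each satellite rainfall estimate and the ground observations, for example as in the sketch below; the rain/no-rain threshold and the data are placeholders.

```python
# POD and FAR from a rain/no-rain contingency table.
import numpy as np

def contingency_counts(estimated_mm, observed_mm, threshold=0.1):
    est = np.asarray(estimated_mm) >= threshold
    obs = np.asarray(observed_mm) >= threshold
    hits = np.sum(est & obs)
    false_alarms = np.sum(est & ~obs)
    misses = np.sum(~est & obs)
    return hits, false_alarms, misses

def pod(hits, false_alarms, misses):
    return hits / (hits + misses) if (hits + misses) else np.nan

def far(hits, false_alarms, misses):
    return false_alarms / (hits + false_alarms) if (hits + false_alarms) else np.nan

est = [0.0, 1.2, 3.5, 0.0, 0.4, 2.1]   # satellite estimates at station locations (placeholder)
obs = [0.0, 0.8, 4.0, 0.6, 0.0, 1.5]   # gauge observations (placeholder)
h, f, m = contingency_counts(est, obs)
print("POD =", pod(h, f, m), "FAR =", far(h, f, m))
```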

  16. FPGA-based fused smart-sensor for tool-wear area quantitative estimation in CNC machine inserts.

    Science.gov (United States)

    Trejo-Hernandez, Miguel; Osornio-Rios, Roque Alfredo; de Jesus Romero-Troncoso, Rene; Rodriguez-Donate, Carlos; Dominguez-Gonzalez, Aurelio; Herrera-Ruiz, Gilberto

    2010-01-01

    Manufacturing processes are of great relevance nowadays, when there is a constant claim for better productivity with high quality at low cost. The contribution of this work is the development of a fused smart-sensor, based on FPGA to improve the online quantitative estimation of flank-wear area in CNC machine inserts from the information provided by two primary sensors: the monitoring current output of a servoamplifier, and a 3-axis accelerometer. Results from experimentation show that the fusion of both parameters makes it possible to obtain three times better accuracy when compared with the accuracy obtained from current and vibration signals, individually used.

  17. Lattice energy calculation - A quick tool for screening of cocrystals and estimation of relative solubility. Case of flavonoids

    Science.gov (United States)

    Kuleshova, L. N.; Hofmann, D. W. M.; Boese, R.

    2013-03-01

    Cocrystals (or multicomponent crystals) have physico-chemical properties that are different from crystals of pure components. This is significant in drug development, since the desired properties, e.g. solubility, stability and bioavailability, can be tailored by binding two substances into a single crystal without chemical modification of an active component. Here, the FLEXCRYST program suite, implemented with a data mining force field, was used to estimate the relative stability and, consequently, the relative solubility of cocrystals of flavonoids vs their pure crystals, stored in the Cambridge Structural Database. The considerable potency of this approach for in silico screening of cocrystals, as well as their relative solubility, was demonstrated.

  18. FPGA-Based Fused Smart-Sensor for Tool-Wear Area Quantitative Estimation in CNC Machine Inserts

    Science.gov (United States)

    Trejo-Hernandez, Miguel; Osornio-Rios, Roque Alfredo; de Jesus Romero-Troncoso, Rene; Rodriguez-Donate, Carlos; Dominguez-Gonzalez, Aurelio; Herrera-Ruiz, Gilberto

    2010-01-01

    Manufacturing processes are of great relevance nowadays, when there is a constant claim for better productivity with high quality at low cost. The contribution of this work is the development of a fused smart-sensor, based on FPGA to improve the online quantitative estimation of flank-wear area in CNC machine inserts from the information provided by two primary sensors: the monitoring current output of a servoamplifier, and a 3-axis accelerometer. Results from experimentation show that the fusion of both parameters makes it possible to obtain three times better accuracy when compared with the accuracy obtained from current and vibration signals, individually used. PMID:22319304

  19. MultipLa--a tool for the combined overall estimation of various types of manual handling tasks.

    Science.gov (United States)

    Karlheinz, Schaub; Max, Bierwirth; Michaela, Kugler; Ralph, Bruder

    2012-01-01

    In the 1990s the German Federal Institute for Occupational Safety and Health (FIOSH) published the "Key Indicator Methods" (KIM) for the evaluation of manual material handling tasks. These methods served as the national German implementation of the EU Manual Handling Directive (90/269/EEC). They allow the evaluation of individual handling tasks like lifting or pushing, but KIM tools do not allow the evaluation of combined handling tasks, such as lifting followed by pushing. With respect to the needs at shop-floor level (e.g., logistics), MultipLa tries to bridge that gap by means of an Excel-based worksheet using the KIM philosophy. Several algorithms for the risk assessment have been developed in the past. At the moment MultipLa is in a test phase at several automotive OEMs.

  20. Breast dose reduction for chest CT by modifying the scanning parameters based on the pre-scan size-specific dose estimate (SSDE)

    Energy Technology Data Exchange (ETDEWEB)

    Kidoh, Masafumi; Utsunomiya, Daisuke; Oda, Seitaro; Nakaura, Takeshi; Yuki, Hideaki; Hirata, Kenichiro; Namimoto, Tomohiro; Sakabe, Daisuke; Hatemura, Masahiro; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Faculty of Life Sciences, Honjo, Kumamoto (Japan); Funama, Yoshinori [Kumamoto University, Department of Medical Physics, Faculty of Life Sciences, Honjo, Kumamoto (Japan)

    2017-06-15

    To investigate the usefulness of modifying scanning parameters based on the size-specific dose estimate (SSDE) for breast dose reduction in chest CT. We scanned 26 women with a fixed volume CT dose index (CTDI{sub vol}) protocol (15 mGy) and another 26 with a fixed SSDE protocol (15 mGy) (protocols 1 and 2, respectively). In protocol 2, tube current was calculated based on the patient habitus obtained on scout images. We compared the mean breast dose and the inter-patient breast dose variability, and performed linear regression analysis of the breast dose and the body mass index (BMI) for the two protocols. The mean breast dose was about 35% lower under protocol 2 than under protocol 1 (10.9 mGy vs. 16.8 mGy, p < 0.01). The inter-patient breast dose variability was significantly lower under protocol 2 than under protocol 1 (1.2 mGy vs. 2.5 mGy, p < 0.01). We observed a moderate negative correlation between the breast dose and the BMI under protocol 1 (r = 0.43, p < 0.01); there was no significant correlation (r = 0.06, p = 0.35) under protocol 2. The SSDE-based protocol achieved a reduction in breast dose and in inter-patient breast dose variability. (orig.)
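
    A sketch of the pre-scan logic, assuming the usual SSDE definition (SSDE = conversion factor × CTDI{sub vol}) and a linear relation between tube current and CTDI{sub vol}: the exponential conversion-factor fit below approximates the AAPM Report 204 values for the 32 cm phantom, and all numbers are illustrative rather than the study's actual technique.

```python
# Scale the tube current so that the delivered SSDE matches a prescribed target,
# given the effective diameter measured on the scout image.
import math

def ssde_conversion_factor(effective_diameter_cm):
    """Approximate size conversion factor f such that SSDE = f * CTDIvol."""
    return 3.704 * math.exp(-0.0367 * effective_diameter_cm)

def tube_current_for_target_ssde(target_ssde_mgy, effective_diameter_cm,
                                 reference_ma, reference_ctdivol_mgy):
    """Assumes CTDIvol is proportional to tube current at fixed kV/pitch."""
    f = ssde_conversion_factor(effective_diameter_cm)
    required_ctdivol = target_ssde_mgy / f
    return reference_ma * required_ctdivol / reference_ctdivol_mgy

# Example: 28 cm effective diameter, 15 mGy SSDE target,
# reference technique of 200 mA delivering CTDIvol = 15 mGy
print(tube_current_for_target_ssde(15.0, 28.0, 200.0, 15.0))
```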

  1. Stochastic differential equations as a tool to regularize the parameter estimation problem for continuous time dynamical systems given discrete time measurements.

    Science.gov (United States)

    Leander, Jacob; Lundh, Torbjörn; Jirstrand, Mats

    2014-05-01

    In this paper we consider the problem of estimating parameters in ordinary differential equations given discrete time experimental data. The impact of going from an ordinary to a stochastic differential equation setting is investigated as a tool to overcome the problem of local minima in the objective function. Using two different models, it is demonstrated that by allowing noise in the underlying model itself, the objective functions to be minimized in the parameter estimation procedures are regularized in the sense that the number of local minima is reduced and better convergence is achieved. The advantage of using stochastic differential equations is that the actual states in the model are predicted from data, which allows the predictions to stay close to the data even when the parameters in the model are incorrect. The extended Kalman filter is used as a state estimator and sensitivity equations are provided to give an accurate calculation of the gradient of the objective function. The method is illustrated using in silico data from the FitzHugh-Nagumo model for excitable media and the Lotka-Volterra predator-prey system. The proposed method performs well on the models considered, and is able to regularize the objective function in both models. This leads to parameter estimation problems with fewer local minima, which can be solved by efficient gradient-based methods.

  2. Health risk estimates for groundwater and soil contamination in the Slovak Republic: a convenient tool for identification and mapping of risk areas.

    Science.gov (United States)

    Fajčíková, K; Cvečková, V; Stewart, A; Rapant, S

    2014-10-01

    We undertook a quantitative estimation of health risks to residents living in the Slovak Republic and exposed to contaminated groundwater (ingestion by adult population) and/or soils (ingestion by adult and child population). Potential risk areas were mapped to give a visual presentation at basic administrative units of the country (municipalities, districts, regions) for easy discussion with policy and decision-makers. The health risk estimates were calculated by US EPA methods, applying threshold values for chronic risk and non-threshold values for cancer risk. The potential health risk was evaluated for As, Ba, Cd, Cu, F, Hg, Mn, NO3 (-), Pb, Sb, Se and Zn for groundwater and As, B, Ba, Be, Cd, Cu, F, Hg, Mn, Mo, Ni, Pb, Sb, Se and Zn for soils. An increased health risk was identified mainly in historical mining areas highly contaminated by geogenic-anthropogenic sources (ore deposit occurrence, mining, metallurgy). Arsenic and antimony were the most significant elements in relation to health risks from groundwater and soil contamination in the Slovak Republic contributing a significant part of total chronic risk levels. Health risk estimation for soil contamination has highlighted the significance of exposure through soil ingestion in children. Increased cancer risks from groundwater and soil contamination by arsenic were noted in several municipalities and districts throughout the country in areas with significantly high arsenic levels in the environment. This approach to health risk estimations and visualization represents a fast, clear and convenient tool for delineation of risk areas at national and local levels.
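
    The US EPA-style screening calculations referred to above reduce, in their simplest form, to a hazard quotient for chronic (non-cancer) risk and a slope-factor product for cancer risk; the sketch below uses generic adult ingestion defaults and IRIS-style arsenic toxicity values purely for illustration, not the exposure assumptions of this study.

```python
# Screening-level hazard quotient and incremental lifetime cancer risk.

def average_daily_dose(conc, intake_rate, exposure_freq_d_per_y,
                       exposure_dur_y, body_weight_kg, averaging_time_d):
    """Average daily dose in mg/kg/day.
    conc in mg/L (water) or mg/kg (soil); intake_rate in L/day or kg/day."""
    return (conc * intake_rate * exposure_freq_d_per_y * exposure_dur_y) / (
        body_weight_kg * averaging_time_d)

def hazard_quotient(dose, reference_dose):
    return dose / reference_dose

def cancer_risk(dose, slope_factor):
    return dose * slope_factor

# Example: arsenic in groundwater at 0.02 mg/L, adult drinking 2 L/day,
# 350 days/year for 30 years, 70 kg body weight.
# Non-cancer dose averages over the exposure duration; cancer dose over a lifetime.
dose_noncancer = average_daily_dose(0.02, 2.0, 350, 30, 70.0, 30 * 365)
dose_cancer = average_daily_dose(0.02, 2.0, 350, 30, 70.0, 70 * 365)
print("HQ =", hazard_quotient(dose_noncancer, 3e-4))   # RfD for arsenic, mg/kg/day
print("risk =", cancer_risk(dose_cancer, 1.5))          # oral slope factor for arsenic
```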

  3. SOAP 2.0: A Tool to Estimate the Photometric and Radial Velocity Variations Induced by Stellar Spots and Plages

    Science.gov (United States)

    Dumusque, X.; Boisse, I.; Santos, N. C.

    2014-12-01

    This paper presents SOAP 2.0, a new version of the Spot Oscillation And Planet (SOAP) code that estimates in a simple way the photometric and radial velocity (RV) variations induced by active regions. The inhibition of the convective blueshift (CB) inside active regions is considered, as well as the limb brightening effect of plages, a quadratic limb darkening law, and a realistic spot and plage contrast ratio. SOAP 2.0 shows that the activity-induced variation of plages is dominated by the inhibition of the CB effect. For spots, this effect becomes significant only for slow rotators. In addition, in the case of a major active region dominating the activity-induced signal, the ratio between the FWHM and the RV peak-to-peak amplitudes of the cross correlation function can be used to infer the type of active region responsible for the signal for stars with v sin i ≤8 km s^-1. A ratio smaller than three implies a spot, while a larger ratio implies a plage. Using the observation of HD 189733, we show that SOAP 2.0 manages to reproduce the activity variation as well as previous simulations when a spot is dominating the activity-induced variation. In addition, SOAP 2.0 also reproduces the activity variation induced by a plage on the slowly rotating star α Cen B, which is not possible using previous simulations. Following these results, SOAP 2.0 can be used to estimate the signal induced by spots and plages, but also to correct for it when a major active region is dominating the RV variation. The work in this paper is based on observations made with the MOST satellite, the HARPS instrument on the ESO 3.6 m telescope at La Silla Observatory (Chile), and the SOPHIE instrument at the Observatoire de Haute Provence (France).

  4. SOAP 2.0: a tool to estimate the photometric and radial velocity variations induced by stellar spots and plages

    Energy Technology Data Exchange (ETDEWEB)

    Dumusque, X. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Boisse, I. [Laboratoire d' Astrophysique de Marseille (UMR 6110), Technopole de Château-Gombert, 38 rue Frédéric Joliot-Curie, F-13388 Marseille Cedex 13 (France); Santos, N. C., E-mail: xdumusque@cfa.harvard.edu [Centro de Astrofìsica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal)

    2014-12-01

    This paper presents SOAP 2.0, a new version of the Spot Oscillation And Planet (SOAP) code that estimates in a simple way the photometric and radial velocity (RV) variations induced by active regions. The inhibition of the convective blueshift (CB) inside active regions is considered, as well as the limb brightening effect of plages, a quadratic limb darkening law, and a realistic spot and plage contrast ratio. SOAP 2.0 shows that the activity-induced variation of plages is dominated by the inhibition of the CB effect. For spots, this effect becomes significant only for slow rotators. In addition, in the case of a major active region dominating the activity-induced signal, the ratio between the FWHM and the RV peak-to-peak amplitudes of the cross correlation function can be used to infer the type of active region responsible for the signal for stars with v sin i ≤8 km s{sup –1}. A ratio smaller than three implies a spot, while a larger ratio implies a plage. Using the observation of HD 189733, we show that SOAP 2.0 manages to reproduce the activity variation as well as previous simulations when a spot is dominating the activity-induced variation. In addition, SOAP 2.0 also reproduces the activity variation induced by a plage on the slowly rotating star α Cen B, which is not possible using previous simulations. Following these results, SOAP 2.0 can be used to estimate the signal induced by spots and plages, but also to correct for it when a major active region is dominating the RV variation.

  5. The Distributed Thermal Perturbation Sensor: A New Tool for In Situ Estimation of Formation Thermal Properties and Geothermal Heat Flux

    Science.gov (United States)

    Freifeld, B. M.; Kryder, L.; Gilmore, K.; Henninges, J.; Onstott, T. C.; Lisa, P.

    2007-12-01

    Variations in geothermal heat flux provide a window into a diverse array of geological processes including plate tectonics and crustal fluid circulation. The Distributed Thermal Perturbation Sensor (DTPS) is a novel device that can simultaneously determine formation thermal properties and heat flux in situ. The device consists of a fiber- optic distributed temperature sensor (DTS) and a heat trace cable installed along the axis of a borehole. To operate the DTPS, the sensor is backfilled into a borehole and the disturbed thermal field is allowed to dissipate. A baseline temperature profile is subsequently recorded. Next, the heat trace cable is used to provide constant heating along the borehole and the thermal transient is recorded. DTS monitoring continues after heating concludes during the ensuing cool-down phase. To obtain in situ estimates for thermal properties and heat flux, simple conductive or conductive-convective models can be used to interpret the data. Given the 1 meter spatial resolution of the DTS - the DTPS provides thermal property and heat flux estimates at similar spatial resolution. To date, the DTPS has been deployed at three continental sites: (1) in the Amargosa Valley, Amargosa, NV, USA, to characterize groundwater flow through fractured volcanic tuffs, (2) in a deep permafrost boring within an Archean mafic volcanic belt at the High Lake Project Site (67°22"N, 110°50"W), Nunavut, Canada, and (3) as part of the monitoring program at CO2SINK, a carbon geosequestration experiment being conducted in Ketzin, Germany. The authors present results from these three sites and discuss potential modalities for future deployment in suboceanic environments.
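
    The abstract notes that simple conductive models can interpret the heating transient. One common choice, assumed here for illustration rather than taken from the authors' analysis, is the infinite line-source solution, in which the late-time temperature rise grows linearly with ln(t) and the thermal conductivity follows from the slope:

        import numpy as np

        def thermal_conductivity_from_heating(t, dT, q):
            """Estimate thermal conductivity from a constant-power heating transient.

            t  : times since heating started (s), late-time portion only
            dT : temperature rise above baseline at one depth (K)
            q  : heater power per unit length of borehole (W/m)

            Infinite line-source model: dT ~ (q / (4*pi*k)) * ln(t) + const,
            so k = q / (4*pi*slope), where slope = d(dT)/d(ln t).
            """
            slope, _ = np.polyfit(np.log(t), dT, 1)
            return q / (4.0 * np.pi * slope)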

  6. The GAAS metagenomic tool and its estimations of viral and microbial average genome size in four major biomes.

    Science.gov (United States)

    Angly, Florent E; Willner, Dana; Prieto-Davó, Alejandra; Edwards, Robert A; Schmieder, Robert; Vega-Thurber, Rebecca; Antonopoulos, Dionysios A; Barott, Katie; Cottrell, Matthew T; Desnues, Christelle; Dinsdale, Elizabeth A; Furlan, Mike; Haynes, Matthew; Henn, Matthew R; Hu, Yongfei; Kirchman, David L; McDole, Tracey; McPherson, John D; Meyer, Folker; Miller, R Michael; Mundt, Egbert; Naviaux, Robert K; Rodriguez-Mueller, Beltran; Stevens, Rick; Wegley, Linda; Zhang, Lixin; Zhu, Baoli; Rohwer, Forest

    2009-12-01

    Metagenomic studies characterize both the composition and diversity of uncultured viral and microbial communities. BLAST-based comparisons have typically been used for such analyses; however, sampling biases, high percentages of unknown sequences, and the use of arbitrary thresholds to find significant similarities can decrease the accuracy and validity of estimates. Here, we present Genome relative Abundance and Average Size (GAAS), a complete software package that provides improved estimates of community composition and average genome length for metagenomes in both textual and graphical formats. GAAS implements a novel methodology to control for sampling bias via length normalization, to adjust for multiple BLAST similarities by similarity weighting, and to select significant similarities using relative alignment lengths. In benchmark tests, the GAAS method was robust to both high percentages of unknown sequences and to variations in metagenomic sequence read lengths. Re-analysis of the Sargasso Sea virome using GAAS indicated that standard methodologies for metagenomic analysis may dramatically underestimate the abundance and importance of organisms with small genomes in environmental systems. Using GAAS, we conducted a meta-analysis of microbial and viral average genome lengths in over 150 metagenomes from four biomes to determine whether genome lengths vary consistently between and within biomes, and between microbial and viral communities from the same environment. Significant differences between biomes and within aquatic sub-biomes (oceans, hypersaline systems, freshwater, and microbialites) suggested that average genome length is a fundamental property of environments driven by factors at the sub-biome level. The behavior of paired viral and microbial metagenomes from the same environment indicated that microbial and viral average genome sizes are independent of each other, but indicative of community responses to stressors and environmental conditions.
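
    The following sketch illustrates the kind of length normalization and similarity weighting GAAS describes: each read's weight is shared across its significant hits in proportion to score, summed per reference genome, and divided by genome length so that small genomes are not under-counted. The data structures are simplified assumptions, not the actual GAAS implementation:

        from collections import defaultdict

        def relative_abundance(hits, genome_lengths):
            """hits: list of (read_id, genome_id, score); genome_lengths: {genome_id: bp}.
            Returns length-normalized relative abundances that sum to 1."""
            per_read = defaultdict(list)
            for read, genome, score in hits:
                per_read[read].append((genome, score))

            raw = defaultdict(float)
            for matches in per_read.values():
                total = sum(s for _, s in matches)
                for genome, score in matches:
                    raw[genome] += score / total        # similarity weighting

            norm = {g: raw[g] / genome_lengths[g] for g in raw}   # length normalization
            z = sum(norm.values())
            return {g: v / z for g, v in norm.items()}

        # Hypothetical example: one read hitting two phage genomes of very different size
        print(relative_abundance([("r1", "phageA", 60.0), ("r1", "phageB", 40.0)],
                                 {"phageA": 40000, "phageB": 150000}))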

  7. The GAAS metagenomic tool and its estimations of viral and microbial average genome size in four major biomes.

    Directory of Open Access Journals (Sweden)

    Florent E Angly

    2009-12-01

    Full Text Available Metagenomic studies characterize both the composition and diversity of uncultured viral and microbial communities. BLAST-based comparisons have typically been used for such analyses; however, sampling biases, high percentages of unknown sequences, and the use of arbitrary thresholds to find significant similarities can decrease the accuracy and validity of estimates. Here, we present Genome relative Abundance and Average Size (GAAS), a complete software package that provides improved estimates of community composition and average genome length for metagenomes in both textual and graphical formats. GAAS implements a novel methodology to control for sampling bias via length normalization, to adjust for multiple BLAST similarities by similarity weighting, and to select significant similarities using relative alignment lengths. In benchmark tests, the GAAS method was robust to both high percentages of unknown sequences and to variations in metagenomic sequence read lengths. Re-analysis of the Sargasso Sea virome using GAAS indicated that standard methodologies for metagenomic analysis may dramatically underestimate the abundance and importance of organisms with small genomes in environmental systems. Using GAAS, we conducted a meta-analysis of microbial and viral average genome lengths in over 150 metagenomes from four biomes to determine whether genome lengths vary consistently between and within biomes, and between microbial and viral communities from the same environment. Significant differences between biomes and within aquatic sub-biomes (oceans, hypersaline systems, freshwater, and microbialites) suggested that average genome length is a fundamental property of environments driven by factors at the sub-biome level. The behavior of paired viral and microbial metagenomes from the same environment indicated that microbial and viral average genome sizes are independent of each other, but indicative of community responses to stressors and

  8. DNA barcoding, microarrays and next generation sequencing: recent tools for genetic diversity estimation and authentication of medicinal plants.

    Science.gov (United States)

    Sarwat, Maryam; Yamdagni, Manu Mayank

    2016-01-01

    DNA barcoding, microarray technology and next generation sequencing have emerged as promising tools for the elucidation of plant genetic diversity and its conservation. They are proving to be immensely helpful in authenticating the useful medicinal plants for herbal drug preparations. These newer versions of molecular markers utilize short genetic markers in the genome to characterize the organism to a particular species. This has the potential not only to classify the known and yet unknown species but also has a promising future to link the medicinally important plants according to their properties. The newer trends being followed in DNA chips and barcoding pave the way for a future with many different possibilities. Several of these possibilities might be: characterization of unknown species in a considerably less time than usual, identification of newer medicinal properties possessed by the species and also updating the data of the already existing but unnoticed properties. This can assist us to cure many different diseases and will also generate novel opportunities in medicinal drug delivery and targeting.

  9. Submerged macrophyte communities in the Forsmark area. Building of a GIS application as a tool for biomass estimations

    Energy Technology Data Exchange (ETDEWEB)

    Fredriksson, Ronny [Univ. of Kalmar (Sweden)

    2005-12-15

    The aim of this study was to compile the information from previous studies to produce a GIS application that both illustrates the distribution of different vegetation communities and also makes it possible to estimate the total biomass of the different vegetation communities and their associated fauna. The GIS application was created by means of the software Arc View 3.3 by Environmental Systems Research Institute, Inc. Distribution readings and quantitative data of submerged macrophyte communities and their associated fauna were obtained from studies by Kautsky et al. and by Borgiel. Information about the macrophyte distribution in Laangoersviken, located in the northern parts of Kallrigafjaerden, was obtained from a report by Upplandsstiftelsen. Information about water depth and bottom substrate was available as a USGS DEM file, produced by the Geological Survey of Sweden. Complementary data on the covering degree of submerged vegetation were obtained from a study using an underwater video camera by Tobiasson. Quantitative data on macrophyte and faunal biomass were either obtained from the primary SKB data base SICADA or directly from reports. Samples were compiled and analysed according to dominating vegetation. The work was carried out as follows: Where information about the bottom substrate was available, polygons were created by means of the substrate shape file and depth grid from the Geological Survey of Sweden. The vegetation community and the covering degree on a certain depth and substrate combination were determined by compiled information from studies by Kautsky and by Borgiel. All observations from a certain bottom substrate were analysed to find the dominating vegetation within different depth ranges. After determining the dominating vegetation, the covering degrees of different macrophyte classes within each depth range were calculated as a mean of all readings. Areas without information about the bottom substrate, but still adjacent to areas included in the

  10. Exonic Splicing Mutations Are More Prevalent than Currently Estimated and Can Be Predicted by Using In Silico Tools.

    Directory of Open Access Journals (Sweden)

    Omar Soukarieh

    2016-01-01

    Full Text Available The identification of a causal mutation is essential for molecular diagnosis and clinical management of many genetic disorders. However, even if next-generation exome sequencing has greatly improved the detection of nucleotide changes, the biological interpretation of most exonic variants remains challenging. Moreover, particular attention is typically given to protein-coding changes, often neglecting the potential impact of exonic variants on RNA splicing. Here, we used exon 10 of MLH1, a gene implicated in hereditary cancer, as a model system to assess the prevalence of RNA splicing mutations among all single-nucleotide variants identified in a given exon. We performed comprehensive minigene assays and analyzed patients' RNA when available. Our study revealed a staggering number of splicing mutations in MLH1 exon 10 (77% of the 22 analyzed variants), including mutations directly affecting splice sites and, particularly, mutations altering potential splicing regulatory elements (ESRs). We then used this thoroughly characterized dataset, together with experimental data derived from previous studies on BRCA1, BRCA2, CFTR and NF1, to evaluate the predictive power of 3 in silico approaches recently described as promising tools for pinpointing ESR-mutations. Our results indicate that ΔtESRseq and ΔHZEI-based approaches not only discriminate which variants affect splicing, but also predict the direction and severity of the induced splicing defects. In contrast, the ΔΨ-based approach did not show a compelling predictive power. Our data indicate that exonic splicing mutations are more prevalent than currently appreciated and that they can now be predicted by using bioinformatics methods. These findings have implications for all genetically-caused diseases.

  11. SOAP 2.0: A tool to estimate the photometric and radial velocity variations induced by stellar spots and plages

    CERN Document Server

    Dumusque, X; Santos, N C

    2014-01-01

    This paper presents SOAP 2.0, a new version of the SOAP code that estimates in a simple way the photometric and radial velocity variations induced by active regions. The inhibition of the convective blueshift inside active regions is considered, as well as the limb brightening effect of plages, a quadratic limb darkening law, and a realistic spot and plage contrast ratio. SOAP 2.0 shows that the activity-induced variation of plages is dominated by the inhibition of the convective blueshift effect. For spots, this effect becomes significant only for slow rotators. In addition, in the case of a major active region dominating the activity-induced signal, the ratio between the full width at half maximum (FWHM) and the RV peak-to-peak amplitudes of the cross correlation function can be used to infer the type of active region responsible for the signal for stars with v sin i ≤ 8 km s^-1. A ratio smaller than three implies a spot, while a larger ratio implies a plage. Using the observation of HD189733, we show that SOA...

  12. The iron $K_\\alpha$ lines as a tool for magnetic field estimations in non-flat accretion flows

    CERN Document Server

    Zakharov, A F; Bao, Y

    2004-01-01

    Observations of AGNs and microquasars by ASCA, RXTE, Chandra and XMM-Newton indicate the existence of broad X-ray emission lines of ionized heavy elements in their spectra. Such spectral lines were discovered also in X-ray spectra of neutron stars and X-ray afterglows of GRBs. Recently, Zakharov et al. (MNRAS, 2003, 342, 1325) described a procedure to estimate an upper limit of the magnetic fields in regions from which X-ray photons are emitted. The authors simulated typical profiles of the iron $K_\\alpha$ line in the presence of magnetic field and compared them with observational data in the framework of the widely accepted accretion disk model. Here we further consider typical Zeeman splitting in the framework of a model of non-flat accretion flows, which is a generalization of previous consideration into non-equatorial plane motion of particles emitting X-ray photons. Using perspective facilities of space borne instruments (e.g. Constellation-X mission) a better resolution of the blue peak structure of iro...

  13. Cosmological Perturbation Theory as a Tool for Estimating Box-Scale Effects in N-body Simulations

    CERN Document Server

    Orban, Chris

    2013-01-01

    In performing cosmological N-body simulations, it is widely appreciated that the growth of structure on the largest scales within a simulation box will be inhibited by the finite size of the simulation volume. Following ideas set forth in Seto 1999, this paper shows that standard (a.k.a. 1-loop) cosmological perturbation theory (SPT) can be used to predict at an order-of-magnitude level the deleterious effect of the box scale on the power spectrum of density fluctuations in simulation volumes. Alternatively, this approach can be used to quickly estimate post facto the effect of the box scale on power spectrum results from existing simulations. In this way SPT can help determine whether larger box sizes or other more-sophisticated methods are needed to achieve a particular level of precision for a given application (e.g. simulations to measure the non-linear evolution of baryon acoustic oscillations). I focus on SPT in this note and show that its predictions are order-of-magnitude accurate compared to N-body s...
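
    As a rough, back-of-the-envelope complement to the SPT argument, one can estimate how much large-scale power a finite box simply cannot represent by integrating the linear power spectrum over modes with k < k_box = 2*pi/L. The sketch below uses a toy power-law spectrum purely for illustration; the paper's 1-loop SPT estimate of the induced error in the nonlinear power spectrum is more involved:

        import numpy as np

        def missing_variance(pk, L_box, k_min=1e-5, n=2000):
            """Density-field variance carried by modes absent from a box of side L_box:
            sigma^2 = (1 / 2 pi^2) * integral_0^{k_box} P(k) k^2 dk (trapezoid rule)."""
            k_box = 2.0 * np.pi / L_box
            k = np.logspace(np.log10(k_min), np.log10(k_box), n)
            integrand = pk(k) * k**2 / (2.0 * np.pi**2)
            return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(k)))

        # Toy spectrum (illustrative only, not a realistic LCDM P(k))
        toy_pk = lambda k: 2.0e4 * k / (1.0 + (k / 0.02) ** 3)
        print(missing_variance(toy_pk, L_box=100.0))   # power lost in a 100 Mpc/h box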

  14. Noise estimation in infrared image sequences: a tool for the quantitative evaluation of the effectiveness of registration algorithms.

    Science.gov (United States)

    Agostini, Valentina; Delsanto, Silvia; Knaflitz, Marco; Molinari, Filippo

    2008-07-01

    Dynamic infrared imaging has been proposed in literature as an adjunctive technique to mammography in breast cancer diagnosis. It is based on the acquisition of hundreds of consecutive thermal images with a frame rate ranging from 50 to 200 frames/s, followed by the harmonic analysis of temperature time series at each image pixel. However, the temperature fluctuation due to blood perfusion, which is the signal of interest, is small compared to the signal fluctuation due to subject movements. Hence, before extracting the time series describing temperature fluctuations, it is fundamental to realign the thermal images to attenuate motion artifacts. In this paper, we describe a method for the quantitative evaluation of any kind of feature-based registration algorithm on thermal image sequences, provided that an estimation of local velocities of reference points on the skin is available. As an example of evaluation of a registration algorithm, we report the evaluation of the SNR improvement obtained by applying a nonrigid piecewise linear algorithm.

  15. The Asset Drivers, Well-being Interaction Matrix (ADWIM): A participatory tool for estimating future impacts on ecosystem services and livelihoods

    Directory of Open Access Journals (Sweden)

    T.D. Skewes

    2016-01-01

    Full Text Available Building an effective response for communities to climate change requires decision-support tools that deliver information which stakeholders find relevant for exploring potential short and long-term impacts on livelihoods. Established principles suggest that to successfully communicate scientific information, such tools must be transparent, replicable, relevant, credible, flexible, affordable and unbiased. In data-poor contexts typical of developing countries, they should also be able to integrate stakeholders’ knowledge and values, empowering them in the process. We present a participatory tool, the Asset Drivers Well-being Interaction Matrix (ADWIM), which estimates future impacts on ecosystem goods and services (EGS) and communities’ well-being through the cumulative effects of system stressors. ADWIM consists of two modelling steps: an expert-informed cumulative impact assessment for EGS, which is then integrated with a stakeholder-informed EGS valuation process carried out during adaptation planning workshops. We demonstrate the ADWIM process using examples from Nusa Tenggara Barat Province (NTB) in eastern Indonesia. The semi-quantitative results provide an assessment of the relative impacts on EGS and human well-being under the ‘Business as Usual’ scenario of climate change and human population growth at different scales in NTB, information that is subsequently used for designing adaptation strategies. Based on these experiences, we discuss the strengths and weaknesses of ADWIM relative to principles of effective science communication and ecosystem services modelling. ADWIM’s apparent attributes as an analysis, decision support and communication tool promote its utility for participatory adaptation planning. We also highlight its relevance as a ‘boundary object’ to provide learning and reflection about the current and likely future importance of EGS to livelihoods in NTB.

  16. SVPWM Technique with Varying DC-Link Voltage for Common Mode Voltage Reduction in a Matrix Converter and Analytical Estimation of its Output Voltage Distortion

    Science.gov (United States)

    Padhee, Varsha

    Common Mode Voltage (CMV) in any power converter has been the major contributor to premature motor failures, bearing deterioration, shaft voltage build up and electromagnetic interference. Intelligent control methods like Space Vector Pulse Width Modulation (SVPWM) techniques provide immense potential and flexibility to reduce CMV, thereby targeting all the aforementioned problems. Other solutions like passive filters, shielded cables and EMI filters add to the volume and cost metrics of the entire system. Smart SVPWM techniques, therefore, come with a very important advantage of being an economical solution. This thesis discusses a modified space vector technique applied to an Indirect Matrix Converter (IMC) which results in the reduction of common mode voltages and other advanced features. The conventional indirect space vector pulse-width modulation (SVPWM) method of controlling matrix converters involves the usage of two adjacent active vectors and one zero vector for both rectifying and inverting stages of the converter. By suitable selection of space vectors, the rectifying stage of the matrix converter can generate different levels of virtual DC-link voltage. This capability can be exploited for operation of the converter in different ranges of modulation indices for varying machine speeds. This results in lower common mode voltage and improves the harmonic spectrum of the output voltage, without increasing the number of switching transitions as compared to conventional modulation. To summarize, it can be said that the responsibility of formulating output voltages with a particular magnitude and frequency has been transferred solely to the rectifying stage of the IMC. Estimation of degree of distortion in the three phase output voltage is another facet discussed in this thesis. An understanding of the SVPWM technique and the switching sequence of the space vectors in detail gives the potential to estimate the RMS value of the switched output voltage of any
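
    For reference, the common-mode voltage that the modulation strategy tries to limit is simply the average of the three output phase voltages, so lowering the instantaneous virtual DC-link level directly lowers the CMV excursions. A trivial sketch with placeholder voltages (not values from the thesis):

        def common_mode_voltage(va, vb, vc):
            """Instantaneous common-mode voltage of a three-phase output (average of the phases)."""
            return (va + vb + vc) / 3.0

        # An active vector with a reduced virtual DC link (pole voltages +/-200 V)
        print(common_mode_voltage(200.0, -200.0, -200.0))   # about -66.7 V
        # The same vector with a higher virtual DC link (pole voltages +/-300 V)
        print(common_mode_voltage(300.0, -300.0, -300.0))   # -100 V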

  17. Estimation of metal balances: a tool for improving management on a farm in the polluted area of Copsa Mica

    Science.gov (United States)

    Olimpia Vrinceanu, Nicoleta; Simota, Catalin; Motelica, Dumitru-Marian; Dumitru, Mihail; Ignat, Petru; Vrinceanu, Andrei; Mircea Rotaru, Lucian

    2015-04-01

    Long-term accumulation of heavy metals in arable ecosystems of the Copsa Mica area negatively affects soil fertility and product quality. Sustainable heavy metal management in these agro-ecosystems ensures that the soil continues to fulfill its functions and to provide its ecosystem services (especially supporting and provisioning services). An analysis of the input and output flows of heavy metals in agro-ecosystems and of their resulting accumulation is necessary to define strategies that ensure sustainable management of these metals in agricultural systems. The aim of this study was to calculate the farm-gate and barn balances for the heavy metals (Cd, Pb and Zn) using data from a farm located in the polluted area of Copşa Mică. For all heavy metals (Cd, Pb and Zn) the farm-gate balances are negative; the export of metal from the farm occurred mainly through manure. The barn balance for cadmium was positive, indicating an accumulation of metal in the system. Inputs of cadmium to the system were estimated at 163.67 g Cd / year, and losses of cadmium from the system occurred mainly through manure (77.22 g Cd / year). Both the lead and zinc barn balances are negative. Externalization of lead and zinc from the system was also achieved through manure (969 g Pb / year and 2390 g Zn / year). Monitoring metal balances at different scales (farm-gate, barn) proved to be a successful way of identifying farm management issues not revealed by determining metal balances at the farm gate alone. The main finding was that substantial amounts of cadmium, lead and zinc were released from internal sources, mainly through fodder obtained from the farm's own land (some plots are located in the polluted area). Manure is the main contributor to the outflows of all three heavy metals. Using this manure as an organic fertilizer could lead to accumulation of cadmium in soil, with major risks for soil fertility and crop quality.
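
    Using the figures quoted above, the cadmium barn balance can be checked with simple input-minus-output arithmetic (assuming, as the abstract implies, that manure is the dominant quantified output):

        # Barn balance = inputs - outputs (g Cd per year), numbers from the abstract
        cd_inputs = 163.67        # g Cd / year entering the barn system
        cd_out_manure = 77.22     # g Cd / year leaving with manure
        cd_balance = cd_inputs - cd_out_manure
        print(f"Cd barn balance: {cd_balance:+.2f} g/year")   # positive, so Cd accumulates (~+86 g/year)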

  18. Estimation of Carbon Emission Reduction in Yulong County of Yunnan Province

    Institute of Scientific and Technical Information of China (English)

    郭飞; 香宝; 马广文; 李双权

    2009-01-01

    Major components of carbon emission reductions in Yulong County include conversion from farmland to forests, afforestation, small hydro power, and usage of alternative energy including solar power and methane digesters. The carbon sinks of afforestation have long-term functionality, providing continuous carbon sink services. Usage of alternative energy can effectively cut down the consumption of wood for combustion and reduce carbon emissions. Wood is the major energy source used by families of Yulong County, where most of the population are farmers; therefore, the usage of alternative energy is highly necessary. According to the research on the situation in Yulong County in 2005, the carbon emission reduction from small hydro power, solar power and methane digesters was 14.43×10^3 tons C, 66.24 tons C and 7.74×10^3 tons C, respectively. The total carbon emission reduction from the use of small hydro power and alternative energy was 22.24×10^3 tons C. Meanwhile, scenarios of how forests could grow and alternative energy could be used were identified, and scenario simulation was carried out to estimate the effects of afforestation. The carbon sink could be 21.07×10^3 tons C in five years and 24.92×10^3 tons C in ten years. Small hydro power contributes most to the total emission reduction, followed by methane digesters and afforestation, while reforestation and grasslands contribute the least. As time goes on, the contribution from small hydro power and methane digesters will increase.
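
    The 2005 total quoted above can be reproduced by summing the three terms (small hydro power, solar power, and methane digesters):

        # Carbon emission reductions in 2005 (tonnes C), from the abstract
        small_hydro = 14.43e3
        solar_power = 66.24
        methane_digesters = 7.74e3
        total = small_hydro + solar_power + methane_digesters
        print(f"{total / 1e3:.2f} x 10^3 t C")   # 22.24 x 10^3 t C, matching the quoted total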

  19. Use of modified threat reduction assessments to estimate success of conservation measures within and adjacent to Kruger National Park, South Africa.

    Science.gov (United States)

    Anthony, Brandon P

    2008-12-01

    The importance of biodiversity as natural capital for economic development and sustaining human welfare is well documented. Nevertheless, resource degradation rates and persistent deterioration of human welfare in developing countries is increasingly worrisome. Developing effective monitoring and evaluation schemes and measuring biodiversity loss continue to pose unique challenges, particularly when there is a paucity of historical data. Threat reduction assessment (TRA) has been proposed as a method to measure conservation success and as a proxy measurement of conservation impact, monitoring threats to resources rather than changes to biological parameters themselves. This tool is considered a quick, practical alternative to more cost- and time-intensive approaches, but has inherent weaknesses. I conducted TRAs to evaluate the effectiveness of Kruger National Park (KNP) and Limpopo Province, South Africa, in mitigating threats to biodiversity from 1994 to 2004 in 4 geographical areas. I calculated TRA index values in these TRAs by using the original scoring developed by Margoluis and Salafsky (2001) and a modified scoring system that assigned negative mitigation values to incorporate new or worsening threats. Threats were standardized to allow comparisons across the sites. Modified TRA index values were significantly lower than values derived from the original scoring exercise. Five of the 11 standardized threats were present in all 4 assessment areas, 2 were restricted to KNP, 2 to Limpopo Province, and 2 only to Malamulele municipality. These results indicate, first, the need to integrate negative mitigation values into TRA scoring. By including negative values, investigators will be afforded a more accurate picture of biodiversity threats and of temporal and spatial trends across sites. Where the original TRA scoring was used to measure conservation success, reevaluation of these cases with the modified scoring is recommended. Second, practitioners must
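
    A minimal sketch of the kind of modified TRA index calculation described here, in which each standardized threat is ranked (for example by area, intensity, and urgency) and the percentage of the threat mitigated may be negative for new or worsening threats. The ranking scheme and field names are assumptions based on Margoluis and Salafsky (2001), not a reproduction of the author's scoring sheets:

        def tra_index(threats):
            """threats: list of dicts with 'area', 'intensity', 'urgency' ranks and
            'pct_met', the percentage of the threat mitigated (negative values are
            allowed in the modified scoring for new or worsening threats)."""
            total_ranking = sum(t["area"] + t["intensity"] + t["urgency"] for t in threats)
            raw = sum((t["area"] + t["intensity"] + t["urgency"]) * t["pct_met"] / 100.0
                      for t in threats)
            return 100.0 * raw / total_ranking

        # Hypothetical example with one worsening threat (negative mitigation)
        threats = [
            {"area": 3, "intensity": 2, "urgency": 1, "pct_met": 75},
            {"area": 1, "intensity": 3, "urgency": 2, "pct_met": -20},
        ]
        print(round(tra_index(threats), 1))   # 27.5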

  20. Technical measures without enforcement tools: is there any sense? A methodological approach for the estimation of passive net length in small scale fisheries

    Directory of Open Access Journals (Sweden)

    A. LUCCHETTI

    2014-09-01

    Full Text Available Passive nets are currently among the most important fishing gears largely used along the Mediterranean coasts by the small scale fisheries sector. The fishing effort exerted by this sector is strongly correlated with net dimensions. Therefore, the use of passive nets is worldwide managed by defining net length and net drop. The EC Reg. 1967/2006 reports that the length of bottom-set and drifting nets may be also defined considering their weight or volume; however, no practical suggestions for fisheries inspectors are yet available. Consequently, even if such technical measures are reasonable from a theoretical viewpoint, they are hardly suitable as a management tool, due to the difficulties in harbour control. The overall objective of this paper is to provide a quick methodological approach for the gross estimation of passive net length (by net type) on the basis of net volume. The final goal is to support fisheries managers with suitable advice for enforcement and control purposes. The results obtained are important for the management of the fishing effort exerted by small scale fisheries. The methodology developed in this study should be considered as a first attempt to tackle the tangled problem of net length estimation that can be easily applied in other fisheries and areas in order to improve the precision of the models developed herein.
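
    The quick, per-net-type relationship argued for above can be sketched as a simple calibration: measure volume and length for a sample of nets of one type, fit a linear model, and convert a volume measured during harbour inspection into an estimated length. The numbers below are placeholders, not the coefficients derived in the study:

        import numpy as np

        # Calibration sample for one net type (hypothetical measurements)
        volume_dm3 = np.array([12.0, 18.0, 25.0, 33.0, 41.0])        # net volume
        length_m = np.array([310.0, 470.0, 640.0, 850.0, 1050.0])    # measured net length

        slope, intercept = np.polyfit(volume_dm3, length_m, 1)       # least-squares fit

        def estimate_length(volume):
            """Gross estimate of passive net length (m) from measured volume (dm^3)."""
            return slope * volume + intercept

        print(round(estimate_length(30.0)))   # estimated length of a 30 dm^3 net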

  1. Technical measures without enforcement tools: is there any sense? A methodological approach for the estimation of passive net length in small scale fisheries

    Directory of Open Access Journals (Sweden)

    A. LUCCHETTI

    2015-01-01

    Full Text Available Passive nets are currently among the most important fishing gears largely used along the Mediterranean coasts by the small scale fisheries sector. The fishing effort exerted by this sector is strongly correlated with net dimensions. Therefore, the use of passive nets is worldwide managed by defining net length and net drop. The EC Reg. 1967/2006 reports that the length of bottom-set and drifting nets may be also defined considering their weight or volume; however, no practical suggestions for fisheries inspectors are yet available. Consequently, even if such technical measures are reasonable from a theoretical viewpoint, they are hardly suitable as a management tool, due to the difficulties in harbour control. The overall objective of this paper is to provide a quick methodological approach for the gross estimation of passive net length (by net type) on the basis of net volume. The final goal is to support fisheries managers with suitable advice for enforcement and control purposes. The results obtained are important for the management of the fishing effort exerted by small scale fisheries. The methodology developed in this study should be considered as a first attempt to tackle the tangled problem of net length estimation that can be easily applied in other fisheries and areas in order to improve the precision of the models developed herein.

  2. Ion implantation of superhard ceramic cutting tools

    Science.gov (United States)

    Chou, Y. Kevin; Liu, Jie

    2004-08-01

    Despite numerous reports of tool life increase by ion implantation in machining operations, ion implantation applications of cutting tools remain limited, especially for ceramic tools. Mechanisms of tool-life improvement by implantation are not clearly established due to complexity of both implantation and tool-wear processes. In an attempt to improve performance of cubic boron nitride (CBN) tools for hard machining by ion implantation, a literature survey of ion-implanted cutting tools was carried out with a focus on mechanisms of tool-wear reduction by ion implantation. Implantation and machining experiments were then conducted to investigate implantation effects on CBN tools in hard machining. A batch of CBN tools was implanted with nitrogen ions at 150 keV and 2.5×10^17 ions/cm^2 and further used to cut 61 HRc AISI 52100 steel at different conditions. Results show that ion implantation has strong effects on part surface finish, a moderate effect on cutting forces, but an insignificant impact on tool wear. Friction coefficients, estimated from measured cutting forces, are possibly reduced by ion implantation, which may improve surface finish. However, surprisingly, 2-D orthogonal cutting to evaluate tribological loading in hard machining showed no difference in contact stresses and friction coefficients between implanted and nonimplanted CBN tools.
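
    One standard way to estimate friction coefficients from measured cutting forces in orthogonal cutting is Merchant's force analysis, sketched below; whether the authors used exactly this estimator is an assumption, and the force values in the example are placeholders:

        import math

        def friction_coefficient(Fc, Ft, rake_angle_deg):
            """Merchant's orthogonal-cutting estimate of the rake-face friction coefficient.

            Fc : cutting force (N), Ft : thrust force (N), rake_angle_deg : tool rake angle.
            Friction force on the rake face  F = Fc*sin(a) + Ft*cos(a)
            Normal force on the rake face    N = Fc*cos(a) - Ft*sin(a)
            """
            a = math.radians(rake_angle_deg)
            F = Fc * math.sin(a) + Ft * math.cos(a)
            N = Fc * math.cos(a) - Ft * math.sin(a)
            return F / N

        print(round(friction_coefficient(Fc=250.0, Ft=180.0, rake_angle_deg=-7.0), 2))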

  3. GPP/RE Partitioning of Long-term Network Flux Data as a Tool for Estimating Ecosystem-scale Ecophysiological Parameters of Grasslands and Croplands

    Science.gov (United States)

    Gilmanov, T. G.; Wylie, B. K.; Gu, Y.; Howard, D. M.; Zhang, L.

    2013-12-01

    The physiologically based model of canopy CO2 exchange by Thornley and Johnson (2000), modified to incorporate vapor pressure deficit (VPD) limitation of photosynthesis, is a robust tool for partitioning tower network net CO2 exchange data into gross photosynthesis (GPP) and ecosystem respiration (RE) (Gilmanov et al. 2013a, b). In addition to 30-min and daily photosynthesis and respiration values, the procedure generates daily estimates and uncertainties of essential ecosystem-scale parameters such as apparent quantum yield ALPHA, photosynthetic capacity AMAX, convexity of light response THETA, gross ecological light-use efficiency LUE, daytime ecosystem respiration rate RDAY, and nighttime ecosystem respiration rate RNIGHT. These ecosystem-scale parameters are in high demand by the modeling community and open opportunities for comparison with the rich data of leaf-level estimates of corresponding parameters available from physiological studies of previous decades. Based on the data for 70+ site-years of flux tower measurements at the non-forest sites of the Ameriflux network and the non-affiliated sites, we present results of the comparative analysis and multi-site synthesis of the magnitudes, uncertainties, patterns of seasonal and yearly dynamics, and spatiotemporal distribution of these parameters for grasslands and croplands of the conterminous United States (CONUS). Combining this site-level parameter data set with the rich spatiotemporal data sets of a remotely sensed vegetation index, weather and climate conditions, and site biophysical and geophysical features (phenology, photosynthetically active radiation, and soil water holding capacity) using methods of multivariate analysis (e.g., Cubist regression tree) offers new opportunities for predictive modeling and scaling-up of ecosystem-scale parameters of carbon cycling in grassland and agricultural ecosystems of CONUS (Zhang et al. 2011; Gu et al. 2012). REFERENCES Gilmanov TG, Baker JM, Bernacchi CJ
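
    The partitioning described above rests on fitting a light-response model to daytime net CO2 exchange. A simplified sketch of that idea (the VPD limitation term is omitted and the parameter names are assumptions) fits a nonrectangular hyperbola for gross photosynthesis plus a constant daytime respiration term:

        import numpy as np
        from scipy.optimize import curve_fit

        def nee_model(I, alpha, Amax, theta, Rday):
            """Daytime NEE (positive = release) as nonrectangular-hyperbola GPP plus respiration."""
            x = alpha * I + Amax
            gpp = (x - np.sqrt(x**2 - 4.0 * theta * alpha * I * Amax)) / (2.0 * theta)
            return -gpp + Rday

        # Fit to hypothetical half-hourly daytime data (ppfd, nee)
        ppfd = np.linspace(0.0, 2000.0, 40)
        nee = nee_model(ppfd, 0.05, 30.0, 0.8, 4.0) + np.random.normal(0.0, 0.5, ppfd.size)
        popt, _ = curve_fit(nee_model, ppfd, nee, p0=[0.03, 20.0, 0.7, 3.0],
                            bounds=([0, 0, 0.01, 0], [0.2, 80, 0.99, 20]))
        alpha, Amax, theta, Rday = popt   # daily ecosystem-scale parameters as in the abstract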

  4. The VeTOOLS Project: an example of how to strengthen collaboration between scientists and Civil Protections in disaster risk reduction

    Science.gov (United States)

    Marti, Joan; Bartolini, Stefania; Becerril, Laura

    2016-04-01

    VeTOOLS is a project funded by the European Commission's Humanitarian Aid and Civil Protection department (ECHO), and aims at creating an integrated software platform specially designed to assess and manage volcanic risk. The project facilitates interaction and cooperation between scientists and Civil Protection Agencies in order to share, unify, and exchange procedures, methodologies and technologies to effectively reduce the impacts of volcanic disasters. The project aims at 1) improving and developing volcanic risk assessment and management capacities in active volcanic regions; 2) developing universal methodologies, scenario definitions, response strategies and alert protocols to cope with the full range of volcanic threats; 4) improving quantitative methods and tools for vulnerability and risk assessment; and 5) defining thresholds and protocols for civil protection. With these objectives, the VeTOOLS project addresses two of the Sendai Framework resolutions: i) Provide guidance on methodologies and standards for risk assessments, disaster risk modelling and the use of data; ii) Promote and support the availability and application of science and technology to decision-making, and offers a good example of how a close collaboration between science and civil protection is an effective way to contribute to DRR. European Commission ECHO Grant SI2.695524

  5. Analysis of metal artifact reduction tools for dental hardware in CT scans of the oral cavity: kVp, iterative reconstruction, dual-energy CT, metal artifact reduction software: does it make a difference?

    Energy Technology Data Exchange (ETDEWEB)

    Crop, An de; Hoof, Tom van; Herde, Katharina d'; Thierens, Hubert; Bacher, Klaus [Ghent University, Department of Basic Medical Sciences, Gent (Belgium); Casselman, Jan; Vereecke, Elke; Bossu, Nicolas [AZ Sint Jan Bruges Ostend AV, Department of Radiology, Bruges (Belgium); Dierens, Melissa [Ghent University, Dental School, Unit for Oral and Maxillofacial Imaging, Ghent (Belgium); Pamplona, Jaime [Hospital Lisboa Central, Department of Neuroradiology, Lisbon (Portugal)

    2015-08-15

    Metal artifacts may negatively affect radiologic assessment in the oral cavity. The aim of this study was to evaluate different metal artifact reduction techniques for metal artifacts induced by dental hardware in CT scans of the oral cavity. Clinical image quality was assessed using a Thiel-embalmed cadaver. A Catphan phantom and a polymethylmethacrylate (PMMA) phantom were used to evaluate physical-technical image quality parameters such as artifact area, artifact index (AI), and contrast detail (IQF_inv). Metal cylinders were inserted in each phantom to create metal artifacts. CT images of both phantoms and the Thiel-embalmed cadaver were acquired on a multislice CT scanner using 80, 100, 120, and 140 kVp; model-based iterative reconstruction (Veo); and synthesized monochromatic keV images with and without metal artifact reduction software (MARs). Four radiologists assessed the clinical image quality, using an image criteria score (ICS). Significant influence of increasing kVp and the use of Veo was found on clinical image quality (p = 0.007 and p = 0.014, respectively). Application of MARs resulted in a smaller artifact area (p < 0.05). However, MARs reconstructed images resulted in lower ICS. Of all investigated techniques, Veo shows to be most promising, with a significant improvement of both the clinical and physical-technical image quality without adversely affecting contrast detail. MARs reconstruction in CT images of the oral cavity to reduce dental hardware metallic artifacts is not sufficient and may even adversely influence the image quality. (orig.)
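
    The artifact index mentioned among the physical-technical metrics is commonly defined as the excess standard deviation of an artifact-affected region of interest relative to an artifact-free reference region; whether the study used exactly this definition is an assumption. A minimal sketch:

        import numpy as np

        def artifact_index(roi_artifact, roi_reference):
            """AI = sqrt(max(SD_artifact^2 - SD_reference^2, 0)) over two image ROIs."""
            sd_a = np.std(roi_artifact)
            sd_r = np.std(roi_reference)
            return float(np.sqrt(max(sd_a**2 - sd_r**2, 0.0)))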

  6. Estimate of the technological costs of CO2 emission reductions in passenger cars. Emission reduction potentials and their costs; Technikkostenschaetzung fuer die CO2-Emissionsminderung bei Pkw. Emissionsminderungspotenziale und ihre Kosten

    Energy Technology Data Exchange (ETDEWEB)

    Herbener, Reinhard; Jahn, Helge; Wetzel, Frank [Umweltbundesamt, Dessau-Rosslau (Germany). Fachgebiet I 3.2 - Schadstoffminderung und Energieeinsparung im Verkehr

    2008-08-06

    The Federal Environmental Office intended to identify the current fuel consumption reduction potential and the cost of efficiency-enhancing measures on passenger cars. For this purpose, an extensive bibliographic search was carried out, and experts from research institutes and from the automobile supplier industry were asked for their opinion. The results are published in table form. (orig.)

  7. Toward robust deconvolution of pass-through paleomagnetic measurements: new tool to estimate magnetometer sensor response and laser interferometry of sample positioning accuracy

    Science.gov (United States)

    Oda, Hirokuni; Xuan, Chuang; Yamamoto, Yuhji

    2016-07-01

    Pass-through superconducting rock magnetometers (SRM) offer rapid and high-precision remanence measurements for continuous samples that are essential for modern paleomagnetism studies. However, continuous SRM measurements are inevitably smoothed and distorted due to the convolution effect of SRM sensor response. Deconvolution is necessary to restore accurate magnetization from pass-through SRM data, and robust deconvolution requires a reliable estimate of SRM sensor response as well as understanding of uncertainties associated with the SRM measurement system. In this paper, we use the SRM at Kochi Core Center (KCC), Japan, as an example to introduce a new tool and procedure for accurate and efficient estimation of SRM sensor response. To quantify uncertainties associated with the SRM measurement due to track positioning errors and test their effects on deconvolution, we employed laser interferometry for precise monitoring of track positions both with and without placing a u-channel sample on the SRM tray. The acquired KCC SRM sensor response shows significant cross-term of Z-axis magnetization on the X-axis pick-up coil and full widths of ~46-54 mm at half-maximum response for the three pick-up coils, which are significantly narrower than those (~73-80 mm) for the liquid He-free SRM at Oregon State University. Laser interferometry measurements on the KCC SRM tracking system indicate positioning uncertainties of ~0.1-0.2 and ~0.5 mm for tracking with and without a u-channel sample on the tray, respectively. Positioning errors appear to have reproducible components of up to ~0.5 mm, possibly due to patterns or damage on the tray surface or the rope used for the tracking system. Deconvolution of 50,000 simulated measurements with realistic errors introduced based on the position uncertainties indicates that although the SRM tracking system has recognizable positioning uncertainties, they do not significantly debilitate the use of deconvolution to accurately restore high
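
    A minimal sketch of the deconvolution problem the paper addresses: the pass-through measurement is the convolution of the true magnetization with the sensor response, so a regularized division in the Fourier domain recovers an estimate of the magnetization. Dedicated u-channel tools choose the smoothing parameter much more carefully (e.g. by ABIC minimization); this only shows the structure of the calculation:

        import numpy as np

        def deconvolve(measured, response, eps=1e-2):
            """Tikhonov-regularized deconvolution of a pass-through measurement.

            measured : measured moment at evenly spaced track positions
            response : sensor response sampled at the same spacing (normalized to unit sum)
            eps      : regularization strength trading noise amplification for sharpness
            """
            n = len(measured)
            M = np.fft.rfft(measured, n)
            R = np.fft.rfft(response, n)
            X = M * np.conj(R) / (np.abs(R) ** 2 + eps)   # Wiener/Tikhonov-style filter
            return np.fft.irfft(X, n)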

  8. Strategies for poverty reduction

    OpenAIRE

    Øyen, Else

    2003-01-01

    SIU konferanse Solstrand 6.-7. October 2003 Higher education has a value of its own. When linked to the issue of poverty reduction it is necessary to ask another set of questions, including the crucial one of whether higher education in general is the best tool for poverty reduction.

  9. Strategies for poverty reduction

    OpenAIRE

    Øyen, Else

    2003-01-01

    SIU konferanse Solstrand 6.-7. October 2003 Higher education has a value of its own. When linked to the issue of poverty reduction it is necessary to ask another set of questions, including the crucial one of whether higher education in general is the best tool for poverty reduction.

  10. NaBH4 reduction of alkenones to the corresponding alkenols: a useful tool for their characterisation in natural samples

    Energy Technology Data Exchange (ETDEWEB)

    Rontani, J.F.; Marchand, D. [Centre d' Oceanologie de Marseille (France). Lab. d' Oceanographie et de Biogeochimie; Volkman, J.K. [CSIRO Marine Research, Hobart, Tasmania (Australia)

    2001-07-01

    During a search for phytoplanktonic lipid photoproducts in sediment extracts that had been reduced with NaBH4, we observed that the alkenones present had been reduced quantitatively to the corresponding unsaturated alcohols (alkenols). The silylated alkenols display better chromatographic characteristics than the corresponding alkenones and very useful EI mass spectra, which show strong fragment ions at m/z 117 or m/z 131 due to cleavage α to the functional group, allowing methyl and ethyl alkenols (and hence the parent alkenones) to be readily differentiated. Calculations of U^K_37 values using the alkenols formed by reduction agreed within experimental error (±0.01 units) with values calculated from the original alkenones, suggesting that this might be a useful method for samples containing low contents of alkenones or where co-elution is a problem. Although small amounts of alkenols have been found in some sediments their presence does not appear to affect U^K_37 calculations. We have found large amounts of alkenols (up to 20% of the amount of the corresponding alkenone) in extracts of some sediments from Camargue (France), which we suggest are probably formed by a non-selective bacterial reduction of alkenones under anoxic conditions. The possibility of a natural contribution of alkenols from haptophytes was tested and we were able for the first time to detect traces (less than 1% of the amount of the corresponding alkenone) of alkenols in non-reduced extracts of Gephyrocapsa oceanica and Isochrysis galbana. We presume that their biosynthesis is closely related to that of the alkenones. Studies of other haptophytes (particularly those from benthic or coastal environments) are needed to ascertain whether other species might contain higher contents of alkenols. The reduction-silylation technique has also proven to be very useful for the characterisation of previously unreported alkenones in microalgae, sediments and seawater. A
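
    For reference, the alkenone unsaturation index discussed above, in its commonly used simplified (di- and tri-unsaturated) form, is U^K'_37 = [C37:2] / ([C37:2] + [C37:3]); the point made in the abstract is that the same ratio computed from the reduced alkenols agrees with the alkenone-derived value to within about ±0.01. A trivial sketch of the calculation from chromatographic peak areas (the variant of the index and the peak areas are assumptions):

        def uk37_prime(area_c37_2, area_c37_3):
            """Simplified alkenone unsaturation index from C37:2 and C37:3 peak areas
            (the same formula applies to the corresponding alkenols after reduction)."""
            return area_c37_2 / (area_c37_2 + area_c37_3)

        # Hypothetical peak areas: alkenone- and alkenol-derived values should agree to ~0.01
        print(uk37_prime(120.0, 80.0))   # 0.60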

  11. A business intelligence approach using web search tools and online data reduction techniques to examine the value of product-enabled services

    DEFF Research Database (Denmark)

    Tanev, Stoyan; Liotta, Giacomo; Kleismantas, Andrius

    2015-01-01

    in Canada and Europe. It adopts an innovative methodology based on online textual data that could be implemented in advanced business intelligence tools aiming at the facilitation of innovation, marketing and business decision making. Combinations of keywords referring to different aspects of service value...... were designed and used in a web search resulting in the frequency of their use on companies’ websites. Principal component analysis was applied to identify distinctive groups of keyword combinations that were interpreted in terms of specific service value attributes. Finally, the firms were classified...... by means of K-means cluster analysis in order to identify the firms with a high degree of articulation of their service value attributes. The results show that the main service value attributes of the Canadian firms are: better service effectiveness, higher market share, higher service quality...

  12. Identification of abiotic and biotic reductive dechlorination in a chlorinated ethene plume after thermal source remediation by means of isotopic and molecular biology tools

    DEFF Research Database (Denmark)

    Badin, Alice; Broholm, Mette Martina; Jacobsen, Carsten S.

    2016-01-01

    was the predominant chlorinated ethene near the source area prior to thermal treatment. After thermal treatment, cDCE became predominant. The biotic contribution to these changes was supported by the presence of Dehalococcoides sp. DNA (Dhc) and Dhc targeted rRNA close to the source area. In contrast, dual C...... is supported by the relative lack of Dhc in the downgradient part of the plume. The results of this study show that thermal remediation can enhance the biodegradation of chlorinated ethenes, and that this effect can be traced to the mobilisation of DOC due to steam injection. This, in turn, results in more...... with molecular biology tools to evaluate which biogeochemical processes are taking place in an aquifer contaminated with chlorinated ethenes....

  13. Joint Motion Estimation and Layer Segmentation in Transparent Image Sequences—Application to Noise Reduction in X-Ray Image Sequences

    Directory of Open Access Journals (Sweden)

    Jean Liénard

    2009-01-01

    Full Text Available This paper is concerned with the estimation of the motions and the segmentation of the spatial supports of the different layers involved in transparent X-ray image sequences. Classical motion estimation methods fail on sequences involving transparent effects since they do not explicitly model this phenomenon. We propose a method that comprises three main steps: initial block-matching for two-layer transparent motion estimation, motion clustering with 3D Hough transform, and joint transparent layer segmentation and parametric motion estimation. It is validated on synthetic and real clinical X-ray image sequences. Secondly, we derive an original transparent motion compensation method compatible with any spatiotemporal filtering technique. A direct transparent motion compensation method is proposed. To overcome its limitations, a novel hybrid filter is introduced which locally selects which type of motion compensation is to be carried out for optimal denoising. Convincing experiments on synthetic and real clinical images are also reported.

  14. The extended running W-plasty:an additional tool for simultaneous reduction of the hypertrophied labia minora and redundant clitoral hood

    Institute of Scientific and Technical Information of China (English)

    Hamdy A. Elkhatib

    2016-01-01

    Aim: The extended running W-plasty technique using the W-plasty principle is a modification of the conventional technique. This technique was used for simultaneous reduction of the protuberant labia minora and the redundant clitoral hood.Methods: Twenty-three patients presented to the plastic surgery clinic between 2008 and 2015 with complaints of protuberant and enlarged labia minora in conjunction with a hypertrophied clitoral hood. The extended running W-plasty was performed in all patients. Surgery was performed under general anesthesia as an outpatient procedure, with operative times ranging from 30 to 45 min. The Likert scale was used to evaluate outcomes.Results: Patients maintained labial length with decreased scarring. Small hematomas occurred in 2 patients and were treated conservatively. One case of wound dehiscence occurred and was also treated conservatively. Patients returned to normal activity 5-7 days postoperatively. The cosmetic outcome of all patients was very satisfactory.Conclusion: The running W-plasty technique is ideal for closure of secondary defects following excision of both the redundant labia minora and clitoral hood, while maintaining length and providing tensionless scars. The technique conserves the original tissues while avoiding over- or under-resection of the labia.

  15. Planning and clinical studies of a commercial orthopedic metal artifact reduction tool for CT simulations for head-and-neck radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Kon, Hyuck Jun; Ye, Sung Joon [Interdisplinary Program in Radiation Applied Life Science, Seoul National University Graduate School, Seoul (Korea, Republic of); Kim, Jung In; Park, Jong Min; Lee, Jae Gi; Heo, Tae Min [Institute of Radiation Medicine, Seoul National University Medical Research Center, Seoul (Korea, Republic of); Kim Kyung Su [Dept. of Radiation Oncology, Seoul National University College of Medicine, Seoul (Korea, Republic of); Chun, Young Mi [Philips Healthcare Korea, Seoul (Korea, Republic of); Callahan, Zachariah [Program in Biomedical Radiation Sciences, Dept. of Transdisciplinary Studies, Graduate School of Convergence Science and Technology, Seoul National University, Seoul (Korea, Republic of)

    2013-11-15

    In computed tomography (CT) images, the presence of high-Z materials induces typical streak artifacts, called metal artifacts, which can distort the CT Hounsfield numbers in the reconstructed images. This artifact-induced distortion of CT images can affect dose calculations based on the CT images. In radiation therapy of head-and-neck cancer, accurate CT images are important for dose calculation because of the concave-shaped target volumes, the complex anatomy, the many sensitive normal tissues, and the air cavity structures. However, dental implants are common in head-and-neck patients, which makes it difficult to obtain undistorted CT images. Moreover, because dental implants generally lie in the same CT slice as air cavities such as the oral cavity and nasal cavity, they can produce considerable distortion. In this study, we focused on evaluating the distortion of air cavities caused by metal artifacts and the effectiveness of the commercial orthopedic metal artifact reduction function (O-MAR) for the metal artifacts induced by dental implants. The O-MAR algorithm increases the accuracy of CT Hounsfield numbers and reduces noise. Thus, it can contribute to the entire radiation treatment planning process, especially for contouring/segmentation. Although there was no significant difference in dose distributions for most cases, the O-MAR correction was shown to have an impact on high dose regions in air cavities.

  16. Estimated cardiovascular relative risk reduction from fixed-dose combination pill (polypill) treatment in a wide range of patients with a moderate risk of cardiovascular disease

    NARCIS (Netherlands)

    Lafeber, Melvin; Webster, Ruth; Visseren, Frank L J; Bots, Michiel L.; Grobbee, Diederick E.; Spiering, W.; Rodgers, Anthony

    2016-01-01

    Aims Recent data indicate that fixed-dose combination (FDC) pills, polypills, can produce sizeable risk factor reductions. There are very few published data on the consistency of the effects of a polypill in different patient populations. It is unclear for example whether the effects of the polypill

  17. An Estimation of Reduction of the Primary Energy and the CO2 Emission in Residential PEFC Co-Generation System with Li-ion Battery Modules

    Science.gov (United States)

    Maeda, Kazushige; Yonemori, Hideto; Yasaka, Yasuyoshi

    This paper presents the effects of introducing a residential polymer electrolyte fuel cell (PEFC) co-generation system with batteries, in comparison with conventional systems that consist of a gas boiler and electric power from the commercial grid, by means of computer simulation. The PEFC co-generation system in commercial use provides an average primary energy saving rate of 12.7% and a CO2 reduction rate of 15.4% with respect to the conventional system. Addition of 8.0-kWh batteries to the PEFC system results in limited improvements of 0.8 points and 0.9 points in the reduction rates, respectively, yielding 13.5% and 16.3%, when using a conventional operation planning method. A new operation planning method is proposed in order to control the charging and discharging of the batteries precisely. The average primary energy saving rate reaches up to 16.9%, an improvement of 4.2 points, and the CO2 reduction rate reaches up to 20.4%, an improvement of 5.0 points, in the PEFC co-generation system with 8.0-kWh batteries using the new operation planning method. The new method can thus realize a substantial improvement in reduction rates. Furthermore, it is shown that the suitable battery module capacity for the residential PEFC co-generation system is 4.0 kWh.

  18. Estimation of the cost savings resulting from the use of ursodiol for the prevention of gallstones in obese patients undergoing rapid weight reduction.

    Science.gov (United States)

    Shoheiber, O; Biskupiak, J E; Nash, D B

    1997-11-01

    Morbidly obese patients enrolled in a rapid weight reduction program are at a high risk of developing gallstones. Two multicenter, placebo-controlled, randomized, double-blind trials have demonstrated that the prophylactic use of ursodiol in males and females 18 to 70 years of age is effective for the prevention of gallstone formation in this patient population. This study examines the cost consequences associated with the prophylactic use of ursodiol. A medical decision analysis model for the prophylactic administration of ursodiol in morbidly obese patients undergoing rapid weight reduction by either gastric bypass surgery or a very-low-calorie diet was developed through the use of data from two clinical trials and review of the related literature. The expert opinions of clinicians from the fields of internal medicine, gastroenterology, and surgery were solicited. Financial data for the charges associated with cholecystectomies, physician fees and ursodiol were obtained from current financial databases. The model demonstrates that the prophylactic administration of ursodiol, in morbidly obese patients undergoing rapid weight reduction, results in cost savings. Sensitivity analysis was performed to illustrate that the cost savings achieved by the prophylactic use of ursodiol were valid over a realistic range of charges and assumptions. The decision model may allow health care decision makers to apply their own data to the model to determine the cost savings obtainable through the prophylactic use of ursodiol in patients undergoing rapid weight reduction.
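
    The structure of such a decision model can be illustrated with a two-branch expected-cost comparison: with prophylaxis, pay for ursodiol plus cholecystectomies in the (reduced) fraction of patients who still develop symptomatic gallstones; without prophylaxis, pay only for the larger number of cholecystectomies. All probabilities and charges below are placeholders, not the values used in the study:

        def expected_cost(p_gallstone, p_symptomatic, cost_cholecystectomy, drug_cost=0.0):
            """Expected cost per patient of one branch of the decision tree."""
            return drug_cost + p_gallstone * p_symptomatic * cost_cholecystectomy

        # Hypothetical inputs (illustrative only)
        no_prophylaxis = expected_cost(p_gallstone=0.30, p_symptomatic=0.40,
                                       cost_cholecystectomy=12000.0)
        with_ursodiol = expected_cost(p_gallstone=0.03, p_symptomatic=0.40,
                                      cost_cholecystectomy=12000.0, drug_cost=600.0)
        print(f"expected savings per patient: ${no_prophylaxis - with_ursodiol:.0f}")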

  19. Development of an assessment tool to measure students' perceptions of respiratory care education programs: Item generation, item reduction, and preliminary validation

    Directory of Open Access Journals (Sweden)

    Ghazi Alotaibi

    2013-01-01

    Full Text Available Objectives: Students who perceive their learning environment positively are more likely to develop effective learning strategies and adopt a deep learning approach. Currently, there is no validated instrument for measuring the educational environment of respiratory care (RC) educational programs. The aim of this study was to develop an instrument to measure students' perception of the RC educational environment. Materials and Methods: Based on a literature review and an assessment of content validity by multiple focus groups of RC educationalists, potential items of the instrument relevant to the RC educational environment construct were generated by the research group. The initial 71-item questionnaire was then field-tested on all students from the 3 RC programs in Saudi Arabia and was subjected to multi-trait scaling analysis. Cronbach's alpha was used to assess internal consistency reliabilities. Results: Two hundred and twelve students (100%) completed the survey. The initial instrument of 71 items was reduced to 65 across 5 scales. Convergent and discriminant validity assessment demonstrated that the majority of items correlated more highly with their intended scale than with a competing one. Cronbach's alpha exceeded the standard criterion of >0.70 in all scales except one. There was no floor or ceiling effect for any scale or the overall score. Conclusions: This instrument is the first assessment tool developed to measure the RC educational environment. There was evidence of its good feasibility, validity, and reliability. This first validation of the instrument supports its use by RC students to evaluate the educational environment.

  20. Poster — Thur Eve — 11: Validation of the orthopedic metallic artifact reduction tool for CT simulations at the Ottawa Hospital Cancer Centre

    Energy Technology Data Exchange (ETDEWEB)

    Sutherland, J; Foottit, C [The Ottawa Hospital Cancer Centre (Canada)

    2014-08-15

    Metallic implants in patients can produce image artifacts in kilovoltage CT simulation images, which can introduce noise and inaccuracies in CT number, affecting anatomical segmentation and dose distributions. The commercial orthopedic metal artifact reduction algorithm (O-MAR) (Philips Healthcare System) was recently made available on CT simulation scanners at our institution. This study validated the clinical use of O-MAR by investigating its effects on CT number and dose distributions. O-MAR corrected and uncorrected images of a cylindrical solid water phantom containing various plugs (including metal) of known density were acquired with a Philips Brilliance Big Bore CT simulator. CT number accuracy was investigated by determining the mean and standard deviation in regions of interest (ROI) within each plug for uncorrected and O-MAR corrected images and comparing them with the values from an image without metal. Dose distributions were calculated using the Monaco treatment planning system. Seven open fields were equally spaced about the phantom around an ROI near the center of the phantom. These were compared to a "correct" dose distribution calculated by overriding electron densities on a no-metal phantom image to produce an image containing metal but no artifacts. An overall improvement in CT number and dose distribution accuracy was achieved by applying the O-MAR correction. Mean CT numbers and standard deviations were generally improved. Exceptions included lung-equivalent media, which is consistent with vendor-specified contraindications. Dose profiles were found to vary by ±4% between uncorrected and O-MAR corrected images, with O-MAR producing doses closer to the ground truth.

  1. ANALYSIS OF REDUCTION IN COMPLEXITY OF MULTIPLE INPUT- MULTIPLE OUTPUT-ORTHOGONAL FREQUENCY DIVISION MULTIPLEXING SYSTEMS WITH CARRIER FREQUENCY OFFSET ESTIMATION AND CORRECTION

    Directory of Open Access Journals (Sweden)

    Sabitha Gauni

    2014-01-01

    Full Text Available Orthogonal Frequency Division Multiplexing (OFDM) is a promising research area in wireless communication for high data rates. Multiple Input-Multiple Output (MIMO) technology, when incorporated with the OFDM system, promises a significant boost in performance. However, MIMO-OFDM systems are very sensitive to Carrier Frequency Offset (CFO), which deteriorates system performance by giving rise to Inter-Carrier Interference (ICI). A theoretical analysis of the performance of OFDM systems under the combined influence of phase offset and frequency offset over Rayleigh, Weibull and Nakagami fading channels is carried out here using Binary Phase Shift Keying (BPSK) modulation. The increase in Bit Error Rate (BER) caused by the presence of phase offset and frequency offset is evaluated assuming a Gaussian probability density function. Hence the estimation and correction of CFO play a vital role in MIMO-OFDM systems. A method for CFO estimation and correction is analyzed in the MIMO-OFDM system with 16-QAM modulation. In non-pilot-aided systems the CFO acquisition is done using a Maximum Likelihood Estimation (MLE) algorithm. The proposed scheme uses the same block for CFO correction of MIMO-OFDM symbols. Since the same phase calculation block is used for the ML estimation along with the acquisition, the computational cost and the complexity of implementation are reduced.
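
    The record above does not give the estimator's details; as a hedged illustration only, the following sketch shows a common maximum-likelihood style CFO acquisition from a repeated training symbol (the correlation-phase, Moose-type estimator), which is one way non-pilot-aided acquisition is often implemented. Function names and parameters are assumptions for illustration.

```python
import numpy as np

def estimate_cfo(rx, n_fft, sample_rate):
    """Estimate the CFO (Hz) from a preamble made of two identical OFDM symbols.

    The CFO rotates the second repetition relative to the first by a constant
    phase, so the ML estimate is the phase of their correlation (hypothetical
    helper, illustrative only).
    """
    first, second = rx[:n_fft], rx[n_fft:2 * n_fft]
    phase = np.angle(np.sum(second * np.conj(first)))
    return phase / (2.0 * np.pi) * sample_rate / n_fft

def correct_cfo(rx, cfo_hz, sample_rate):
    """Counter-rotate the received samples to remove the estimated CFO."""
    n = np.arange(len(rx))
    return rx * np.exp(-2j * np.pi * cfo_hz * n / sample_rate)
```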

  2. Electron transfer between iron minerals and quinones: estimating the reduction potential of the Fe(II)-goethite surface from AQDS speciation.

    Science.gov (United States)

    Orsetti, Silvia; Laskov, Christine; Haderlein, Stefan B

    2013-12-17

    Redox reactions at iron mineral surfaces play an important role in controlling biogeochemical processes in natural porous media such as sediments, soils and aquifers, especially in the presence of recurrent variations in redox conditions. Ferrous iron associated with iron mineral phases forms highly reactive species and is regarded as a key factor in determining the pathways, rates, and extent of chemically and microbially driven electron transfer processes across the iron mineral-water interface. Due to their transient nature and heterogeneity, a detailed characterization of such surface-bound Fe(II) species in terms of redox potential is still missing. To this end, we used the nonsorbing anthraquinone-2,6-disulfonate (AQDS) as a redox probe and studied the thermodynamics of its redox reactions in heterogeneous iron systems, namely goethite-Fe(II). Our results provide a thermodynamic basis for, and are consistent with, earlier observations on the ability of AQDS to "shuttle" electrons between microbes and iron oxide minerals. On the basis of equilibrium AQDS speciation we report, for the first time, robust reduction potential measurements of reactive iron species present at goethite in aqueous systems (EH,Fe-GT ≈ -170 mV). Due to the high redox buffer intensity of heterogeneous mixed-valent iron systems, this value may be characteristic of many iron-reducing environments in the subsurface at circumneutral pH. Our results corroborate the picture of a dynamic remodelling of Fe(II)/Fe(III) surface sites at goethite in response to oxidation/reduction events. As quinones play an essential role in the electron transport systems of microbes, the proposed method can be considered a biomimetic approach to determine "effective" biogeochemical reduction potentials in heterogeneous iron systems.
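
    As a hedged illustration of how a reduction potential can be read off the equilibrium speciation of the AQDS probe (not the authors' exact computation), the Nernst equation for the two-electron, two-proton AQDS/AH2QDS couple can be evaluated as follows; the standard potential used here is an assumed, illustrative value.

```python
import numpy as np

R, T, F, N_E = 8.314, 298.15, 96485.0, 2  # J/(mol K), K, C/mol, electrons
E0_AQDS = 0.228                           # V vs SHE at pH 0; assumed, illustrative value

def eh_from_aqds_speciation(frac_reduced, pH):
    """Nernst-equation estimate of E_H (V) from the equilibrium fraction of
    reduced probe (AH2QDS) for the 2 e-/2 H+ AQDS couple."""
    frac_ox = 1.0 - frac_reduced
    return (E0_AQDS
            - (R * T / (N_E * F)) * np.log(frac_reduced / frac_ox)
            - (R * T / F) * np.log(10.0) * pH)

# Example: 90 % of the probe reduced at pH 7 gives roughly -0.21 V
print(eh_from_aqds_speciation(0.90, 7.0))
```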

  3. EFSA Panel on Biological Hazards (BIOHAZ); Scientific Opinion on an estimation of the public health impact of setting a new target for the reduction of Salmonella in turkeys

    DEFF Research Database (Denmark)

    Hald, Tine

    The quantitative contribution of turkeys and other major animal-food sources to the burden of human salmonellosis in the European Union was estimated. A ‘Turkey Target Salmonella Attribution Model’ (TT-SAM) based on the microbial-subtyping approach was used. TT-SAM includes data from 25 EU Member States, four animal-food sources of Salmonella and 23 Salmonella serovars. The model employs 2010 EU statutory monitoring data on Salmonella in animal populations (EU baseline survey data for pigs), data on reported cases of human salmonellosis and food availability data. It estimates that 2.6%, 10.6%, 17.0% and 56.8% of the human salmonellosis cases are attributable to turkeys, broilers, laying hens (eggs) and pigs, respectively. The top-6 serovars of fattening turkeys that contribute most to human cases are S. Enteritidis, S. Kentucky, S. Typhimurium, S. Newport, S. Virchow and S. Saintpaul...

  4. Comparative estimation of soil and plant pollution in the impact area of air emissions from an aluminium plant after technogenic load reduction.

    Science.gov (United States)

    Evdokimova, Galina A; Mozgova, Natalya P

    2015-01-01

    The work provides a comparative analysis of changes in soil properties over the last 10-13 years along the pollution gradient of air emissions from the Kandalaksha aluminium plant, in connection with the reduction of emission volumes. The content of the priority pollutant fluorine (F) in atmospheric precipitation and in the organic soil horizon in the plant impact zone decreased significantly in 2011-2013 compared to 2001. Aluminium concentrations decreased only in the immediate proximity of the plant (2 km). The fluorine, calcium (Ca) and magnesium (Mg) concentrations are higher in the liquid phase than in the solid phase, so these elements can migrate to greater distances from the pollution source (up to 15-20 km). Silicon (Si), aluminium (Al), iron (Fe) and phosphorus (P) are found only in solid phases and in fall-out within 5 km. The acidity of the soil litter decreased by 2 pH units within 2 km of the plant. The zone of maximum soil contamination shrank from 2.5 km to 1.5 km from the emission source, and the zones of heavy and moderate pollution contracted by 5 km, in connection with the reduction of pollutant emissions from the plant. A high correlation between fluorine concentrations in vegetation and litter was found: higher fluorine concentrations in the soil result in its accumulation in plants. Mosses accumulate fluorine most intensively.

  5. Estimation of black carbon deposition from particulate data in the atmosphere at NCO-P site in Himalayas during pre-monsoon season and its implication to snow surface albedo reduction

    Science.gov (United States)

    Yasunari, T. J.; Bonasoni, P.; Laj, P.; Fujita, K.; Vuillermoz, E.; Marinoni, A.; Cristofanelli, P.; Calzolari, F.; Duchi, R.; Tartari, G.; Lau, W. K.

    2009-12-01

    The deposition of black carbon (BC) on snow surfaces may contribute to snow melting and the acceleration of glacier retreat. The amount of BC deposited onto the snow surface during the 2006 pre-monsoon season (March-May) was estimated from the equivalent BC (eqBC) concentration observed with a MAAP and from aerosol size distribution observations (SMPS and OPC) in the atmosphere at the Nepal Climate Observatory at Pyramid (NCO-P) site in the Himalayan region. We first carried out correlation analyses between the eqBC time series and the aerosol size distribution, and defined the main eqBC size range as the bins with correlation coefficients above 0.8; at NCO-P this corresponds predominantly to the 103.1-669.8 nm size range. The terminal velocity of each particle size bin was used to calculate the deposition flux of BC onto the surface. Our estimate of the deposition should be regarded as a lower bound, because deposition velocities are generally larger once aerodynamic and other terms are included, and because deposition processes other than gravitational settling were not taken into account. We estimated a BC deposition of 209 µg m-2 for March-May. Using surface-snow density variations of 192-512 kg m-3, as measured at Yala glacier in the Himalayas, BC concentrations of 20.4-53.6 µg kg-1 in the top 2 cm of snow are estimated. This leads to a snow albedo reduction of 1.6-4.1%, based on regression relationships between BC concentration in snow and snow albedo reduction from previous studies. If these albedo reductions are used as a continuous forcing in a sensitivity test of glacier melting with a mass-balance model with the same initial settings as a previous study (for the Dongkemadi Glaciers in the Tibetan region), an increase in total melt water runoff of 54-149 mm w.e. is expected. We are aware of the limitations of this preliminary estimate, but it clearly indicates that BC deposition during March
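
    A minimal sketch of the gravitational-only deposition estimate described above: each size bin's concentration is multiplied by a Stokes terminal velocity and accumulated over the season. The particle density, air viscosity and bin values are illustrative assumptions, not the study's inputs, and the slip correction relevant for submicron particles is neglected.

```python
import numpy as np

def stokes_settling_velocity(d_m, rho_p=1800.0, rho_air=0.8, mu=1.6e-5, g=9.81):
    """Stokes terminal velocity (m/s) for a sphere of diameter d_m (m);
    slip correction neglected, property values illustrative."""
    return (rho_p - rho_air) * g * d_m**2 / (18.0 * mu)

def gravitational_deposition(conc_ug_m3, diam_m, seconds):
    """Deposition (ug m-2) from binned concentrations, gravity only."""
    v = np.array([stokes_settling_velocity(d) for d in diam_m])
    return float(np.sum(np.asarray(conc_ug_m3) * v) * seconds)

# Three illustrative size bins accumulated over March-May (~92 days)
print(gravitational_deposition([0.05, 0.10, 0.03], [150e-9, 300e-9, 600e-9], 92 * 86400))
```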

  6. Integrating soil water and tracer balances, numerical modelling and GIS tools to estimate regional groundwater recharge: Application to the Alcadozo Aquifer System (SE Spain).

    Science.gov (United States)

    Hornero, Jorge; Manzano, Marisol; Ortega, Lucía; Custodio, Emilio

    2016-10-15

    Groundwater recharge is one of the key variables for aquifer management and also one of the most difficult to evaluate with acceptable accuracy. This is especially relevant in semiarid areas, where the processes involved in recharge are widely variable. Uncertainty should be estimated in order to know how reliable recharge estimates are. Groundwater recharge has been calculated for the Alcadozo Aquifer System, under steady-state conditions, at regional (aquifer) and sub-regional (spring catchment) scales, applying different methods. The regional distribution of long-term average recharge values has been estimated with the chloride mass balance method using data from four rain stations and 40 groundwater samples covering almost the whole aquifer surface. A remarkable spatial variability has been found. Average annual recharge rates range from 20 to 243 mm year-1 across the aquifer, with an estimated coefficient of variation between 0.16 and 0.38. The average recharge/precipitation ratio decreases from 34% in the NW to 6% in the SE, following the topographic slope. At spring-catchment scale, recharge has been estimated by modelling the soil water balance with the code Visual Balan 2.0. The results, calibrated with discharge data from the two main springs, Liétor and Ayna, are 35.5 and 50 mm year-1, respectively, with estimated coefficients of variation of 0.49 and 0.36. A sensitivity analysis showed that soil parameters have the largest influence on the uncertainty of the recharge estimates. Recharge values estimated with both methods and at the two temporal and spatial scales are consistent, considering the regional variability obtained with the chloride method and the respective confidence intervals. Evaluating the uncertainties of each method made it easier to compare their results and check their agreement, which provides confidence in the values obtained. Thus, the use of independent methods together with their uncertainties is strongly recommended to constrain the magnitude and to
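
    A minimal sketch of the chloride mass balance used for the regional recharge estimates, assuming steady state and negligible chloride export by runoff; the numbers in the example are illustrative, not the Alcadozo data.

```python
def cmb_recharge(precip_mm, cl_precip_mg_l, cl_groundwater_mg_l):
    """Long-term average recharge (mm/yr) from the chloride mass balance,
    assuming steady state and that all chloride input reaches the aquifer."""
    return precip_mm * cl_precip_mg_l / cl_groundwater_mg_l

# Illustrative numbers only: 500 mm/yr of rain at 1.5 mg/L chloride and
# groundwater at 15 mg/L imply roughly 50 mm/yr of recharge.
print(cmb_recharge(500.0, 1.5, 15.0))
```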

  7. Using learning curves on energy-efficient technologies to estimate future energy savings and emission reduction potentials in the U.S. iron and steel industry

    Energy Technology Data Exchange (ETDEWEB)

    Karali, Nihan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Park, Won Young [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McNeil, Michael A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-06-18

    Increasing concerns about non-sustainable energy use and climate change spur a growing research interest in energy efficiency potentials in critical areas such as industrial production. This paper focuses on learning curve aspects of energy efficiency measures in the U.S. iron and steel sector. A number of early-stage efficient technologies (i.e., emerging or demonstration technologies) are technically feasible and have the potential to make a significant contribution to energy saving and CO2 emissions reduction, but currently fall short economically. However, they may also have significant potential for cost reduction and/or performance improvement in the future under learning effects such as 'learning-by-doing'. The investigation is carried out using ISEEM, a technology-oriented, linear optimization model. We investigated how steel demand is balanced with and without learning curves, compared to a Reference scenario. The retrofit (or, in some cases, investment) costs of energy efficient technologies decline in the scenario where the learning curve is applied. The analysis also addresses market penetration of energy efficient technologies, energy saving, and CO2 emissions in the U.S. iron and steel sector with and without learning effects. Accordingly, the study helps those who use energy models to better manage the price barriers hindering the diffusion of energy-efficiency technologies, better understand the market and learning system involved, predict future achievable learning rates more accurately, and project future savings from energy-efficiency technologies in the presence of learning. We conclude from our analysis that most of the existing energy efficiency technologies currently used in the U.S. iron and steel sector are cost effective. Penetration levels increase through the years, even though there is no price reduction. However, demonstration technologies are not economically
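
    As a hedged sketch of the 'learning-by-doing' effect referred to above (not the ISEEM formulation), a one-factor learning curve lets unit cost fall as a power law of cumulative deployment; the learning rate per doubling sets the exponent.

```python
import numpy as np

def learned_cost(c0, cum_capacity, cum_capacity0, learning_rate):
    """One-factor learning curve: unit cost after cumulative deployment grows
    from cum_capacity0 to cum_capacity, given a learning rate per doubling."""
    b = -np.log2(1.0 - learning_rate)  # power-law exponent
    return c0 * (cum_capacity / cum_capacity0) ** (-b)

# Illustrative: a 15 % learning rate over three doublings cuts cost to ~61 %
print(learned_cost(100.0, 8.0, 1.0, 0.15))
```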

  8. Mark-resight approach as a tool to estimate population size of one of the world’s smallest goose populations

    DEFF Research Database (Denmark)

    Clausen, Kevin Kuhlmann; Fælled, Casper Cæsar; Clausen, Preben

    2013-01-01

    The present study investigates the use of a mark–resight procedure to estimate total population size in a local goose population. Using colour-ring sightings of the increasingly scattered population of Light-bellied Brent Geese Branta bernicla hrota from their Danish staging areas, we estimate a total population size of 7845 birds (95% CI: 7252–8438). This is in good agreement with numbers obtained from total counts, emphasizing that this population, although steadily increasing, is still small compared with historic numbers.
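
    The record does not specify the mark-resight model used; as an illustrative stand-in, the Chapman-corrected Lincoln-Petersen estimator below shows the basic logic of estimating total size from the fraction of marked (colour-ringed) birds among those resighted. The input numbers are hypothetical.

```python
import math

def chapman_estimate(marked, resight_sample, marked_in_sample):
    """Chapman's bias-corrected Lincoln-Petersen estimate of population size,
    with an approximate 95 % confidence interval (normal approximation)."""
    n_hat = (marked + 1) * (resight_sample + 1) / (marked_in_sample + 1) - 1
    var = ((marked + 1) * (resight_sample + 1)
           * (marked - marked_in_sample) * (resight_sample - marked_in_sample)
           / ((marked_in_sample + 1) ** 2 * (marked_in_sample + 2)))
    half = 1.96 * math.sqrt(var)
    return n_hat, (n_hat - half, n_hat + half)

# Hypothetical numbers, not the Light-bellied Brent Goose data
print(chapman_estimate(marked=200, resight_sample=900, marked_in_sample=23))
```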

  9. Integral Criticality Estimators in MCATK

    Energy Technology Data Exchange (ETDEWEB)

    Nolen, Steven Douglas [Los Alamos National Laboratory; Adams, Terry R. [Los Alamos National Laboratory; Sweezy, Jeremy Ed [Los Alamos National Laboratory

    2016-06-14

    The Monte Carlo Application ToolKit (MCATK) is a component-based software toolset for delivering customized particle transport solutions using the Monte Carlo method. Currently under development in the XCP Monte Carlo group at Los Alamos National Laboratory, the toolkit has the ability to estimate the keff and α eigenvalues for static geometries. This paper presents a description of the estimators and variance reduction techniques available in the toolkit and includes a preview of those slated for future releases. Along with the description of the underlying algorithms is a description of the available user inputs for controlling the iterations. The paper concludes with a comparison of the MCATK results with those provided by analytic solutions. The results match within expected statistical uncertainties and demonstrate MCATK's usefulness in estimating these important quantities.

  10. Estimating the exposure of small mammals at three sites within the Chernobyl exclusion zone--a test application of the ERICA Tool.

    Science.gov (United States)

    Beresford, N A; Gaschak, S; Barnett, C L; Howard, B J; Chizhevsky, I; Strømman, G; Oughton, D H; Wright, S M; Maksimenko, A; Copplestone, D

    2008-09-01

    An essential step in the development of any modelling tool is the validation of its predictions. This paper describes a study conducted within the Chernobyl exclusion zone to acquire data to conduct an independent test of the predictions of the ERICA Tool, which is designed for use in assessments of radiological risk to the environment. Small mammals were repeatedly trapped at three woodland sites between early July and mid-August 2005. Thermoluminescent dosimeters (TLDs) mounted on collars were fitted to Apodemus flavicollis, Clethrionomys glareolus and Microtus spp. to provide measurements of external dose rate. A total of 85 TLDs were recovered. All animals from which TLDs were recovered were live-monitored to determine 90Sr and 137Cs whole-body activity concentrations. A limited number of animals were also analysed to determine 239,240Pu activity concentrations. Measurements of whole-body activity concentrations and dose rates recorded by the TLDs were compared to predictions of the ERICA Tool. The predicted 90Sr and 137Cs mean activity concentrations were within an order of magnitude of the observed data means. Whilst there was some variation between sites in the agreement between measurements and predictions, this was consistent with what would be expected from the differences in soil types at the sites. Given the uncertainties of conducting a study such as this, the agreement observed between the TLD results and the predicted external dose rates gives confidence in the predictions of the ERICA Tool.

  11. Planck 2016 intermediate results. XLVI. Reduction of large-scale systematic effects in HFI polarization maps and estimation of the reionization optical depth

    CERN Document Server

    Aghanim, N; Aumont, J; Baccigalupi, C; Ballardini, M; Banday, A J; Barreiro, R B; Bartolo, N; Basak, S; Battye, R; Benabed, K; Bernard, J -P; Bersanelli, M; Bielewicz, P; Bock, J J; Bonaldi, A; Bonavera, L; Bond, J R; Borrill, J; Bouchet, F R; Boulanger, F; Bucher, M; Burigana, C; Butler, R C; Calabrese, E; Cardoso, J -F; Carron, J; Challinor, A; Chiang, H C; Colombo, L P L; Combet, C; Comis, B; Coulais, A; Crill, B P; Curto, A; Cuttaia, F; Davis, R J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Delouis, J -M; Di Valentino, E; Dickinson, C; Diego, J M; Doré, O; Douspis, M; Ducout, A; Dupac, X; Efstathiou, G; Elsner, F; Enßlin, T A; Eriksen, H K; Falgarone, E; Fantaye, Y; Finelli, F; Forastieri, F; Frailis, M; Fraisse, A A; Franceschi, E; Frolov, A; Galeotta, S; Galli, S; Ganga, K; Génova-Santos, R T; Gerbino, M; Ghosh, T; González-Nuevo, J; Górski, K M; Gratton, S; Gruppuso, A; Gudmundsson, J E; Hansen, F K; Helou, G; Henrot-Versillé, S; Herranz, D; Hivon, E; Huang, Z; Ilic, S; Jaffe, A H; Jones, W C; Keihänen, E; Keskitalo, R; Kisner, T S; Knox, L; Krachmalnicoff, N; Kunz, M; Kurki-Suonio, H; Lagache, G; Lamarre, J -M; Langer, M; Lasenby, A; Lattanzi, M; Lawrence, C R; Jeune, M Le; Leahy, J P; Levrier, F; Liguori, M; Lilje, P B; López-Caniego, M; Ma, Y -Z; Macías-Pérez, J F; Maggio, G; Mangilli, A; Maris, M; Martin, P G; Martínez-González, E; Matarrese, S; Mauri, N; McEwen, J D; Meinhold, P R; Melchiorri, A; Mennella, A; Migliaccio, M; Miville-Deschênes, M -A; Molinari, D; Moneti, A; Montier, L; Morgante, G; Moss, A; Mottet, S; Naselsky, P; Natoli, P; Oxborrow, C A; Pagano, L; Paoletti, D; Partridge, B; Patanchon, G; Patrizii, L; Perdereau, O; Perotto, L; Pettorino, V; Piacentini, F; Plaszczynski, S; Polastri, L; Polenta, G; Puget, J -L; Rachen, J P; Racine, B; Reinecke, M; Remazeilles, M; Renzi, A; Rocha, G; Rossetti, M; Roudier, G; Rubiño-Martín, J A; Ruiz-Granados, B; Salvati, L; Sandri, M; Savelainen, M; Scott, D; Sirri, G; Sunyaev, R; Suur-Uski, A -S; Tauber, J A; Tenti, M; Toffolatti, L; Tomasi, M; Tristram, M; Trombetti, T; Valiviita, J; Van Tent, F; Vibert, L; Vielva, P; Villa, F; Vittorio, N; Wandelt, B D; Watson, R; Wehus, I K; White, M; Zacchei, A; Zonca, A

    2016-01-01

    This paper describes the identification, modelling, and removal of previously unexplained systematic effects in the polarization data of the Planck High Frequency Instrument (HFI) on large angular scales, including new mapmaking and calibration procedures, new and more complete end-to-end simulations, and a set of robust internal consistency checks on the resulting maps. These maps, at 100, 143, 217, and 353 GHz, are early versions of those that will be released in final form later in 2016. The improvements allow us to determine the cosmic reionization optical depth $\\tau$ using, for the first time, the low-multipole $EE$ data from HFI, reducing significantly the central value and uncertainty, and hence the upper limit. Two different likelihood procedures are used to constrain $\\tau$ from two estimators of the CMB $E$- and $B$-mode angular power spectra at 100 and 143 GHz, after debiasing the spectra from a small remaining systematic contamination. These all give fully consistent results. A further consistenc...

  12. Semantic Mediation Tool for Risk Reduction Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project focuses on providing an infrastructure to aid the building of ontologies from existing NASA applications, in a manner that leads to long-term risk...

  13. Tanglegrams: a Reduction Tool for Mathematical Phylogenetics.

    Science.gov (United States)

    Matsen, Frederick; Billey, Sara; Kas, Arnold; Konvalinka, Matjaz

    2016-10-03

    Many discrete mathematics problems in phylogenetics are defined in terms of the relative labeling of pairs of leaf-labeled trees. These relative labelings are naturally formalized as tanglegrams, which have previously been an object of study in coevolutionary analysis. Although there has been considerable work on planar drawings of tanglegrams, they have not been fully explored as combinatorial objects until recently. In this paper, we describe how many discrete mathematical questions on trees "factor" through a problem on tanglegrams, and how understanding that factoring can simplify analysis. Depending on the problem, it may be useful to consider an unordered version of tanglegrams, and/or their unrooted counterparts. For all of these definitions, we show how the isomorphism types of tanglegrams can be understood in terms of double cosets of the symmetric group, and we investigate their automorphisms. Understanding tanglegrams better will isolate the distinct problems on leaf-labeled pairs of trees and reveal natural symmetries of spaces associated with such problems.

  14. Estimating the Public Health Impact of Setting Targets at the European Level for the Reduction of Zoonotic Salmonella in Certain Poultry Populations

    Directory of Open Access Journals (Sweden)

    Marta Hugas

    2013-10-01

    Full Text Available In the European Union (EU), targets are being set for the reduction of certain zoonotic Salmonella serovars in different animal populations, including poultry populations, within the framework of Regulation (EC) No. 2160/2003 on the control of zoonoses. For a three-year transitional period, the EU targets were to cover only Salmonella Enteritidis and S. Typhimurium (and, in addition, S. Hadar, S. Infantis and S. Virchow for breeding flocks of Gallus gallus). Before the end of that transitional period, the revision of the EU targets was to be considered, including the potential addition of other serovars of public health significance to the permanent EU targets. This review article aims to provide an overview of the assessments carried out by the Scientific Panel on Biological Hazards of the European Food Safety Authority in the field of setting targets for Salmonella in poultry populations (breeding flocks of Gallus gallus, laying flocks of Gallus gallus, broiler flocks of Gallus gallus, and flocks of breeding and fattening turkeys) and their impact on subsequent changes in EU legislation.

  15. Nonlinear complex diffusion approaches based on a novel noise estimation for noise reduction in phase-resolved optical coherence tomography (Conference Presentation)

    Science.gov (United States)

    Xia, Shaoyan; Huang, Yong; Tan, Xiaodi

    2016-03-01

    Partial differential equation (PDE)-based nonlinear diffusion processes have been widely used for image denoising. In traditional nonlinear anisotropic diffusion denoising techniques, the behavior of the diffusion depends heavily on the gradient of the image. However, these methods perform poorly when used to reduce noise in optical coherence tomography (OCT) images: because the background has gradients very similar to those of regions of interest, background noise is mistaken for edge information and cannot be reduced. Therefore, a nonlinear complex diffusion approach using texture features (NCDTF) is proposed here for noise reduction in phase-resolved OCT; it uses texture features in OCT images together with structural OCT images to remove noise from phase-resolved OCT. Taking into account the fact that the texture of the background differs from that of the signal region, which can be linked to the diffusion coefficient of the nonlinear complex diffusion model, we first use the NCDTF method to reduce noise in the structural and phase images. Then, we use the OCT structural images to filter the phase images. Finally, to validate our method, parameters such as image SNR, contrast-to-noise ratio (CNR), equivalent number of looks (ENL), and edge preservation were compared between our approach and median, Gaussian, wavelet, and nonlinear complex diffusion (NCDF) filters. Preliminary results demonstrate that the NCDTF method is more effective than the others at preserving edges and denoising phase-resolved OCT.
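
    As a hedged, real-valued sketch of the iteration structure behind such diffusion filters (the paper's complex-diffusion, texture-weighted coefficient is more elaborate), one explicit Perona-Malik step looks like this:

```python
import numpy as np

def diffusion_step(img, kappa=0.05, dt=0.2):
    """One explicit Perona-Malik step: diffuse strongly in flat regions and
    weakly across large gradients (edges)."""
    d_n = np.roll(img, -1, axis=0) - img
    d_s = np.roll(img, 1, axis=0) - img
    d_e = np.roll(img, -1, axis=1) - img
    d_w = np.roll(img, 1, axis=1) - img
    c = lambda d: np.exp(-(d / kappa) ** 2)  # edge-stopping coefficient
    return img + dt * (c(d_n) * d_n + c(d_s) * d_s + c(d_e) * d_e + c(d_w) * d_w)

# Run a few iterations on a synthetic noisy image
noisy = np.random.rand(64, 64)
smoothed = noisy
for _ in range(20):
    smoothed = diffusion_step(smoothed)
```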

  16. Estimating travel reduction associated with the use of telemedicine by patients and healthcare professionals: proposal for quantitative synthesis in a systematic review

    Directory of Open Access Journals (Sweden)

    Bahaadinbeigy Kambiz

    2011-08-01

    Full Text Available Abstract Background A major benefit offered by telemedicine is the avoidance of travel by patients, their carers and health care professionals. Unfortunately, there is very little published information about the extent of avoided travel. We propose to undertake a systematic review of literature which reports credible data on the reductions in travel associated with the use of telemedicine. Method The conventional approach to quantitative synthesis of the results from multiple studies is to conduct a meta-analysis. However, too much heterogeneity exists between available studies to allow a meaningful meta-analysis of the avoided travel when telemedicine is used across all possible settings. We propose instead to consider all credible evidence on avoided travel through telemedicine by fitting a linear model which takes into account the relevant factors in the circumstances of the studies performed. We propose the use of stepwise multiple regression to identify which factors are significant. Discussion Our proposed approach is illustrated by the example of teledermatology. In a preliminary review of the literature we found 20 studies in which the percentage of avoided travel through telemedicine could be inferred (a total of 5199 patients). The mean percentage of avoided travel reported in the 12 store-and-forward studies was 43%. In the 7 real-time studies and in a single study with a hybrid technique, 70% of the patients avoided travel. A simplified model based on the modality of telemedicine employed (i.e. real-time or store-and-forward) explained 29% of the variance. The use of store-and-forward teledermatology alone was associated with 43% avoided travel. The increase in the proportion of patients who avoided travel (25%) when real-time telemedicine was employed was significant (P = 0.014). Service planners can use this information to weigh up the costs and benefits of the two approaches.

  17. Use of Consumer Acceptability as a Tool to Determine the Level of Sodium Reduction: A Case Study on Beef Soup Substituted With Potassium Chloride and Soy-Sauce Odor.

    Science.gov (United States)

    Lee, Cho Long; Lee, Soh Min; Kim, Kwang-Ok

    2015-11-01

    In this study, consumer acceptability was considered as a tool for reducing sodium rather than just as a final examination of the success of the substitution. The study consisted of 4 experimental steps. First, by gradually reducing the concentration of NaCl, the consumer rejection threshold (CRT) of NaCl in beef soup was determined. Then, the amount of KCl that can increase preference was examined in 2 low-sodium beef soups, with sodium concentrations slightly above or below the CRT. The relative saltiness of various KCl and NaCl/KCl mixtures was also measured. Finally, consumers evaluated acceptability and intensities of sensory characteristics for 9 beef soup samples that differed with respect to NaCl content and/or KCl content, with or without addition of a salty-congruent odor (soy-sauce odor). The results showed that in the "above CRT" system, the consumer acceptability and sensory profile of the low-sodium beef soup substituted using KCl were similar to those of the control, although saltiness was not fully recovered, whereas in the "below CRT" system, consumer acceptability was not recovered using KCl alone as a substitute. The potential of using a salty-congruent odor as a final touch to induce salty taste was observed; however, the results point to the importance of the odor having almost no artificialness and of its harmony with the final product when using it as a strategy to substitute sodium. Overall, the results of the study imply the importance of considering consumer acceptability when approaching sodium reduction, to better understand the potential of sodium substitutes and salty-congruent odor. Strategies attempting to reduce sodium content in food have mainly substituted sodium to the level that provides equivalent salty taste and then examined consumer liking. However, these approaches may result in failure of consumer appeal. This study attempted to consider consumer acceptability as a tool for reducing sodium in beef soup substituted using

  18. Cubic exact solutions for the estimation of pairwise haplotype frequencies: implications for linkage disequilibrium analyses and a web tool 'CubeX'

    Directory of Open Access Journals (Sweden)

    Day Ian NM

    2007-11-01

    Full Text Available Abstract Background The frequency of a haplotype comprising one allele at each of two loci can be expressed as a cubic equation (the 'Hill equation'), the solution of which gives that frequency. Most haplotype and linkage disequilibrium analysis programs use iteration-based algorithms which substitute an estimate of haplotype frequency into the equation, producing a new estimate which is repeatedly fed back into the equation until the values converge to a maximum likelihood estimate (expectation-maximisation). Results We present a program, "CubeX", which calculates the biologically possible exact solution(s) and provides estimated haplotype frequencies, D', r2 and χ2 values for each. CubeX provides a "complete" analysis of haplotype frequencies and linkage disequilibrium for a pair of biallelic markers under situations where sampling variation and genotyping errors distort sample Hardy-Weinberg equilibrium, potentially causing more than one biologically possible solution. We also present an analysis of simulations and real data using the algebraically exact solution, which indicates that under perfect sample Hardy-Weinberg equilibrium there is only one biologically possible solution, but that under other conditions there may be more. Conclusion Our analyses demonstrate that lower allele frequencies, lower sample numbers, population stratification and a possible |D'| value of 1 are particularly susceptible to distortion of sample Hardy-Weinberg equilibrium, which has significant implications for the calculation of linkage disequilibrium in small sample sizes (e.g. HapMap) and for rarer alleles (e.g. paucimorphisms, q
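
    A hedged sketch of the 'exact solution' idea: instead of EM iteration, the Hill cubic can be solved directly and the roots screened for biological plausibility. The cubic coefficients are assumed to have been derived from the genotype counts already (their derivation is not reproduced here), and the helper names are illustrative.

```python
import numpy as np

def admissible_roots(cubic_coeffs, p_a, p_b):
    """Real roots of the Hill cubic that are biologically possible haplotype
    frequencies, i.e. within [max(0, p_a + p_b - 1), min(p_a, p_b)].
    The coefficients are assumed to have been derived from genotype counts."""
    lo, hi = max(0.0, p_a + p_b - 1.0), min(p_a, p_b)
    roots = np.roots(cubic_coeffs)
    real = roots[np.abs(roots.imag) < 1e-9].real
    return [r for r in real if lo - 1e-9 <= r <= hi + 1e-9]

def linkage_stats(f_ab, p_a, p_b):
    """D' and r2 for a candidate haplotype frequency f_ab."""
    d = f_ab - p_a * p_b
    d_max = (min(p_a * (1 - p_b), (1 - p_a) * p_b) if d > 0
             else min(p_a * p_b, (1 - p_a) * (1 - p_b)))
    r2 = d ** 2 / (p_a * (1 - p_a) * p_b * (1 - p_b))
    return d / d_max, r2
```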

  19. Poverty Reduction

    OpenAIRE

    Ortiz, Isabel

    2007-01-01

    The paper reviews poverty trends and measurements, poverty reduction in historical perspective, the poverty-inequality-growth debate, national poverty reduction strategies, criticisms of the agenda and the need for redistribution, international policies for poverty reduction, and ultimately understanding poverty at a global scale. It belongs to a series of backgrounders developed at Joseph Stiglitz's Initiative for Policy Dialogue.

  20. The Demographic Assessment for Health Literacy (DAHL): a new tool for estimating associations between health literacy and outcomes in national surveys.

    Science.gov (United States)

    Hanchate, Amresh D; Ash, Arlene S; Gazmararian, Julie A; Wolf, Michael S; Paasche-Orlow, Michael K

    2008-10-01

    To impute limited health literacy from commonly measured socio-demographic data and to compare it to the Short-Test of Functional Health Literacy in Adults (S-TOFHLA) for estimating the influence of limited health literacy on health status in the elderly. The Prudential Medicare Study assesses the S-TOFHLA score, leading to a "reference standard" classification of 25% of people with inadequate literacy; the National Health Interview Survey has no such assessment. We estimated a regression of S-TOFHLA on sex, age, years of schooling, and race/ethnicity in The Prudential Medicare Study data to derive a Demographic Assessment for Health Literacy (DAHL) score, and imputed inadequate literacy to the 25% with the lowest DAHL scores. Using regression, we then examined associations between several health status measures (including hypertension, diabetes, physical and mental SF-12) and inadequate literacy (imputed or test-based). Estimates of association using imputed inadequate literacy closely approximate those obtained using S-TOFHLA-based inadequate literacy for most outcomes examined. As few population surveys measure health literacy, the DAHL, a readily calculated health literacy proxy score, may be useful for expanding the scope of health literacy research in national survey data.

  1. Application of the PSI-NUSS Tool for the Estimation of Nuclear Data Related keff Uncertainties for the OECD/NEA WPNCS UACSA Phase I Benchmark

    Science.gov (United States)

    Zhu, T.; Vasiliev, A.; Ferroukhi, H.; Pautz, A.

    2014-04-01

    At the Paul Scherrer Institute (PSI), a methodology titled PSI-NUSS is under development for the propagation of nuclear data uncertainties into Criticality Safety Evaluation (CSE) with the Monte Carlo code MCNPX. The primary purpose is to provide a complementary option for the uncertainty assessment related to nuclear data, versus the traditional approach which relies on estimating biases/uncertainties based on validation studies against representative critical benchmark experiments. In the present paper, the PSI-NUSS methodology is applied to quantify nuclear data uncertainties for the OECD/NEA UACSA Exercise Phase I benchmark. One underlying reason is that PSI's CSE methodology developed so far, and previously applied to this benchmark, was based on a more conventional approach involving engineering estimates of the uncertainties in the calculated effective multiplication factor (keff). Therefore, as the PSI-NUSS methodology aims precisely at integrating a more rigorous treatment of this specific type of nuclear data uncertainty into CSE, it is applied here to the UACSA benchmark: the nuclear-data-related uncertainty component is estimated and compared to results obtained by other participants using different codes, libraries and methodologies.

  2. An estimation of stress intensity factor in a clamped SE(T) specimen through numerical simulation and experimental verification: case of FCGR of AISI H11 tool steel

    Institute of Scientific and Technical Information of China (English)

    Masood Shah; Catherine Mabru; Farhad Rezai-Aria; Ines Souki; Riffat Asim Pasha

    2012-01-01

    A finite element analysis of stress intensity factors (KI) in clamped SE(T)c specimens (dog bone profile) is presented. A J-integral approach is used to calculate the values of stress intensity factors valid for 0.125≤a/W≤0.625. A detailed comparison is made with the work of other researchers on rectangular specimens. Different boundary conditions are explored to best describe the real conditions in the laboratory. A sensitivity study is also presented to explore the effects of variation in specimen position in the grips of the testing machine. Finally, the numerically calculated SIF is used to determine an FCGR curve for AISI H11 tool steel on SE(T)c specimens, which is compared with that of a C(T) specimen of the same material.
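
    As an illustration of how such a numerically calibrated SIF typically enters an FCGR description (a hedged sketch, not the paper's fitted model), the SIF range for a given geometry factor can be fed into a Paris-law growth rate with placeholder constants:

```python
import numpy as np

def delta_k(delta_sigma_mpa, a_m, geometry_factor):
    """SIF range (MPa*sqrt(m)) for crack length a_m, using a geometry factor
    Y(a/W) such as the one obtained from the FE J-integral analysis."""
    return geometry_factor * delta_sigma_mpa * np.sqrt(np.pi * a_m)

def paris_growth_rate(dk, c=1e-11, m=3.0):
    """Paris-law growth rate da/dN (m/cycle); C and m are placeholders,
    not fitted AISI H11 constants."""
    return c * dk ** m

# Example: 200 MPa stress range, 2 mm crack, Y = 1.12
dk = delta_k(200.0, 0.002, 1.12)
print(dk, paris_growth_rate(dk))
```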

  3. The Black Top Hat function applied to a DEM: A tool to estimate recent incision in a mountainous watershed (Estibère Watershed, Central Pyrenees)

    Science.gov (United States)

    Rodriguez, Felipe; Maire, Eric; Courjault-Radé, Pierre; Darrozes, José

    2002-03-01

    The Top Hat Transform is a grey-level image analysis function that allows peaks and valleys to be extracted from a non-uniform background. The function can be applied to a grey-level Digital Elevation Model (DEM); it is applied here to quantify the volume of recently incised material in a mountainous Pyrenean watershed. A grey-level closing operation applied to the present-day DEM gives a new image called the "paleo" DEM. The Black Top Hat function consists of subtracting the present-day DEM from the "paleo" DEM. It gives a new DEM representing all valleys whose sizes range from the size of the structuring element down to zero, as no threshold is used. The calculation of the incised volume is derived directly from the difference between the two DEMs. The geological significance of the quantitative results is discussed.
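
    A minimal sketch of the procedure with SciPy's grey-scale morphology, under assumed structuring-element size and cell area (not the values of the Estibère study):

```python
import numpy as np
from scipy import ndimage

def incised_volume(dem, structuring_size=25, cell_area_m2=100.0):
    """Black-top-hat estimate of incised volume: a grey-level closing with a
    structuring element larger than the incised valleys reconstructs the
    'paleo' surface; its difference with the present DEM is the incision depth."""
    paleo = ndimage.grey_closing(dem, size=structuring_size)
    incision_depth = paleo - dem  # same as ndimage.black_tophat(dem, size=...)
    return float(np.sum(incision_depth) * cell_area_m2)

# Synthetic example: a flat surface crossed by one incised valley
dem = np.zeros((100, 100))
dem[:, 48:52] -= 5.0
print(incised_volume(dem))
```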

  4. Photutils: Photometry tools

    Science.gov (United States)

    Bradley, Larry; Sipocz, Brigitta; Robitaille, Thomas; Tollerud, Erik; Deil, Christoph; Vinícius, Zè; Barbary, Kyle; Günther, Hans Moritz; Bostroem, Azalee; Droettboom, Michael; Bray, Erik; Bratholm, Lars Andersen; Pickering, T. E.; Craig, Matt; Pascual, Sergio; Greco, Johnny; Donath, Axel; Kerzendorf, Wolfgang; Littlefair, Stuart; Barentsen, Geert; D'Eugenio, Francesco; Weaver, Benjamin Alan

    2016-09-01

    Photutils provides tools for detecting and performing photometry of astronomical sources. It can estimate the background and background rms in astronomical images, detect sources in astronomical images, estimate morphological parameters of those sources (e.g., centroid and shape parameters), and perform aperture and PSF photometry. Written in Python, it is an affiliated package of Astropy (ascl:1304.002).

  5. Lunar hand tools

    Science.gov (United States)

    Bentz, Karl F.; Coleman, Robert D.; Dubnik, Kathy; Marshall, William S.; Mcentee, Amy; Na, Sae H.; Patton, Scott G.; West, Michael C.

    1987-01-01

    Tools useful for operations and maintenance tasks on the lunar surface were determined and designed. Primary constraints are the lunar environment, the astronaut's space suit and the strength limits of the astronaut on the moon. A multipurpose rotary motion tool and a collapsible tool carrier were designed. For the rotary tool, a brushless motor and controls were specified, a material for the housing was chosen, bearings and lubrication were recommended and a planetary reduction gear attachment was designed. The tool carrier was designed primarily for ease of access to the tools and fasteners. A material was selected and structural analysis was performed on the carrier. Recommendations were made about the limitations of human performance and about possible attachments to the torque driver.

  6. Control Strategy Tool (CoST)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios, their effects on emissions and estimated costs. This tool...

  7. A hybrid downscaling approach for the estimation of climate change effects on droughts using a geo-information tool. Case study: Thessaly, Central Greece

    Directory of Open Access Journals (Sweden)

    Tzabiras John

    2016-01-01

    Full Text Available Multiple linear regression is used to statistically downscale large-scale outputs from CGCM2 (the second-generation coupled GCM of the Canadian Centre for Climate Modelling and Analysis) and ECHAM5 (developed at the Max Planck Institute for Meteorology) to regional precipitation over the Thessaly region, Greece. Mean monthly precipitation data for the historical period Oct. 1960-Sep. 2002, derived from 79 rain gauges, were spatially interpolated using a geostatistical approach over the region of Thessaly, which was divided into 128 grid cells of 10 km × 10 km. The methodology is based on multiple regression of large-scale GCM predictor variables against observed precipitation and the application of a stochastic time series model for the simulation of the precipitation residuals (white noise). The methodology was developed for the historical period Oct. 1960-Sep. 1990 and validated against observed monthly precipitation for the period Oct. 1990-Sep. 2002. The proposed downscaling methodology was then used to calculate the standardized precipitation index (SPI) at various timescales (3-month, 6-month, 9-month, 12-month, 24-month) in order to estimate climate change effects on droughts. Various evaluation statistics were calculated in order to validate the process, and the results showed that the method reproduces the SPI well, although the level of uncertainty is quite high due to its stochastic component.
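
    A simplified sketch of the SPI computation at a chosen timescale (single gamma fit, no separate treatment of calendar months or zero-precipitation probability, so only an approximation of the operational index):

```python
import numpy as np
from scipy import stats

def spi(monthly_precip, timescale=3):
    """Simplified SPI: rolling sums over the timescale, one gamma fit over the
    whole aggregated series, then transformation to standard normal quantiles."""
    p = np.asarray(monthly_precip, dtype=float)
    agg = np.convolve(p, np.ones(timescale), mode="valid")
    shape, loc, scale = stats.gamma.fit(agg, floc=0)
    cdf = stats.gamma.cdf(agg, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

# Synthetic 20-year monthly series, 6-month SPI
rng = np.random.default_rng(0)
print(spi(rng.gamma(2.0, 30.0, size=240), timescale=6)[:5])
```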

  8. Regional estimation model of the flood duration frequency reduction curves of the italian rivers; Un modello di stima regionale delle curve di riduzione dei colmi di piena dei corsi d'acqua italiani

    Energy Technology Data Exchange (ETDEWEB)

    Tomirotti, M. [Milan Univ., Milan (Italy). Dipt. di Ingegneria Idraulica, Ambientale e del Rilevamento

    2001-10-01

    In this paper a regional estimation model of the flood duration-frequency reduction curves of Italian rivers is proposed; the model is based on the regional estimation of the peak flood discharge and of the instantaneous-to-daily flood peak ratio of the river basin. While a great number of papers is available in the literature for the estimation of the first quantity, less attention has been devoted to the second. A regional estimation model of the instantaneous-to-daily flood peak ratio of Italian river basins has therefore been formulated, employing for the calibration a set of data referring to 212 gauging stations distributed over the entire national territory and characterized by at least twenty years of observations; the domain of validity of the model covers most of the Italian territory. On the basis of the results of the analysis, the methodology for the design of flood control reservoirs presented in previous papers has been re-examined and becomes applicable to ungauged sites. [Italian abstract, translated] In this paper an indirect estimation model of the flood peak reduction curves of Italian rivers is proposed, based on the regional estimation of the peak discharges and of the peak coefficient of the basin. While a very large body of work is available for the estimation of the first quantity, less attention has been devoted in the technical literature to the second. A regional estimation model of the peak coefficient of Italian rivers has therefore been developed, using for its calibration the data of 212 gauging stations distributed over the whole national territory and characterized by at least twenty years of observations; the domain of validity of the developed model covers almost the whole national territory. On the basis of the results of the analysis, the simplified methodology for determining the volumes to be assigned to the

  9. Dissimilatory metal reduction.

    Science.gov (United States)

    Lovley, D R

    1993-01-01

    Microorganisms can enzymatically reduce a variety of metals in metabolic processes that are not related to metal assimilation. Some microorganisms can conserve energy to support growth by coupling the oxidation of simple organic acids and alcohols, H2, or aromatic compounds to the reduction of Fe(III) or Mn(IV). This dissimilatory Fe(III) and Mn(IV) reduction influences the organic as well as the inorganic geochemistry of anaerobic aquatic sediments and ground water. Microorganisms that use U(VI) as a terminal electron acceptor play an important role in uranium geochemistry and may be a useful tool for removing uranium from contaminated environments. Se(VI) serves as a terminal electron acceptor to support anaerobic growth of some microorganisms. Reduction of Se(VI) to Se(0) is an important mechanism for the precipitation of selenium from contaminated waters. Enzymatic reduction of Cr(VI) to the less mobile and less toxic Cr(III), and reduction of soluble Hg(II) to volatile Hg(0), may affect the fate of these compounds in the environment and might be used as a remediation strategy. Microorganisms can also enzymatically reduce other metals such as technetium, vanadium, molybdenum, gold, silver, and copper, but reduction of these metals has not been studied extensively.

  10. Estimated Visceral Adipose Tissue, but Not Body Mass Index, Is Associated with Reductions in Glomerular Filtration Rate Based on Cystatin C in the Early Stages of Chronic Kidney Disease

    Directory of Open Access Journals (Sweden)

    Ana Karina Teixeira da Cunha França

    2014-01-01

    Full Text Available Information on the association between obesity and the initial phases of chronic kidney disease (CKD) is still limited, particularly regarding the influence of visceral adipose tissue. We investigated whether visceral adipose tissue is more strongly associated with reductions in glomerular filtration rate (GFR) than total or abdominal obesity in hypertensive individuals with stage 1-2 CKD. A cross-sectional study was conducted involving 241 hypertensive patients undergoing treatment at a primary health care facility. GFR was estimated using equations based on creatinine and cystatin C levels. Explanatory variables included body mass index (BMI), waist circumference (WC), and estimated visceral adipose tissue (eVAT). The mean age was 59.6±9.2 years and 75.9% were female. According to BMI, 28.2% of subjects were obese. The prevalence of increased WC and eVAT was 63.9% and 58.5%, respectively. Assessment of GFR by BMI, WC, and eVAT categories showed that only women with increased eVAT (≥150 cm2) had a lower mean GFR by the Larsson (P=0.016), Levey 2 (P=0.005), and Levey 3 (P=0.008) equations. The same result was not observed when the MDRD equation was employed. No association was found between BMI, WC, eVAT, and GFR based only on serum creatinine. In the early stages of CKD, increased eVAT in hypertensive women was associated with decreased GFR based on cystatin C.

  11. Forensic surface metrology: tool mark evidence.

    Science.gov (United States)

    Gambino, Carol; McLaughlin, Patrick; Kuo, Loretta; Kammerman, Frani; Shenkin, Peter; Diaczuk, Peter; Petraco, Nicholas; Hamby, James; Petraco, Nicholas D K

    2011-01-01

    Over the last several decades, forensic examiners of impression evidence have come under scrutiny in the courtroom due to analysis methods that rely heavily on subjective morphological comparisons. Currently, there is no universally accepted system that generates numerical data to independently corroborate visual comparisons. Our research attempts to develop such a system for tool mark evidence, proposing a methodology that objectively evaluates the association of striated tool marks with the tools that generated them. In our study, 58 primer shear marks on 9 mm cartridge cases, fired from four Glock model 19 pistols, were collected using high-resolution white light confocal microscopy. The resulting three-dimensional surface topographies were filtered to extract all "waviness surfaces", the essential "line" information that firearm and tool mark examiners view under a microscope. Extracted waviness profiles were processed with principal component analysis (PCA) for dimension reduction. Support vector machines (SVM) were used to make the profile-gun associations, and conformal prediction theory (CPT) was used to establish confidence levels. At the 95% confidence level, CPT coupled with PCA-SVM yielded an empirical error rate of 3.5%. Complementary bootstrap-based computations of the estimated error rate gave 0%, indicating that the error rate for the algorithmic procedure is likely to remain low on larger data sets. Finally, suggestions are made for the practical courtroom application of CPT for assigning levels of confidence to SVM identifications of tool marks recorded with confocal microscopy.
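
    A hedged sketch of the PCA-for-dimension-reduction plus SVM classification stage using scikit-learn, with synthetic stand-in data; the conformal prediction step and the actual confocal profiles are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-ins: one "waviness profile" per row, and the pistol that
# produced each mark as the label (58 marks from 4 guns, as in the study).
rng = np.random.default_rng(1)
labels = np.repeat(np.arange(4), [15, 15, 14, 14])
profiles = rng.normal(size=(58, 500)) + 0.1 * labels[:, None]

model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="linear"))
print("cross-validated accuracy:", cross_val_score(model, profiles, labels, cv=5).mean())
```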

  12. CERES: A Set of Automated Reduction Routines for Echelle Spectra

    CERN Document Server

    Brahm, Rafael; Espinoza, Néstor

    2016-01-01

    We present the Collection of Extraction Routines for Echelle Spectra (CERES). These routines were developed for the construction of automated pipelines for the reduction, extraction and analysis of spectra acquired with different instruments, allowing homogeneous and standardised results to be obtained. This modular code includes tools for handling the different steps of the processing: CCD reductions, tracing of the echelle orders, optimal and simple extraction, computation of the wavelength solution, estimation of radial velocities, and rough and fast estimation of atmospheric parameters. Currently, CERES has been used to develop automated pipelines for eleven different spectrographs, namely CORALIE, FEROS, HARPS, PUCHEROS, FIDEOS, CAFE, DuPont/Echelle, Magellan/Mike, Keck/HIRES, Magellan/PFS and APO/ARCES, but the routines can easily be used to deal with data coming from other spectrographs. We show the high precision in radial velocity that CERES achieves for some of these instruments and w

  13. Slope-Area Computation Program Graphical User Interface 1.0—A Preprocessing and Postprocessing Tool for Estimating Peak Flood Discharge Using the Slope-Area Method

    Science.gov (United States)

    Bradley, D. Nathan

    2012-01-01

    The slope-area method is a technique for estimating the peak discharge of a flood after the water has receded (Dalrymple and Benson, 1967). This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks (HWMs) on trees or buildings. These indicators of flood stage are combined with measurements of the cross-sectional geometry of the stream, estimates of channel roughness, and a mathematical model that balances the total energy of the flow between cross sections. This is in contrast to a “direct” measurement of discharge during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a gage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the stream gage, resulting in more accurate computation of high flows. The Slope-Area Computation program (SAC; Fulford, 1994) is an implementation of the slope-area method that computes a peak-discharge estimate from inputs of water-surface slope (from surveyed HWMs), channel geometry, and estimated channel roughness. SAC is a command line program written in Fortran that reads input data from a formatted text file and prints results to another formatted text file. Preparing the input file can be time-consuming and prone to errors. This document describes the SAC graphical user interface (GUI), a cross-platform “wrapper” application that prepares the SAC input file, executes the program, and helps the user interpret the output. The SAC GUI is an update and enhancement of the slope-area method (SAM; Hortness, 2004; Berenbrock, 1996), an earlier spreadsheet tool used to aid field personnel in the completion of a slope-area measurement. The SAC GUI reads survey data
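
    For orientation, the core hydraulic relation behind the slope-area method is Manning's equation evaluated at a surveyed cross section; the sketch below is a single-section simplification (the SAC program itself balances energy between several cross sections), with illustrative survey values.

```python
import math

def slope_area_discharge(area_ft2, wetted_perimeter_ft, water_surface_slope, n_roughness):
    """Single-section Manning estimate of discharge (ft^3/s) in US customary
    units; the full slope-area method balances energy between cross sections."""
    hydraulic_radius = area_ft2 / wetted_perimeter_ft
    return (1.486 / n_roughness) * area_ft2 * hydraulic_radius ** (2.0 / 3.0) \
        * math.sqrt(water_surface_slope)

# Illustrative survey values, not from an actual indirect measurement
print(slope_area_discharge(850.0, 120.0, 0.002, 0.035))
```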

  14. New Tools for Managing Agricultural P

    Science.gov (United States)

    Nieber, J. L.; Baker, L. A.; Peterson, H. M.; Ulrich, J.

    2014-12-01

    Best management practices (BMPs) generally focus on retaining nutrients (especially P) after they enter the watershed. This approach is expensive, unsustainable, and has not led to reductions of P pollution at large scales (e.g., Mississippi River). Although source reduction, which results in reducing inputs of nutrients to a watershed, has long been cited as a preferred approach, we have not had tools to guide source reduction efforts at the watershed level. To augment conventional TMDL tools, we developed an "actionable" watershed P balance approach, based largely on watershed-specific information, yet simple enough to be utilized as a practical tool. Interviews with farmers were used to obtain detailed farm management data, data from livestock permits were adjusted based on site visits, stream P fluxes were calculated from 3 years of monitoring data, and expert knowledge was used to model P fluxes through animal operations. The overall P use efficiency (Puse) was calculated as the sum of deliberate exports (P in animals, milk, eggs, and crops) divided by the sum of deliberate inputs (P in fertilizer, feed, and nursery animals) x 100. The crop P use efficiency was 1.7, meaning that more P was exported in products than was deliberately imported; we estimate that this mining would have resulted in a loss of 6 mg P/kg across the watershed. Despite the negative P balance, the equivalent of 5% of watershed input was lost via stream export. Tile drainage, the presence of buffer strips, and relatively flat topography result in dominance of P loads by ortho-P (66%) and low particulate P. This, together with geochemical analysis (ongoing), suggests that biological processes may be at least as important as sediment transport in controlling P loads. We have developed a P balance calculator tool to enable watershed management organizations to develop watershed P balances and identify opportunities for improving the efficiency of P utilization.
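
    A small illustration of the P-balance arithmetic described above, with hypothetical flux values; the efficiency is deliberate exports divided by deliberate imports:

      # Hypothetical watershed phosphorus fluxes (tonnes P per year).
      deliberate_imports = {"fertilizer": 120.0, "feed": 60.0, "nursery_animals": 5.0}
      deliberate_exports = {"animals": 90.0, "milk": 55.0, "eggs": 10.0, "crops": 160.0}

      p_in = sum(deliberate_imports.values())
      p_out = sum(deliberate_exports.values())

      p_use_efficiency = p_out / p_in   # > 1 means more P leaves than is deliberately imported
      print("P use efficiency: %.2f (%.0f%%)" % (p_use_efficiency, 100 * p_use_efficiency))
      print("net P balance: %+.1f t/yr (negative means soil P mining)" % (p_in - p_out))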

  15. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A good number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  16. VBioindex: A Visual Tool to Estimate Biodiversity

    Directory of Open Access Journals (Sweden)

    Dong Su Yu

    2015-09-01

    Biological diversity, also known as biodiversity, is an important criterion for measuring the value of an ecosystem. As biodiversity is closely related to human welfare and quality of life, many efforts to restore and maintain the biodiversity of species have been made by government agencies and non-governmental organizations, thereby drawing a substantial amount of international attention. In the fields of biological research, biodiversity is widely measured using traditional statistical indices such as the Shannon-Wiener index, species richness, evenness, and relative dominance of species. However, some biologists and ecologists have difficulty using these indices because they require advanced mathematical knowledge and computational techniques. Therefore, we developed VBioindex, a user-friendly program that is capable of measuring the Shannon-Wiener index, species richness, evenness, and relative dominance. VBioindex provides an easy-to-use interface, visually represents the results in the form of a simple chart, and offers functions for long-term investigation of datasets using time-series analyses.
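
    The indices reported by VBioindex are straightforward to compute from species abundance counts; the following sketch is independent of the VBioindex code and only illustrates the formulas:

      import math

      # Hypothetical abundance counts per species.
      abundances = {"species_A": 40, "species_B": 25, "species_C": 20, "species_D": 15}

      total = sum(abundances.values())
      proportions = [n / total for n in abundances.values()]

      shannon = -sum(p * math.log(p) for p in proportions)        # Shannon-Wiener index H'
      richness = len(abundances)                                   # species richness S
      evenness = shannon / math.log(richness)                      # Pielou's evenness J'
      dominance = {sp: n / total for sp, n in abundances.items()}  # relative dominance

      print("H' = %.3f, S = %d, J' = %.3f" % (shannon, richness, evenness))
      print("relative dominance:", dominance)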

  17. Chatter and machine tools

    CERN Document Server

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  18. Integration of vision and haptics during tool use.

    Science.gov (United States)

    Takahashi, Chie; Diedrichsen, Jörn; Watt, Simon J

    2009-06-08

    When integrating signals from vision and haptics the brain must solve a "correspondence problem" so that it only combines information referring to the same object. An invariant spatial rule could be used when grasping with the hand: here the two signals should only be integrated when the estimate of hand and object position coincide. Tools complicate this relationship, however, because visual information about the object, and the location of the hand, are separated spatially. We show that when a simple tool is used to estimate size, the brain integrates visual and haptic information in a near-optimal fashion, even with a large spatial offset between the signals. Moreover, we show that an offset between the tool-tip and the object results in similar reductions in cross-modal integration as when the felt and seen positions of an object are offset in normal grasping. This suggests that during tool use the haptic signal is treated as coming from the tool-tip, not the hand. The brain therefore appears to combine visual and haptic information, not based on the spatial proximity of sensory stimuli, but based on the proximity of the distal causes of stimuli, taking into account the dynamics and geometry of tools.
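
    "Near-optimal" integration in this literature usually refers to reliability-weighted averaging of the visual and haptic estimates. A toy sketch of that computation, offered as an assumption about the underlying model rather than code from the study:

      # Reliability-weighted (maximum-likelihood) combination of two size estimates.
      def integrate(size_v, var_v, size_h, var_h):
          w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)
          w_h = 1.0 - w_v
          combined = w_v * size_v + w_h * size_h
          combined_var = (var_v * var_h) / (var_v + var_h)  # always <= min(var_v, var_h)
          return combined, combined_var

      # Hypothetical visual and haptic (tool-mediated) size estimates in millimetres.
      size, var = integrate(size_v=52.0, var_v=4.0, size_h=48.0, var_h=9.0)
      print("integrated size estimate: %.1f mm, variance: %.2f" % (size, var))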

  19. Estimated potential of energy saving and demand reduction in commercial buildings illumination; Potencial estimado de ahorro de energia y reduccion de la demanda en iluminacion de edificios comerciales

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Gomez, Victor Hugo; Morillon Galvez, David [Posgrado en Energetica de la DEPFI-UNAM, Mexico, D. F. (Mexico)

    1999-07-01

    In this paper the estimated energy saving potential in illumination, the energy end use and the technology used in commercial buildings of different uses are analyzed. The estimation draws on information from the demonstrative cases of the Fideicomiso para el Ahorro de Energia (FIDE); the energy saving and the demand reduction are presented for a sample of 29 buildings, among which are shopping malls, hospitals, schools, hotels, restaurants and public buildings in which energy saving programs have been carried out, with measures such as the cleaning of the luminaires and their replacement with more efficient ones. The average saving obtained is 21.81%, in the following areas: illumination, air conditioning and others. In addition, in a sample of 4 buildings, it was observed that before applying the energy saving programs two of them did not comply with the norm NOM-007-ENER-1995 (electric power density in interior lighting systems, W/m{sup 2}), and afterwards they did meet the values and criteria of the norm.

  20. The development and discussion of computerized visual perception assessment tool for Chinese characters structures - Concurrent estimation of the overall ability and the domain ability in item response theory approach.

    Science.gov (United States)

    Wu, Huey-Min; Lin, Chin-Kai; Yang, Yu-Mao; Kuo, Bor-Chen

    2014-11-12

    Visual perception is the fundamental skill required for a child to recognize words, and to read and write. There was no visual perception assessment tool developed for preschool children based on Chinese characters in Taiwan. The purposes were to develop a computerized visual perception assessment tool for Chinese character structures and to explore the psychometric characteristics of the assessment tool. This study adopted purposive sampling. The study evaluated 551 kindergarten-age children (293 boys, 258 girls) ranging from 46 to 81 months of age. The test instrument used in this study consisted of three subtests and 58 items, including tests of basic strokes, single-component characters, and compound characters. Based on the results of model fit analysis, higher-order item response theory was used to estimate performance in visual perception, basic strokes, single-component characters, and compound characters simultaneously. Analyses of variance were used to detect significant differences between age groups and gender groups. The difficulty of identifying items in the visual perception test ranged from -2 to 1. The visual perception ability of 4- to 6-year-old children ranged from -1.66 to 2.19. Gender did not have significant effects on performance. However, there were significant differences among the different age groups. The performance of 6-year-olds was better than that of 5-year-olds, which was better than that of 4-year-olds. This study obtained detailed diagnostic scores by using a higher-order item response theory model to understand the visual perception of basic strokes, single-component characters, and compound characters. Further statistical analysis showed that, for basic strokes and compound characters, girls performed better than did boys; there also were differences within each age group. For single-component characters, there was no difference in performance between boys and girls. However, again the performance of 6-year-olds was better than

  1. Hardware Accelerated Power Estimation

    CERN Document Server

    Coburn, Joel; Raghunathan, Anand

    2011-01-01

    In this paper, we present power emulation, a novel design paradigm that utilizes hardware acceleration for the purpose of fast power estimation. Power emulation is based on the observation that the functions necessary for power estimation (power model evaluation, aggregation, etc.) can be implemented as hardware circuits. Therefore, we can enhance any given design with "power estimation hardware", map it to a prototyping platform, and exercise it with any given test stimuli to obtain power consumption estimates. Our empirical studies with industrial designs reveal that power emulation can achieve significant speedups (10X to 500X) over state-of-the-art commercial register-transfer level (RTL) power estimation tools.

  2. Large Crater Clustering tool

    Science.gov (United States)

    Laura, Jason; Skinner, James A.; Hunter, Marc A.

    2017-08-01

    In this paper we present the Large Crater Clustering (LCC) tool set, an ArcGIS plugin that supports the quantitative approximation of a primary impact location from user-identified locations of possible secondary impact craters or the long-axes of clustered secondary craters. The identification of primary impact craters directly supports planetary geologic mapping and topical science studies where the chronostratigraphic age of some geologic units may be known, but more distant features have questionable geologic ages. Previous works (e.g., McEwen et al., 2005; Dundas and McEwen, 2007) have shown that the location of a primary impact can be estimated from its secondary impact craters. This work adapts those methods into a statistically robust tool set. We describe the four individual tools within the LCC tool set that support: (1) processing individually digitized point observations (craters), (2) estimating the directional distribution of a clustered set of craters and back-projecting the potential flight paths (crater clusters or linearly approximated catenae or lineaments), (3) intersecting projected paths, and (4) intersecting back-projected trajectories to approximate the location of potential source primary craters. We present two case studies using secondary impact features mapped in two regions of Mars. We demonstrate that the tool is able to quantitatively identify primary impacts and supports the improved qualitative interpretation of potential secondary crater flight trajectories.
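
    Step (4), intersecting the back-projected trajectories, can be framed as a least-squares intersection of 2-D lines, each defined by a point and a direction. A hedged sketch of that geometry (not the ArcGIS tool itself), with hypothetical coordinates:

      import numpy as np

      def least_squares_intersection(points, directions):
          # Point minimising the summed squared perpendicular distance to each line,
          # where line i passes through points[i] along direction directions[i].
          A = np.zeros((2, 2))
          b = np.zeros(2)
          for p, d in zip(points, directions):
              d = d / np.linalg.norm(d)
              proj = np.eye(2) - np.outer(d, d)   # projector onto the line's normal space
              A += proj
              b += proj @ p
          return np.linalg.solve(A, b)

      # Hypothetical cluster centroids and back-projected directions toward the source.
      pts = np.array([[10.0, 0.0], [0.0, 12.0], [-8.0, -1.0]])
      dirs = np.array([[-1.0, 0.05], [0.02, -1.0], [1.0, 0.1]])
      print("approximate primary impact location:", least_squares_intersection(pts, dirs))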

  3. DESIGN OF AQUIFER REMEDIATION SYSTEMS: (2) Estimating site-specific performance and benefits of partial source removal

    Science.gov (United States)

    A Lagrangian stochastic model is proposed as a tool that can be utilized in forecasting remedial performance and estimating the benefits (in terms of flux and mass reduction) derived from a source zone remedial effort. The stochastic functional relationships that describe the hyd...

  4. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    On designing a tool steel, its composition and heat treatment parameters are chosen to provide a hardened and tempered martensitic matrix in which carbides are evenly distributed. In this condition the matrix has an optimum combination of hardness and toughness, the primary carbides provide...... resistance against abrasive wear and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore some steel types contain cobalt. Addition of alloying elements...

  5. Exceptional Reductions

    CERN Document Server

    Marrani, Alessio; Riccioni, Fabio

    2011-01-01

    Starting from basic identities of the group E8, we perform progressive reductions, namely decompositions with respect to the maximal and symmetric embeddings of E7xSU(2) and then of E6xU(1). This procedure provides a systematic approach to the basic identities involving invariant primitive tensor structures of various irreprs. of finite-dimensional exceptional Lie groups. We derive novel identities for E7 and E6, highlighting the E8 origin of some well known ones. In order to elucidate the connections of this formalism to four-dimensional Maxwell-Einstein supergravity theories based on symmetric scalar manifolds (and related to irreducible Euclidean Jordan algebras, the unique exception being the triality-symmetric N = 2 stu model), we then derive a fundamental identity involving the unique rank-4 symmetric invariant tensor of the 0-brane charge symplectic irrepr. of U-duality groups, with potential applications in the quantization of the charge orbits of supergravity theories, as well as in the study of mult...

  6. Heat reduction of the MWD telemetry system

    OpenAIRE

    Matviykiv, Taras

    2012-01-01

    In this paper the simplified thermal model of conventional downhole MWD (Measurements While Drilling) telemetry system has been made. The heat reduction methods for the IC (integrated circuits) components of downhole drilling tools have been analyzed.

  7. Estimating rare events in biochemical systems using conditional sampling

    Science.gov (United States)

    Sundar, V. S.

    2017-01-01

    The paper focuses on the development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most of the problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
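
    A compact illustration of the subset-simulation idea on a toy problem, estimating P(X > 3.5) for a standard normal X by expressing it as a product of conditional probabilities estimated with a simple Metropolis chain; this is a generic sketch, not the authors' biochemical implementation:

      import math
      import numpy as np

      rng = np.random.default_rng(1)

      def g(x):
          return x  # toy "performance" function; the rare event is g(x) > threshold

      def subset_simulation(threshold, n=2000, p0=0.1, sigma_prop=1.0):
          # Estimate P(g(X) > threshold) for X ~ N(0, 1) as a product of conditional probabilities.
          samples = rng.standard_normal(n)
          prob = 1.0
          while True:
              values = g(samples)
              level = np.quantile(values, 1.0 - p0)    # next intermediate level
              if level >= threshold:
                  return prob * np.mean(values > threshold)
              prob *= p0
              seeds = samples[values > level]
              # Metropolis chains started from the seeds, conditioned on g(x) > level.
              chains = []
              per_chain = int(np.ceil(n / len(seeds)))
              for x in seeds:
                  for _ in range(per_chain):
                      cand = x + sigma_prop * rng.standard_normal()
                      accept = rng.random() < min(1.0, np.exp(0.5 * (x * x - cand * cand)))
                      if accept and g(cand) > level:   # reject moves that leave the region
                          x = cand
                      chains.append(x)
              samples = np.array(chains[:n])

      print("subset simulation estimate:", subset_simulation(3.5))
      print("reference value:", 0.5 * math.erfc(3.5 / math.sqrt(2.0)))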

  8. Management Tools

    Science.gov (United States)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that can be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage and professional services.

  9. Background reduction in cryogenic detectors

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Daniel A.; /Fermilab

    2005-04-01

    This paper discusses the background reduction and rejection strategy of the Cryogenic Dark Matter Search (CDMS) experiment. Recent measurements of background levels from CDMS II at Soudan are presented, along with estimates for future improvements in sensitivity expected for a proposed SuperCDMS experiment at SNOLAB.

  10. Professional liability insurance in Obstetrics and Gynaecology: estimate of the level of knowledge about malpractice insurance policies and definition of an informative tool for the management of the professional activity

    Directory of Open Access Journals (Sweden)

    Scurria Serena

    2011-12-01

    Background: In recent years, due to the increasingly hostile environment in the medical malpractice field and related lawsuits in Italy, physicians began informing themselves regarding their comprehensive medical malpractice coverage. Methods: In order to estimate the level of knowledge of medical professionals on liability insurance coverage for healthcare malpractice, a sample of 60 hospital health professionals of the obstetrics and gynaecology area of Messina (Sicily, Italy) was recruited. A survey was administered to evaluate their knowledge as to the meaning of professional liability insurance coverage, but above all on the most frequent policy forms ("loss occurrence", "claims made" and "I-II risk"). Professionals were classified according to age and professional title, and descriptive statistics were calculated for all the professional groups and answers. Results: Most of the surveyed professionals were unaware or had very poor knowledge of the professional liability insurance coverage negotiated by the general manager, so most of the personnel believed it useful to subscribe to individual "private" policies. Several subjects declared they were aware of the possibility of obtaining extended coverage for gross negligence, and substantially all of those surveyed had never seen the loss occurrence and claims made forms of the policy. Moreover, the sample was practically unaware of the related issues about insurance coverage for damages related to breaches of informed consent. The results revealed the relative lack of knowledge, among operators in the field of obstetrics and gynaecology, of the effective coverage provided by the policies signed by the hospital managers for damages in medical malpractice. The authors thus proposed a useful information tool to help professionals working in obstetrics and gynaecology regarding aspects of insurance coverage provided on the basis of Italian civil law. Conclusion: Italy must introduce a compulsory

  11. Reducts of Ramsey structures

    CERN Document Server

    Bodirsky, Manuel

    2011-01-01

    One way of studying a relational structure is to investigate functions which are related to that structure and which leave certain aspects of the structure invariant. Examples are the automorphism group, the self-embedding monoid, the endomorphism monoid, or the polymorphism clone of a structure. Such functions can be particularly well understood when the relational structure is countably infinite and has a first-order definition in another relational structure which has a finite language, is totally ordered and homogeneous, and has the Ramsey property. This is because in this situation, Ramsey theory provides the combinatorial tool for analyzing these functions -- in a certain sense, it allows to represent such functions by functions on finite sets. This is a survey of results in model theory and theoretical computer science obtained recently by the authors in this context. In model theory, we approach the problem of classifying the reducts of countably infinite ordered homogeneous Ramsey structures in a fin...

  12. Dose Reduction Techniques

    Energy Technology Data Exchange (ETDEWEB)

    WAGGONER, L.O.

    2000-05-16

    As radiation safety specialists, one of the things we are required to do is evaluate tools, equipment, materials and work practices and decide whether the use of these products or work practices will reduce radiation dose or risk to the environment. There is a tendency for many workers who work with radioactive material to accomplish radiological work the same way they have always done it rather than look for new technology or change their work practices. New technology is being developed all the time that can make radiological work easier and result in less radiation dose to the worker or reduce the possibility that contamination will be spread to the environment. As we discuss the various tools and techniques that reduce radiation dose, keep in mind that the radiological controls should be reasonable. We cannot always get the dose to zero, so we must try to accomplish the work efficiently and cost-effectively. There are times we may have to accept that there is only so much that can be done. The goal is to do the smart things that protect the worker but do not hinder him while the task is being accomplished. In addition, we should not demand that large amounts of money be spent for equipment that has marginal value in order to save a few millirem. We have broken the handout into sections that should simplify the presentation. Time, distance, shielding, and source reduction are methods used to reduce dose and are covered in Part I on work execution. We then look at operational considerations, radiological design parameters, and discuss the characteristics of personnel who deal with ALARA. This handout should give you an overview of what it takes to have an effective dose reduction program.

  13. Mathematical tools

    Science.gov (United States)

    Capozziello, Salvatore; Faraoni, Valerio

    In this chapter we discuss certain mathematical tools which are used extensively in the following chapters. Some of these concepts and methods are part of the standard baggage taught in undergraduate and graduate courses, while others enter the tool-box of more advanced researchers. These mathematical methods are very useful in formulating ETGs and in finding analytical solutions.We begin by studying conformal transformations, which allow for different representations of scalar-tensor and f(R) theories of gravity, in addition to being useful in GR. We continue by discussing variational principles in GR, which are the basis for presenting ETGs in the following chapters. We close the chapter with a discussion of Noether symmetries, which are used elsewhere in this book to obtain analytical solutions.

  14. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been...

  15. Twitter as a Potential Disaster Risk Reduction Tool. Part III: Evaluating Variables that Promoted Regional Twitter Use for At-risk Populations During the 2013 Hattiesburg F4 Tornado.

    Science.gov (United States)

    Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M; Subbarao, Italo

    2015-06-29

    The study goals were to identify the variables most commonly associated with successful tweeted messages and to determine which variables have the most influence in promoting exponential dissemination of information (viral spreading of the message) and trending (becoming popular) in the given disaster-affected region. Part II describes the detailed extraction and triangulation filtration methodological approach to acquiring Twitter data for the 2013 Hattiesburg Tornado. The data was then divided into two 48 hour windows before and after the tornado impact with a 2 hour pre-tornado buffer to capture tweets just prior to impact. Criteria-based analysis was completed for tweets and users. The top 100 pre-tornado and post-tornado retweeted users were compared to establish the variability among the top retweeted users during the 4 day span. Pre-tornado variables that were correlated to higher retweet rates include total user tweets (0.324) and total times a message was retweeted (0.530). Post-tornado variables that were correlated to higher retweet rates include total hashtags in a retweet (0.538) and the hashtags #Tornado (0.378) and #Hattiesburg (0.254). Overall hashtag usage significantly increased during the storm: pre-storm there were 5,763 tweets with a hashtag and post-storm there were 13,598 tweets using hashtags. Twitter's unique features allow it to be considered a unique social media tool applicable for emergency managers and public health officials for rapid and accurate two-way communication. Additionally, understanding how variables can be properly manipulated plays a key role in understanding how to use this social media platform for effective, accurate, and rapid mass information communication.

  16. Health gain by salt reduction in europe: a modelling study.

    Directory of Open Access Journals (Sweden)

    Marieke A H Hendriksen

    Excessive salt intake is associated with hypertension and cardiovascular diseases. Salt intake exceeds the World Health Organization population nutrition goal of 5 grams per day in the European region. We assessed the health impact of salt reduction in nine European countries (Finland, France, Ireland, Italy, Netherlands, Poland, Spain, Sweden and United Kingdom). Through literature research we obtained current salt intake and systolic blood pressure levels of the nine countries. The population health modeling tool DYNAMO-HIA including country-specific disease data was used to predict the changes in prevalence of ischemic heart disease and stroke for each country estimating the effect of salt reduction through its effect on blood pressure levels. A 30% salt reduction would reduce the prevalence of stroke by 6.4% in Finland to 13.5% in Poland. Ischemic heart disease would be decreased by 4.1% in Finland to 8.9% in Poland. When salt intake is reduced to the WHO population nutrient goal, it would reduce the prevalence of stroke from 10.1% in Finland to 23.1% in Poland. Ischemic heart disease would decrease by 6.6% in Finland to 15.5% in Poland. The number of postponed deaths would be 102,100 (0.9%) in France, and 191,300 (2.3%) in Poland. A reduction of salt intake to 5 grams per day is expected to substantially reduce the burden of cardiovascular disease and mortality in several European countries.

  17. Health gain by salt reduction in europe: a modelling study.

    Science.gov (United States)

    Hendriksen, Marieke A H; van Raaij, Joop M A; Geleijnse, Johanna M; Breda, Joao; Boshuizen, Hendriek C

    2015-01-01

    Excessive salt intake is associated with hypertension and cardiovascular diseases. Salt intake exceeds the World Health Organization population nutrition goal of 5 grams per day in the European region. We assessed the health impact of salt reduction in nine European countries (Finland, France, Ireland, Italy, Netherlands, Poland, Spain, Sweden and United Kingdom). Through literature research we obtained current salt intake and systolic blood pressure levels of the nine countries. The population health modeling tool DYNAMO-HIA including country-specific disease data was used to predict the changes in prevalence of ischemic heart disease and stroke for each country estimating the effect of salt reduction through its effect on blood pressure levels. A 30% salt reduction would reduce the prevalence of stroke by 6.4% in Finland to 13.5% in Poland. Ischemic heart disease would be decreased by 4.1% in Finland to 8.9% in Poland. When salt intake is reduced to the WHO population nutrient goal, it would reduce the prevalence of stroke from 10.1% in Finland to 23.1% in Poland. Ischemic heart disease would decrease by 6.6% in Finland to 15.5% in Poland. The number of postponed deaths would be 102,100 (0.9%) in France, and 191,300 (2.3%) in Poland. A reduction of salt intake to 5 grams per day is expected to substantially reduce the burden of cardiovascular disease and mortality in several European countries.

  18. Estimating Resilience Across Landscapes

    Directory of Open Access Journals (Sweden)

    Garry D. Peterson

    2002-06-01

    Although ecological managers typically focus on managing local or regional landscapes, they often have little ability to control or predict many of the large-scale, long-term processes that drive changes within these landscapes. This lack of control has led some ecologists to argue that ecological management should aim to produce ecosystems that are resilient to change and surprise. Unfortunately, ecological resilience is difficult to measure or estimate in the landscapes people manage. In this paper, I extend system dynamics approaches to resilience and estimate resilience using complex landscape simulation models. I use this approach to evaluate cross-scale edge, a novel empirical method for estimating resilience based on landscape pattern. Cross-scale edge provides relatively robust estimates of resilience, suggesting that, with some further development, it could be used as a management tool to provide rough and rapid estimates of areas of resilience and vulnerability within a landscape.

  19. Automated data reduction workflows for astronomy

    CERN Document Server

    Freudling, W; Bramich, D M; Ballester, P; Forchi, V; Garcia-Dablo, C E; Moehler, S; Neeser, M J

    2013-01-01

    Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts on any specific instrument. The efficiency of data reduction can be improved by using automatic workflows to organise data and execute the sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowch...

  20. GumTree: Data reduction

    Science.gov (United States)

    Rayner, Hugh; Hathaway, Paul; Hauser, Nick; Fei, Yang; Franceschini, Ferdi; Lam, Tony

    2006-11-01

    Access to software tools for interactive data reduction, visualisation and analysis during a neutron scattering experiment enables instrument users to make informed decisions regarding the direction and success of their experiment. ANSTO aims to enhance the experiment experience of its facility's users by integrating these data reduction tools with the instrument control interface for immediate feedback. GumTree is a software framework and application designed to support an Integrated Scientific Experimental Environment, for concurrent access to instrument control, data acquisition, visualisation and analysis software. The Data Reduction and Analysis (DRA) module is a component of the GumTree framework that allows users to perform data reduction, correction and basic analysis within GumTree while an experiment is running. It is highly integrated with GumTree, able to pull experiment data and metadata directly from the instrument control and data acquisition components. The DRA itself uses components common to all instruments at the facility, providing a consistent interface. It features familiar ISAW-based 1D and 2D plotting, an OpenGL-based 3D plotter and peak fitting performed by fityk. This paper covers the benefits of integration, the flexibility of the DRA module, ease of use for the interface and audit trail generation.

  1. Tool Gear: Infrastructure for Parallel Tools

    Energy Technology Data Exchange (ETDEWEB)

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  2. Parameter Estimation

    DEFF Research Database (Denmark)

    2011-01-01

    In this chapter the importance of parameter estimation in model development is illustrated through various applications related to reaction systems. In particular, rate constants in a reaction system are obtained through parameter estimation methods. These approaches often require the application of optimisation techniques coupled with dynamic solution of the underlying model. Linear and nonlinear approaches to parameter estimation are investigated. There is also the application of maximum likelihood principles in the estimation of parameters, as well as the use of orthogonal collocation to generate a set of algebraic equations as the basis for parameter estimation. These approaches are illustrated using estimations of kinetic constants from reaction system models.

  3. Estimation of the toxicity of silver nanoparticles by using planarian flatworms.

    Science.gov (United States)

    Kustov, Leonid; Tiras, Kharlampii; Al-Abed, Souhail; Golovina, Natalia; Ananyan, Mikhail

    2014-03-01

    The regeneration of planarian flatworms - specifically, changes to the area of the regeneration bud (blastema) after surgical dissection - was proposed for use as a robust tool for estimating the toxicity of silver nanoparticles. The use of Planaria species, due to their unique regenerative capacity, could result in a reduction in the use of more-traditional laboratory animals for toxicity testing. With our novel approach, silver nanoparticles were found to be moderately toxic to the planarian, Girardia tigrina.

  4. Uranium isotopes fingerprint biotic reduction

    Science.gov (United States)

    Stylo, Malgorzata; Neubert, Nadja; Wang, Yuheng; Monga, Nikhil; Romaniello, Stephen J.; Weyer, Stefan; Bernier-Latmani, Rizlan

    2015-01-01

    Knowledge of paleo-redox conditions in the Earth’s history provides a window into events that shaped the evolution of life on our planet. The role of microbial activity in paleo-redox processes remains unexplored due to the inability to discriminate biotic from abiotic redox transformations in the rock record. The ability to deconvolute these two processes would provide a means to identify environmental niches in which microbial activity was prevalent at a specific time in paleo-history and to correlate specific biogeochemical events with the corresponding microbial metabolism. Here, we demonstrate that the isotopic signature associated with microbial reduction of hexavalent uranium (U), i.e., the accumulation of the heavy isotope in the U(IV) phase, is readily distinguishable from that generated by abiotic uranium reduction in laboratory experiments. Thus, isotope signatures preserved in the geologic record through the reductive precipitation of uranium may provide the sought-after tool to probe for biotic processes. Because uranium is a common element in the Earth’s crust and a wide variety of metabolic groups of microorganisms catalyze the biological reduction of U(VI), this tool is applicable to a multiplicity of geological epochs and terrestrial environments. The findings of this study indicate that biological activity contributed to the formation of many authigenic U deposits, including sandstone U deposits of various ages, as well as modern, Cretaceous, and Archean black shales. Additionally, engineered bioremediation activities also exhibit a biotic signature, suggesting that, although multiple pathways may be involved in the reduction, direct enzymatic reduction contributes substantially to the immobilization of uranium. PMID:25902522

  5. Reduction of soil erosion on forest roads

    Science.gov (United States)

    Edward R. Burroughs; John G. King

    1989-01-01

    Presents the expected reduction in surface erosion from selected treatments applied to forest road traveledways, cutslopes, fillslopes, and ditches. Estimated erosion reduction is expressed as functions of ground cover, slope gradient, and soil properties whenever possible. A procedure is provided to select rock riprap size for protection of the road ditch.

  6. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Hahmann, Andrea N.; Nielsen, T. S.

    This poster describes the status as of April 2012 of the Public Service Obligation (PSO) funded project PSO 10464 "Integrated Wind Power Planning Tool". The project goal is to integrate a mesoscale numerical weather prediction (NWP) model with a statistical tool in order to better predict short-term power variation from offshore wind farms, as well as to conduct forecast error assessment studies in preparation for later implementation of such a feature in an existing simulation model. The addition of a forecast error estimation feature will further increase the value of this tool, as it...

  7. ESTIMATION OF UNCERTAINTY AND VALIDATION OF ANALYTICAL PROCEDURES AS A QUALITY CONTROL TOOL THE EVALUATION OF UNCERTAINTY FOR AMINO ACID ANALYSIS WITH ION-EXCHANGE CHROMATOGRAPHY – CASE STUDY

    Directory of Open Access Journals (Sweden)

    Barbara Mickowska

    2013-02-01

    The aim of this study was to assess the importance of validation and uncertainty estimation related to the results of amino acid analysis using the ion-exchange chromatography with post-column derivatization technique. The method was validated and the components of standard uncertainty were identified and quantified to recognize the major contributions to the uncertainty of analysis. The estimated relative extended uncertainty (k=2, P=95%) varied in the range from 9.03% to 12.68%. Quantification of the uncertainty components indicates that the contribution of the calibration concentration uncertainty is the largest and it plays the most important role in the overall uncertainty in amino acid analysis. It is followed by the uncertainty of the area of chromatographic peaks and the weighing procedure of samples. The uncertainty of sample volume and calibration peak area may be negligible. The comparison of CV% with the estimated relative uncertainty indicates that interpretation of research results can be misleading without uncertainty estimation.
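
    The generic uncertainty-budget arithmetic behind such estimates combines relative standard uncertainties in quadrature and multiplies by the coverage factor k = 2; the component values below are hypothetical, not the study's actual budget:

      import math

      # Hypothetical relative standard uncertainty components (as fractions).
      components = {
          "calibration_concentration": 0.045,
          "chromatographic_peak_area": 0.020,
          "sample_weighing": 0.010,
          "sample_volume": 0.002,
      }

      combined = math.sqrt(sum(u ** 2 for u in components.values()))  # combined standard uncertainty
      expanded = 2.0 * combined                                        # expanded uncertainty, k = 2 (P ~ 95%)

      print("combined relative standard uncertainty: %.1f%%" % (100 * combined))
      print("expanded relative uncertainty (k = 2): %.1f%%" % (100 * expanded))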

  8. A Machine Learning Tool for Weighted Regressions in Time, Discharge, and Season

    Directory of Open Access Journals (Sweden)

    Alexander Maestre

    2014-01-01

    A new machine learning tool has been developed to classify water stations with similar water quality trends. The tool is based on the statistical method Weighted Regressions in Time, Discharge, and Season (WRTDS), developed by the United States Geological Survey (USGS) to estimate daily concentrations of water constituents in rivers and streams based on continuous daily discharge data and discrete water quality samples collected at the same or nearby locations. WRTDS is based on parametric survival regressions using a jack-knife cross validation procedure that generates unbiased estimates of the prediction errors. One of the disadvantages of WRTDS is that it needs a large number of samples (n > 200) collected during at least two decades. In this article, the tool is used to evaluate the use of Boosted Regression Trees (BRT) as an alternative to the parametric survival regressions for water quality stations with a small number of samples. We describe the development of the machine learning tool as well as an evaluation comparison of the two methods, WRTDS and BRT. The purpose of the tool is to evaluate the reduction in variability of the estimates by clustering data from nearby stations with similar concentration and discharge characteristics. The results indicate that, using clustering, the predicted concentrations using BRT are in general higher than the observed concentrations. In addition, it appears that BRT generates a higher sum of squared residuals than the parametric survival regressions.
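
    A hedged sketch of the BRT alternative, using scikit-learn's gradient boosting on synthetic WRTDS-style predictors (decimal time, log discharge, and seasonal terms); the variable names and data are placeholders, not the authors' tool:

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)

      # Synthetic training data for one cluster of stations (placeholder values only):
      # predictors follow the WRTDS structure of time, discharge, and season.
      m = 150
      decimal_year = rng.uniform(1995, 2015, m)
      log_q = rng.normal(2.0, 0.6, m)                       # log of daily discharge
      season = decimal_year % 1.0
      X = np.column_stack([decimal_year, log_q,
                           np.sin(2 * np.pi * season), np.cos(2 * np.pi * season)])
      log_conc = 0.5 - 0.3 * log_q + 0.2 * np.sin(2 * np.pi * season) + rng.normal(0, 0.1, m)

      brt = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
      print("cross-validated R^2: %.2f" % cross_val_score(brt, X, log_conc, cv=5).mean())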

  9. Lymphedema Risk Reduction Practices

    Science.gov (United States)


  10. Reduction of Baltic Sea nutrient inputs and allocation of abatement costs within the Baltic Sea catchment.

    Science.gov (United States)

    Wulff, Fredrik; Humborg, Christoph; Andersen, Hans Estrup; Blicher-Mathiesen, Gitte; Czajkowski, Mikołaj; Elofsson, Katarina; Fonnesbech-Wulff, Anders; Hasler, Berit; Hong, Bongghi; Jansons, Viesturs; Mörth, Carl-Magnus; Smart, James C R; Smedberg, Erik; Stålnacke, Per; Swaney, Dennis P; Thodsen, Hans; Was, Adam; Zylicz, Tomasz

    2014-02-01

    The Baltic Sea Action Plan (BSAP) requires tools to simulate effects and costs of various nutrient abatement strategies. Hierarchically connected databases and models of the entire catchment have been created to allow decision makers to view scenarios via the decision support system NEST. Increased intensity in agriculture in transient countries would result in increased nutrient loads to the Baltic Sea, particularly from Poland, the Baltic States, and Russia. Nutrient retentions are high, which means that the nutrient reduction goals of 135 000 tons N and 15 000 tons P, as formulated in the BSAP from 2007, correspond to a reduction in nutrient loadings to watersheds by 675 000 tons N and 158 000 tons P. A cost-minimization model was used to allocate nutrient reductions to measures and countries where the costs for reducing loads are low. The minimum annual cost to meet BSAP basin targets is estimated to 4.7 billion Euro.
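
    The cost-minimization step can be illustrated with a toy linear program that allocates a basin-wide reduction target across abatement measures with different unit costs and capacities; the numbers below are hypothetical and unrelated to the NEST model:

      from scipy.optimize import linprog

      # Hypothetical abatement measures: unit cost (kEuro per tonne N) and capacity (tonnes N per year).
      costs = [2.0, 5.0, 9.0]
      capacities = [40_000, 60_000, 80_000]
      target = 135_000            # required basin-wide N load reduction, tonnes per year

      # Minimise total cost subject to: sum of reductions >= target, 0 <= x_i <= capacity_i.
      res = linprog(c=costs,
                    A_ub=[[-1.0, -1.0, -1.0]], b_ub=[-target],
                    bounds=list(zip([0.0, 0.0, 0.0], capacities)))

      print("allocation across measures (t/yr):", res.x)
      print("minimum annual cost (kEuro):", res.fun)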

  11. EFSA Panel on Biological Hazards (BIOHAZ); Scientific Opinion on a quantitative estimation of the public health impact of setting a new target for the reduction of Salmonella in broilers

    DEFF Research Database (Denmark)

    Hald, Tine

    This assessment relates the percentage of broiler-associated human salmonellosis cases to different Salmonella prevalences in broiler flocks in the European Union. It considers the contribution and relevance of different Salmonella serovars found in broilers to human salmonellosis. The model......, laying hens (eggs), pigs and turkeys respectively. Of the broiler-associated human salmonellosis cases, around 42% and 23% are estimated to be due to the serovars Salmonella Enteritidis and Salmonella Infantis respectively, while other serovars individually contributed less than 5%. Different scenarios...

  12. Automatic Calibration Of Manual Machine Tools

    Science.gov (United States)

    Gurney, Rex D.

    1990-01-01

    Modified scheme uses data from multiple positions and eliminates tedious positioning. Modification of computer program adapts calibration system for convenient use with manually-controlled machine tools. Developed for use on computer-controlled tools. Option added to calibration program allows data on random tool-axis positions to be entered manually into computer for reduction. Instead of setting axis to predetermined positions, operator merely sets it at variety of arbitrary positions.

  13. Model reduction of parametrized systems

    CERN Document Server

    Ohlberger, Mario; Patera, Anthony; Rozza, Gianluigi; Urban, Karsten

    2017-01-01

    The special volume offers a global guide to new concepts and approaches concerning the following topics: reduced basis methods, proper orthogonal decomposition, proper generalized decomposition, approximation theory related to model reduction, learning theory and compressed sensing, stochastic and high-dimensional problems, system-theoretic methods, nonlinear model reduction, reduction of coupled problems/multiphysics, optimization and optimal control, state estimation and control, reduced order models and domain decomposition methods, Krylov-subspace and interpolatory methods, and applications to real industrial and complex problems. The book represents the state of the art in the development of reduced order methods. It contains contributions from internationally respected experts, guaranteeing a wide range of expertise and topics. Further, it reflects an important effort, carried out over the last 12 years, to build a growing research community in this field. Though not a textbook, some of the chapters ca...

  14. Tool-specific performance of vibration-reducing gloves for attenuating palm-transmitted vibrations in three orthogonal directions.

    Science.gov (United States)

    Dong, Ren G; Welcome, Daniel E; Peterson, Donald R; Xu, Xueyan S; McDowell, Thomas W; Warren, Christopher; Asaki, Takafumi; Kudernatsch, Simon; Brammer, Antony

    2014-11-01

    Vibration-reducing (VR) gloves have been increasingly used to help reduce vibration exposure, but it remains unclear how effective these gloves are. The purpose of this study was to estimate tool-specific performances of VR gloves for reducing the vibrations transmitted to the palm of the hand in three orthogonal directions (3-D) in an attempt to assess glove effectiveness and aid in the appropriate selection of these gloves. Four typical VR gloves were considered in this study, two of which can be classified as anti-vibration (AV) gloves according to the current AV glove test standard. The average transmissibility spectrum of each glove in each direction was synthesized based on spectra measured in this study and other spectra collected from reported studies. More than seventy vibration spectra of various tools or machines were considered in the estimations, which were also measured in this study or collected from reported studies. The glove performance assessments were based on the percent reduction of frequency-weighted acceleration as is required in the current standard for assessing the risk of vibration exposures. The estimated tool-specific vibration reductions of the gloves indicate that the VR gloves could slightly reduce (<5%) or marginally amplify (<10%) the vibrations generated from low-frequency (<25 Hz) tools or those vibrating primarily along the axis of the tool handle. With other tools, the VR gloves could reduce palm-transmitted vibrations in the range of 5%-58%, primarily depending on the specific tool and its vibration spectra in the three directions. The two AV gloves were not more effective than the other gloves with some of the tools considered in this study. The implications of the results are discussed.

  15. Natural Attenuation Software (NAS): A computer program for estimating remediation times of contaminated groundwater

    Science.gov (United States)

    Mendez, E.; Widdowson, M.; Brauner, S.; Chapelle, F.; Casey, C.; ,

    2004-01-01

    This paper describes the development and application of a modeling system called Natural Attenuation Software (NAS). NAS was designed as a screening tool to estimate times of remediation (TORs), associated with monitored natural attenuation (MNA), to lower groundwater contaminant concentrations to regulatory limits. Natural attenuation processes that NAS models include advection, dispersion, sorption, biodegradation, and non-aqueous phase liquid (NAPL) dissolution. This paper discusses the three main interactive components of NAS: 1) estimation of the target source concentration required for a plume extent to contract to regulatory limits, 2) estimation of the time required for NAPL contaminants in the source area to attenuate to a predetermined target source concentration, and 3) estimation of the time required for a plume extent to contract to regulatory limits after source reduction. The model's capability is illustrated by results from a case study at an MNA site, where NAS time of remediation estimates compared well with observed monitoring data over multiple years.
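
    Under a simple first-order attenuation assumption, the time for a source concentration to decay to a regulatory limit follows t = ln(C0 / Ctarget) / k. A back-of-the-envelope sketch with hypothetical values (not NAS itself):

      import math

      def time_of_remediation(c0, c_target, k_per_year):
          # Years for a concentration to attenuate from c0 to c_target at first-order rate k.
          return math.log(c0 / c_target) / k_per_year

      # Hypothetical values: 5 mg/L source, 0.005 mg/L regulatory limit, k = 0.3 per year.
      print("estimated TOR: %.1f years" % time_of_remediation(5.0, 0.005, 0.3))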

  16. Development of an expert data reduction assistant

    Science.gov (United States)

    Miller, Glenn E.; Johnston, Mark D.; Hanisch, Robert J.

    1993-01-01

    We propose the development of an expert system tool for the management and reduction of complex datasets. The proposed work is an extension of a successful prototype system for the calibration of CCD (charge coupled device) images developed by Dr. Johnston in 1987 (ref.: Proceedings of the Goddard Conference on Space Applications of Artificial Intelligence). The reduction of complex multi-parameter data sets presents severe challenges to a scientist. Not only must a particular data analysis system be mastered (e.g. IRAF/SDAS/MIDAS), large amounts of data can require many days of tedious work and supervision by the scientist for even the most straightforward reductions. The proposed Expert Data Reduction Assistant will help the scientist overcome these obstacles by developing a reduction plan based on the data at hand and producing a script for the reduction of the data in a target common language.

  17. Vowel Reduction in Japanese

    Institute of Scientific and Technical Information of China (English)

    Shirai; Setsuko

    2009-01-01

    This paper reports that vowel reduction occurs in Japanese and that vowel reduction is part of a cross-language universal. Compared with English, the effect of vowel reduction in Japanese is relatively weak, possibly because of the absence of stress in Japanese. Since spectral vowel reduction occurs in Japanese, various types of research become possible.

  18. Web tool to estimate the impact radiological associated transport of radioactive waste to ATC; Herramienta WEB para calcular el impacto radiologico asociado al transporte de residuos radiactivos al ATC

    Energy Technology Data Exchange (ETDEWEB)

    Calleja Rubio, J. A.; Gutierrez, F.; Colon, C.

    2012-07-01

    The transport of radioactive materials is a topic of renewed interest in our country due to the increasing mobility to be expected, especially after the entry into operation of the centralized interim storage facility (ATC) planned for the coming years. The web tool makes it possible to estimate the radiological impact associated with transporting radioactive waste to the ATC.

  19. Infinitary Combinatory Reduction Systems: Normalising Reduction Strategies

    NARCIS (Netherlands)

    Ketema, Jeroen; Simonsen, Jakob Grue

    2010-01-01

    We study normalising reduction strategies for infinitary Combinatory Reduction Systems (iCRSs). We prove that all fair, outermost-fair, and needed-fair strategies are normalising for orthogonal, fully-extended iCRSs. These facts properly generalise a number of results on normalising strategies in fi

  20. Estimating Risk Parameters

    OpenAIRE

    Aswath Damodaran

    1999-01-01

    Over the last three decades, the capital asset pricing model has occupied a central and often controversial place in most corporate finance analysts’ tool chests. The model requires three inputs to compute expected returns – a riskfree rate, a beta for an asset and an expected risk premium for the market portfolio (over and above the riskfree rate). Betas are estimated, by most practitioners, by regressing returns on an asset against a stock index, with the slope of the regression being the b...
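
    The regression beta and the resulting CAPM expected return can be reproduced in a few lines; the return series and rate assumptions below are hypothetical placeholders:

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical monthly returns for a stock and a market index.
      market = rng.normal(0.008, 0.04, 60)
      stock = 0.002 + 1.2 * market + rng.normal(0.0, 0.03, 60)

      # Beta is the slope of the regression of stock returns on market returns.
      beta = np.cov(stock, market)[0, 1] / np.var(market, ddof=1)

      risk_free = 0.03        # assumed annual riskfree rate
      market_premium = 0.055  # assumed expected market risk premium
      expected_return = risk_free + beta * market_premium

      print("beta: %.2f, CAPM expected return: %.1f%%" % (beta, 100 * expected_return))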

  1. Drag reduction regime by hydrofoil and resistance estimation method for gliding-hydrofoil craft

    Institute of Scientific and Technical Information of China (English)

    唐建飞; 杨帅

    2015-01-01

    This paper presents the drag reduction mechanism and regime of hydrofoils and their effect for three types of high-speed craft, and gives a full-scale resistance prediction method for a planing boat fitted with a fore hydrofoil (i.e., a gliding-hydrofoil craft). A numerical model, based on a three-dimensional nonlinear vortex lattice method, is developed for the hydrodynamic prediction of a single hydrofoil, a hydrofoil assembly, or a multi-hydrofoil system. Numerical results agree well with test results, so the method can be used to compute the hydrodynamic performance of the isolated hydrofoil within the resistance estimation for a gliding-hydrofoil craft. The example results show that the method presented in this paper can be used for the preliminary design of single hydrofoils, hydrofoil assemblies, multi-hydrofoil systems, and planing boats fitted with drag-reducing hydrofoils.

  2. Estimation of Generating Cost Reduction as a Result of Demand Response

    Institute of Scientific and Technical Information of China (English)

    翟桥柱; 王凌云

    2014-01-01

    One of the most important aims of time-of-use electricity pricing is to encourage customers to adjust their load profiles. As a result, the aggregate system load profile of a grid may become more favourable, i.e., the corresponding generating costs, fuel consumption and emissions in electricity generation will be reduced. Estimating these potential benefits before the implementation of time-of-use pricing and finding the most satisfactory system load profile are very important in determining a suitable level of electricity prices. In this paper, the potential benefits resulting from demand response are estimated by solving a special kind of security-constrained unit commitment (SCUC) problem, and the optimal system load profile is also obtained. Numerical tests were performed for two example systems and the results analyzed. Because the electricity price curve and the load curve are closely linked, the proposed method may provide a reference for finding suitable pricing strategies.

  3. Certifying Tools for Test Reduction in Open Architecture

    Science.gov (United States)

    2012-04-30

    requirements, requirements on the subsystems, and a sound software evolution model capturing the design rationale (Rajkumar, Lee, Sha, & Stankovic, 2010) ... International Conference on Embedded Systems and Applications. Rajkumar, R., Lee, I., Sha, L., & Stankovic, J. (2010). Cyber-physical systems: The

  4. Genome-Enabled Molecular Tools for Reductive Dehalogenation

    Science.gov (United States)

    2011-11-01

    The vinyl chloride reductase operon, vcrABC, of Dehalococcoides sp. strain VS is embedded in a horizontally-acquired genomic island that integrated at the single-copy gene ssrA.

  5. Lean-Six Sigma: tools for rapid cycle cost reduction.

    Science.gov (United States)

    Caldwell, Chip

    2006-10-01

    Organizational costs can be grouped as process cost, cost of quality, and cost of poor quality. Providers should train managers in the theory and application of Lean-Six Sigma, including the seven categories of waste and how to remove them. Healthcare financial executives should work with managers in eliminating waste to improve service and reduce costs.

  6. Downhole tool with replaceable tool sleeve sections

    Energy Technology Data Exchange (ETDEWEB)

    Case, W. A.

    1985-10-29

    A downhole tool for insertion in a drill stem includes elongated cylindrical half sleeve tool sections adapted to be non-rotatably supported on an elongated cylindrical body. The tool sections are mountable on and removable from the body without disconnecting either end of the tool from a drill stem. The half sleeve tool sections are provided with tapered axially extending flanges on their opposite ends which fit in corresponding tapered recesses formed on the tool body and the tool sections are retained on the body by a locknut threadedly engaged with the body and engageable with an axially movable retaining collar. The tool sections may be drivably engaged with axial keys formed on the body or the tool sections may be formed with flat surfaces on the sleeve inner sides cooperable with complementary flat surfaces formed on a reduced diameter portion of the body around which the tool sections are mounted.

  7. Parameter Estimation

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Heitzig, Martina; Cameron, Ian;

    2011-01-01

    of optimisation techniques coupled with dynamic solution of the underlying model. Linear and nonlinear approaches to parameter estimation are investigated. There is also the application of maximum likelihood principles in the estimation of parameters, as well as the use of orthogonal collocation to generate a set...
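
    A generic illustration of the least-squares parameter estimation described in this chapter (equivalent to maximum likelihood under Gaussian measurement noise) for a simple dynamic model. The first-order decay model and the data points below are hypothetical, not those used in the chapter.

      # Fit the rate constant of dC/dt = -k*C from noisy concentration measurements.
      import numpy as np
      from scipy.optimize import curve_fit

      def first_order(t, c0, k):
          """Analytical solution of dC/dt = -k*C."""
          return c0 * np.exp(-k * t)

      t_data = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
      c_data = np.array([1.00, 0.62, 0.37, 0.22, 0.14, 0.08])   # hypothetical noisy data

      params, covariance = curve_fit(first_order, t_data, c_data, p0=[1.0, 0.5])
      c0_hat, k_hat = params
      print(f"estimated c0 = {c0_hat:.3f}, k = {k_hat:.3f}")
      print("parameter standard errors:", np.sqrt(np.diag(covariance)))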

  8. 小卫星CCD相机MTF在轨测量与图像复原%On-orbit MTF estimation and restoration for CCD cameras of environment and disaster reduction small satellites (HJ 1A/1B)

    Institute of Scientific and Technical Information of China (English)

    杨贵军; 邢著荣; 黄文江; 王纪华

    2011-01-01

    Focusing on the CCD cameras of the environment and disaster reduction small satellites (HJ-1A/1B), an approach for on-orbit measurement of the modulation transfer function (MTF) of the imaging system is proposed. Sharp edges between two homogeneous ground objects are selected from the images; the sampled edge profiles are smoothed, curve-fitted and processed by a Fourier transform of the line spread function (LSF) to obtain the point spread function (PSF), from which the MTF curve corresponding to the PSF is extracted and the best MTF curve selected. Considering the complexity of noise statistics and estimation and the limited generality of constrained restoration methods, a frequency-domain deconvolution restoration method is adopted instead; its advantage is that it does not require sensor or atmospheric parameters. The restored images are then analysed and evaluated quantitatively in terms of statistical features, sharpness and texture. The results show that the quality of the HJ-1A/1B images is clearly improved and that good restoration is achieved.
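
    A simplified one-dimensional sketch of the knife-edge MTF measurement chain sketched above: edge spread function (ESF) to line spread function (LSF) to |FFT| to MTF. The synthetic edge profile is a hypothetical stand-in for the profile that would be extracted across a sharp boundary in the satellite image, and the processing here is deliberately minimal compared with the smoothing and curve fitting the authors describe.

      import numpy as np

      x = np.linspace(-5, 5, 101)
      esf = 0.5 * (1 + np.tanh(x / 0.8))      # synthetic blurred edge profile (ESF)

      lsf = np.gradient(esf, x)               # LSF is the derivative of the ESF
      lsf /= lsf.sum()                        # normalise so that MTF(0) = 1

      mtf = np.abs(np.fft.rfft(lsf))
      freqs = np.fft.rfftfreq(len(lsf), d=x[1] - x[0])

      for f, m in list(zip(freqs, mtf))[:5]:
          print(f"spatial frequency {f:.2f}: MTF = {m:.3f}")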

  9. Local reduction in physics

    Science.gov (United States)

    Rosaler, Joshua

    2015-05-01

    A conventional wisdom about the progress of physics holds that successive theories wholly encompass the domains of their predecessors through a process that is often called "reduction." While certain influential accounts of inter-theory reduction in physics take reduction to require a single "global" derivation of one theory's laws from those of another, I show that global reductions are not available in all cases where the conventional wisdom requires reduction to hold. However, I argue that a weaker "local" form of reduction, which defines reduction between theories in terms of a more fundamental notion of reduction between models of a single fixed system, is available in such cases and moreover suffices to uphold the conventional wisdom. To illustrate the sort of fixed-system, inter-model reduction that grounds inter-theoretic reduction on this picture, I specialize to a particular class of cases in which both models are dynamical systems. I show that reduction in these cases is underwritten by a mathematical relationship that follows a certain liberalized construal of Nagel/Schaffner reduction, and support this claim with several examples. Moreover, I show that this broadly Nagelian analysis of inter-model reduction encompasses several cases that are sometimes cited as instances of the "physicist's" limit-based notion of reduction.

  10. A Smart Thermal Block Diagram Tool

    Science.gov (United States)

    Tsuyuki, Glenn; Miyake, Robert; Dodge, Kyle

    2008-01-01

    The presentation describes a Smart Thermal Block Diagram Tool. It is used by JPL's Team X in studying missions during Pre-Phase A. It helps generate cost and mass estimates using proprietary databases.

  11. Reduction in language testing

    DEFF Research Database (Denmark)

    Dimova, Slobodanka; Jensen, Christian

    2013-01-01

    This study represents an initial exploration of raters' comments and actual realisations of form reductions in L2 test speech performances. Performances of three L2 speakers were selected as case studies and illustrations of how reductions are evaluated by the raters. The analysis is based on audio/video recorded speech samples and written reports produced by two experienced raters after testing. Our findings suggest that reduction or reduction-like pronunciation features are found in tested L2 speech, but whenever raters identify and comment on such reductions, they tend to assess reductions negatively.

  12. Un outil moderne d'estimation pour l'industrie pétrolière : les modèles mathématiques de coût A Modern Tool of Estimation for the Oil Industry: Mathematical Models of Cost

    Directory of Open Access Journals (Sweden)

    Fournier G.

    2006-11-01

    ... the costs and schedules of design and manufacturing for the product considered. Increasingly rapid technological evolution, however, often rules out classical estimation methods: either they require too detailed a description of the activities needed to design the product (the analytical or detailed approach), or they require too many closely related reference points (the analogical approach). A complementary tool is therefore needed, and mathematical models of cost have been developed to answer this need. They are based on a functional description and use universal relationships that stem from the following principle: the cost of a piece of equipment is linked to its internal thermodynamics. In other words, other things being equal, equipment is more expensive the more energy it handles per unit of mass, or the less mass it requires to handle a given amount of energy. In practice, reference is still made to earlier products, but these need not be analogous to the one being valued; they are used to calibrate the "corporate culture" of the company. The purpose of the work undertaken at the Institut Français du Pétrole (IFP) is to show that this kind of method can be very useful for the petroleum industry. Two real examples (the Packinox plate heat exchangers and automotive engines) are dealt with in the last part of this article; they follow a more theoretical part that presents the different approaches and emphasizes their respective advantages and drawbacks.
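
    An illustrative sketch of a parametric cost-estimating relationship of the kind described above: cost modelled as a power law of the specific energy handled (energy per unit mass), calibrated on a few reference products by log-log regression. All figures are hypothetical placeholders, not IFP data.

      import numpy as np

      specific_energy = np.array([1.2, 2.5, 4.0, 8.0])        # kW per kg, reference equipment
      observed_cost = np.array([110.0, 180.0, 260.0, 430.0])  # arbitrary cost units

      # Fit log(cost) = log(a) + b * log(specific_energy) by least squares.
      b, log_a = np.polyfit(np.log(specific_energy), np.log(observed_cost), 1)
      a = np.exp(log_a)

      new_specific_energy = 5.0
      print(f"estimated cost: {a * new_specific_energy ** b:.0f} cost units")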

  13. Chromatography process development in the quality by design paradigm I: Establishing a high-throughput process development platform as a tool for estimating "characterization space" for an ion exchange chromatography step.

    Science.gov (United States)

    Bhambure, R; Rathore, A S

    2013-01-01

    This article describes the development of a high-throughput process development (HTPD) platform for developing chromatography steps. An assessment of the platform as a tool for establishing the "characterization space" for an ion exchange chromatography step has been performed by using design of experiments. Case studies involving the use of a biotech therapeutic, granulocyte colony-stimulating factor, have been used to demonstrate the performance of the platform. We discuss the various challenges that arise when working at such small volumes along with the solutions that we propose to alleviate these challenges to make the HTPD data suitable for empirical modeling. Further, we have also validated the scalability of this platform by comparing the results from the HTPD platform (2 and 6 μL resin volumes) against those obtained at the traditional laboratory scale (resin volume, 0.5 mL). We find that after integration of the proposed correction factors, the HTPD platform is capable of performing the process optimization studies at 170-fold higher productivity. The platform is capable of providing semi-quantitative assessment of the effects of the various input parameters under consideration. We think that a platform such as the one presented is an excellent tool for examining the "characterization space" and reducing the extensive experimentation at the traditional lab scale that is otherwise required for establishing the "design space." Thus, this platform will specifically aid in successful implementation of quality by design in biotech process development. This is especially significant in view of the constraints with respect to time and resources that the biopharma industry faces today. Copyright © 2013 American Institute of Chemical Engineers.
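
    A minimal sketch of the full-factorial design of experiments used to map a "characterization space". The factor names and levels below are hypothetical, not those of the cited chromatography study.

      from itertools import product

      factors = {
          "pH": [6.5, 7.0, 7.5],
          "conductivity_mS_cm": [5, 10, 15],
          "load_mg_per_mL_resin": [10, 30],
      }

      # Enumerate every combination of factor levels (a full factorial design).
      design = list(product(*factors.values()))
      print(f"{len(design)} experiments")
      for run in design[:5]:
          print(dict(zip(factors.keys(), run)))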

  14. Fractionation of Nitrogen and Oxygen Isotopes During Microbial Nitrate Reduction

    Science.gov (United States)

    Lehmann, M. F.; Bernasconi, S. M.; Reichert, P.; Barbieri, A.; McKenzie, J. A.

    2001-12-01

    Lakes represent an important continental sink of fixed nitrogen. Besides the burial of particulate nitrogen, fixed nitrogen is eliminated from lakes by emission of N2 and N2O to the atmosphere during dissimilative nitrate reduction within suboxic and anoxic waters or sediments. The understanding and quantification of this efficient nitrogen removal process in eutrophic lakes is crucial for nitrogen budget modelling and the application and evaluation of lake restoration measures. In order to use natural abundance N and O isotope ratios as tracers for microbial nitrate reduction and to obtain quantitative estimates on its intensity, it is crucial to constrain the associated isotope fractionation. This is the first report of nitrogen and oxygen isotope effects associated with microbial nitrate reduction in lacustrine environments. Nitrate reduction in suboxic and anoxic waters of the southern basin of Lake Lugano (Switzerland) is demonstrated by a progressive nitrate depletion coupled to increasing δ15N and δ18O values for residual nitrate. 15N and 18O enrichment factors (ε) were estimated using a closed-system (Rayleigh-distillation) model and a dynamic reaction-diffusion model. Calculated enrichment factors ε ranged between -11.2 and -22‰ for 15N and between -6.6 and -11.3‰ for 18O, with both nitrogen and oxygen isotope fractionation being greatest during times with the highest nitrate reduction rates. The closed-system model neglects vertical diffusive mixing and does not distinguish between sedimentary and water-column nitrate reduction. Therefore, it tends to underestimate the intrinsic isotope effect of microbial nitrate reduction. Based upon results from earlier studies that indicate that nitrate reduction in sediments displays a highly reduced N-isotope effect (Brandes and Devol, 1997), model-derived enrichment factors could be used to discern the relative importance of nitrate reduction in the water column and in the sediment. Sedimentary nitrate
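
    A minimal sketch of the closed-system (Rayleigh) estimate of an enrichment factor: in that model the residual-pool composition follows δ = δ0 + ε·ln(f), where f is the fraction of nitrate remaining, so ε is the slope of δ versus ln(f). The data points below are hypothetical, not the Lake Lugano measurements.

      import numpy as np

      f = np.array([1.00, 0.80, 0.60, 0.40, 0.25])           # residual nitrate fraction
      delta15N = np.array([5.0, 8.0, 11.5, 17.0, 22.0])      # per mil, residual nitrate

      eps, delta0 = np.polyfit(np.log(f), delta15N, 1)
      print(f"enrichment factor eps = {eps:.1f} per mil (intercept delta0 = {delta0:.1f})")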

  15. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning...... of a new sustainable settlement. The use of design tools is discussed in relation to innovation and stakeholder participation, and it is stressed that the usefulness of design tools is context dependent....

  16. Modern Reduction Methods

    CERN Document Server

    Andersson, Pher G

    2008-01-01

    With its comprehensive overview of modern reduction methods, this book features high quality contributions allowing readers to find reliable solutions quickly and easily. The monograph treats the reduction of carbonyls, alkenes, imines and alkynes, as well as reductive aminations and cross- and Heck couplings, before finishing off with sections on kinetic resolutions and hydrogenolysis. An indispensable lab companion for every chemist.

  17. Sulfate reduction in freshwater peatlands

    Energy Technology Data Exchange (ETDEWEB)

    Oequist, M.

    1996-12-31

    This text consists of two parts: Part A is a literature review on microbial sulfate reduction with emphasis on freshwater peatlands, and Part B presents the results from a study of the relative importance of sulfate reduction and methane formation for the anaerobic decomposition in a boreal peatland. The relative importance of sulfate reduction and methane production for the anaerobic decomposition was studied in a small raised bog situated in the boreal zone of southern Sweden. Depth distributions of sulfate reduction and methane production rates were measured in peat sampled from three sites (A, B, and C) forming a minerotrophic-ombrotrophic gradient. SO₄²⁻ concentrations in the three profiles were of equal magnitude and ranged from 50 to 150 µM. In contrast, rates of sulfate reduction were vastly different: maximum rates in the three profiles were obtained at a depth of about 20 cm below the water table; in A it was 8 µM h⁻¹, while in B and C they were 1 and 0.05 µM h⁻¹, respectively. Methane production rates, however, were more uniform across the three nutrient regimes. Maximum rates in A (ca. 1.5 µg d⁻¹ g⁻¹) were found 10 cm below the water table, in B (ca. 1.0 µg d⁻¹ g⁻¹) in the vicinity of the water table, and in C (0.75 µg d⁻¹ g⁻¹) 20 cm below the water table. In all profiles both sulfate reduction and methane production rates were negligible above the water table. The areal estimates of methane production for the profiles were 22.4, 9.0 and 6.4 mmol m⁻² d⁻¹, while the estimates for sulfate reduction were 26.4, 2.5, and 0.1 mmol m⁻² d⁻¹, respectively. The calculated turnover times at the sites were 1.2, 14.2, and 198.7 days, respectively. The study shows that sulfate-reducing bacteria are important for the anaerobic degradation in the studied peatland, especially at the minerotrophic sites, while methanogenic bacteria dominate at the ombrotrophic sites. Examination

  18. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  19. Perceptual effects of noise reduction by time-frequency masking of noisy speech.

    Science.gov (United States)

    Brons, Inge; Houben, Rolph; Dreschler, Wouter A

    2012-10-01

    Time-frequency masking is a method for noise reduction that is based on the time-frequency representation of a speech-in-noise signal. Depending on the estimated signal-to-noise ratio (SNR), each time-frequency unit is either attenuated or not. A special type of time-frequency mask is the ideal binary mask (IBM), which has access to the real SNR (ideal). The IBM either retains or removes each time-frequency unit (binary mask). The IBM provides large improvements in speech intelligibility and is a valuable tool for investigating how different factors influence intelligibility. This study extends the standard outcome measure (speech intelligibility) with additional perceptual measures relevant for noise reduction: listening effort, noise annoyance, speech naturalness, and overall preference. Four types of time-frequency masking were evaluated: the original IBM, a tempered version of the IBM (called ITM) which applies limited and non-binary attenuation, and non-ideal masking (also tempered) with two different types of noise-estimation algorithms. The results from ideal masking imply that there is a trade-off between intelligibility and sound quality, which depends on the attenuation strength. Additionally, the results for non-ideal masking suggest that subjective measures can show effects of noise reduction even if noise reduction does not lead to differences in intelligibility.
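
    A minimal sketch of an ideal binary mask, assuming separate access to the clean speech and noise signals (the "ideal" part): each time-frequency unit of the noisy mixture is kept when its local SNR exceeds a threshold and removed otherwise. The signals below are synthetic placeholders, and the 0 dB criterion is only one common choice.

      import numpy as np
      from scipy.signal import stft, istft

      fs = 16000
      t = np.arange(fs) / fs
      speech = 0.5 * np.sin(2 * np.pi * 440 * t) * (t < 0.6)   # toy stand-in for speech
      noise = 0.05 * np.random.randn(fs)
      mixture = speech + noise

      _, _, S = stft(speech, fs=fs, nperseg=512)
      _, _, N = stft(noise, fs=fs, nperseg=512)
      _, _, X = stft(mixture, fs=fs, nperseg=512)

      # Local SNR per time-frequency unit; keep units above 0 dB, remove the rest.
      local_snr_db = 10 * np.log10((np.abs(S) ** 2 + 1e-12) / (np.abs(N) ** 2 + 1e-12))
      ibm = (local_snr_db > 0.0).astype(float)

      _, enhanced = istft(ibm * X, fs=fs, nperseg=512)
      print("kept", int(ibm.sum()), "of", ibm.size, "time-frequency units")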

  20. Results Evaluation in Reduction Rhinoplasty

    Directory of Open Access Journals (Sweden)

    Arima, Lisandra Megumi

    2011-01-01

    Introduction: Evaluation of the final result after rhinoplasty from the patient's viewpoint is not a widely studied topic. Objective: To evaluate the satisfaction of patients submitted to reduction rhinoplasty, using the Rhinoplasty Outcomes Evaluation (ROE) questionnaire. Method: Longitudinal, retrospective study of preoperative and postoperative satisfaction. The sample comprised 28 patients who underwent rhinoplasty and answered the ROE questionnaire. Three variables were obtained: the satisfaction score for the patient's image before surgery, the satisfaction score for the current appearance, and the difference between the mean postoperative and preoperative satisfaction scores. Results: The postoperative score was higher than the preoperative score in all patients, with a mean difference of 48.3 between postoperative and preoperative scores; a score above 75, considered an excellent outcome, was reached by 67.9% of patients. Conclusions: The ROE questionnaire is a helpful tool for measuring the satisfaction of patients submitted to reduction rhinoplasty. About 92% of the patients submitted to reduction rhinoplasty considered the postoperative result to be good or excellent.

  1. Multi-Criteria Decision Analysis as a tool to extract fishing footprints and estimate fishing pressure: application to small scale coastal fisheries and implications for management in the context of the Maritime Spatial Planning Directive

    Directory of Open Access Journals (Sweden)

    S. KAVADAS

    2015-07-01

    In the context of the Maritime Spatial Planning Directive and with the intention of contributing to the implementation of a future maritime spatial plan, it was decided to analyze data from the small scale coastal fisheries sector of Greece and estimate the actual extent of its activities, which is largely unknown to date. To this end we identified the most influential components affecting coastal fishing: fishing capacity, bathymetry, distance from coast, Sea Surface Chlorophyll (Chl-a) concentration, legislation, marine traffic activity, trawler and purse-seiner fishing effort and no-take zones. By means of Multi-Criteria Decision Analysis (MCDA), conducted through a stepwise procedure, the potential fishing footprint with the corresponding fishing intensity was derived. The method provides an innovative and cost-effective way to assess the impact of the notoriously hard-to-assess coastal fleet. It was further considered how the inclusion of all relevant anthropogenic activities (besides fishing) could provide the background needed to plan future marine activities in the framework of Marine Spatial Planning (MSP) and form the basis for a more realistic management approach.
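
    A schematic sketch of the multi-criteria overlay idea behind such a footprint estimate: normalise each spatial criterion layer, combine the layers with weights, and mask out cells where fishing is excluded. The grids, weights and criterion names below are small hypothetical examples, not the actual Greek coastal-fisheries data or the weighting used in the study.

      import numpy as np

      def normalise(layer):
          return (layer - layer.min()) / (layer.max() - layer.min() + 1e-12)

      # 3 x 4 grids of hypothetical criterion values (higher = more favourable for fishing).
      depth_suitability = np.array([[0.9, 0.8, 0.4, 0.1],
                                    [0.8, 0.7, 0.3, 0.1],
                                    [0.6, 0.5, 0.2, 0.0]])
      distance_to_coast = np.array([[1.0, 0.7, 0.4, 0.2],
                                    [0.9, 0.6, 0.3, 0.1],
                                    [0.8, 0.5, 0.2, 0.1]])
      chl_a = np.array([[0.3, 0.5, 0.6, 0.4],
                        [0.4, 0.6, 0.7, 0.5],
                        [0.2, 0.4, 0.5, 0.3]])
      no_take_zone = np.array([[0, 0, 0, 0],
                               [0, 1, 0, 0],
                               [0, 0, 0, 0]])   # 1 = fishing excluded

      weights = {"depth": 0.4, "distance": 0.4, "chl": 0.2}   # assumed weights
      score = (weights["depth"] * normalise(depth_suitability)
               + weights["distance"] * normalise(distance_to_coast)
               + weights["chl"] * normalise(chl_a))
      fishing_intensity = np.where(no_take_zone == 1, 0.0, score)
      print(np.round(fishing_intensity, 2))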

  2. Geostatistics and remote sensing as predictive tools of tick distribution: a cokriging sy