WorldWideScience

Sample records for reduction estimation tool

  1. Spreadsheet tool for estimating noise reduction costs

    International Nuclear Information System (INIS)

    Frank, L.; Senden, V.; Leszczynski, Y.

    2009-01-01

    The Northeast Capital Industrial Association (NCIA) represents industry in Alberta's industrial heartland. The organization is in the process of developing a regional noise management plan (RNMP) for its member companies. The RNMP includes a noise reduction cost spreadsheet tool for reviewing the practical noise control treatments available for individual plant equipment, including the ranges of noise attenuation achievable, and for producing a budgetary prediction of the installed cost of those treatments. This paper discussed the noise reduction cost spreadsheet tool, with particular reference to noise control best practices and to the tool's development: prerequisites, assembly of the required data, approach, and the unit pricing database. Use and optimization of the spreadsheet tool were also discussed. It was concluded that the tool is an easy-to-use interactive means of estimating the implementation costs of different noise control strategies and options, and that it is very helpful in gaining insight for noise control planning purposes.
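
    A minimal sketch of the tool's core lookup logic, assuming a hypothetical treatment database; the treatments, attenuation ranges and unit prices below are illustrative placeholders, not NCIA figures:

      # Budgetary estimate from a hypothetical unit-pricing database.
      TREATMENTS = {
          # treatment: ((attenuation range, dB(A)), installed cost per unit, $)
          "acoustic enclosure": ((15, 25), 40_000),
          "inlet silencer":     ((10, 20), 12_000),
          "pipe lagging":       ((3, 5),    1_500),
      }

      def budgetary_estimate(selected):
          """Total installed cost and per-treatment attenuation ranges."""
          total = sum(TREATMENTS[t][1] * n for t, n in selected.items())
          ranges = {t: TREATMENTS[t][0] for t in selected}
          return total, ranges

      cost, attenuation = budgetary_estimate({"acoustic enclosure": 2, "pipe lagging": 10})
      print(f"Budgetary cost: ${cost:,}; attenuation ranges: {attenuation}")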

  2. Tools for estimating VMT reductions from built environment changes.

    Science.gov (United States)

    2013-06-01

    Built environment characteristics are associated with walking, bicycling, transit use, and vehicle miles traveled (VMT). Developing built environments supportive of walking, bicycling, and transit use can help meet state VMT reduction goals. But ...

  3. Quantitative Risk Reduction Estimation Tool for Control Systems: Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g., attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
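
    The risk definition above lends itself to a direct calculation. A minimal sketch, with all probabilities, losses and costs as hypothetical placeholders rather than values from the tool:

      def risk(p_success: float, loss_dollars: float) -> float:
          """Expected loss from a cyber attack: probability times consequence."""
          return p_success * loss_dollars

      baseline  = risk(p_success=0.10, loss_dollars=50e6)
      mitigated = risk(p_success=0.02, loss_dollars=50e6)  # after a mitigation
      reduction = baseline - mitigated

      # Cost-benefit support: a mitigation pays off when its cost is below
      # the risk reduction it buys.
      mitigation_cost = 1.5e6
      print(f"Risk reduction ${reduction:,.0f}; net benefit ${reduction - mitigation_cost:,.0f}")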

  4. Tools to support GHG emissions reduction: a regional effort, part 1 - carbon footprint estimation and decision support.

    Science.gov (United States)

    2010-09-01

    Tools are proposed for carbon footprint estimation of transportation construction projects and decision support for construction firms that must make equipment choice and usage decisions that affect profits, project duration, and greenhouse gas em...

  5. Toxicity Estimation Software Tool (TEST)

    Science.gov (United States)

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure-Activity Relationship (QSAR) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...

  6. Field scale modeling to estimate phosphorus and sediment load reductions using a newly developed graphical user interface for the Soil and Water Assessment Tool

    Science.gov (United States)

    Streams throughout the North Canadian River watershed in northwest Oklahoma, USA have elevated levels of nutrients and sediment. SWAT (Soil and Water Assessment Tool) was used to identify areas that likely contributed disproportionate amounts of phosphorus (P) and sediment to Lake Overholser, the re...

  7. Development of a simple estimation tool for LMFBR construction cost

    International Nuclear Information System (INIS)

    Yoshida, Kazuo; Kinoshita, Izumi

    1999-01-01

    A simple tool for estimating the construction costs of liquid-metal-cooled fast breeder reactors (LMFBRs), 'Simple Cost', was developed in this study. Simple Cost is based on a new estimation formula that reduces the amount of design data required to estimate construction costs. Consequently, Simple Cost can be used to estimate the construction costs of innovative LMFBR concepts for which detailed design has not been carried out. The results of test calculations show that Simple Cost provides cost estimates equivalent to those obtained with conventional methods within the range of plant power from 325 to 1500 MWe. Sensitivity analyses for typical design parameters were conducted using Simple Cost. The effects of four major parameters - reactor vessel diameter, core outlet temperature, sodium handling area and number of secondary loops - on the construction costs of LMFBRs were evaluated quantitatively. The results show that the reduction of sodium handling area is particularly effective in reducing construction costs. (author)
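
    A sketch of the reduced-parameter idea with a one-at-a-time sensitivity scan; the functional form and coefficients below are invented for illustration and do not reproduce the paper's formula:

      # Hypothetical construction-cost formula over the four major parameters.
      def construction_cost(vessel_diam_m, outlet_temp_c, sodium_area_m2, n_loops):
          return (120 * vessel_diam_m**1.5        # reactor vessel and internals
                  + 0.05 * sodium_area_m2         # sodium-handling buildings
                  + 90 * n_loops                  # secondary-loop equipment
                  + 0.2 * (outlet_temp_c - 500))  # high-temperature materials

      base = dict(vessel_diam_m=12, outlet_temp_c=530, sodium_area_m2=20_000, n_loops=3)
      for param, delta in [("vessel_diam_m", 1), ("sodium_area_m2", 2_000), ("n_loops", 1)]:
          perturbed = dict(base, **{param: base[param] + delta})
          change = construction_cost(**perturbed) - construction_cost(**base)
          print(f"{param} +{delta}: cost change {change:+.0f} (arbitrary units)")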

  8. Modelling stillbirth mortality reduction with the Lives Saved Tool

    Directory of Open Access Journals (Sweden)

    Hannah Blencowe

    2017-11-01

    Background The worldwide burden of stillbirths is large, with an estimated 2.6 million babies stillborn in 2015, including 1.3 million dying during labour. The Every Newborn Action Plan set a stillbirth target of ≤12 per 1000 in all countries by 2030. Planning tools will be essential as countries set policy and plan investment to scale up interventions to meet this target. This paper summarises the approach taken for modelling the impact of scaling up health interventions on stillbirths in the Lives Saved Tool (LiST) and potential future refinements. Methods The specific application to stillbirths of the general method for modelling the impact of interventions in LiST is described. The evidence for the effectiveness of potential interventions to reduce stillbirths is reviewed, and the assumptions about the affected fraction of stillbirths that could potentially benefit from these interventions are presented. The current assumptions and their effects on stillbirth reduction are described and potential future improvements discussed. Results High-quality evidence is not available for all parameters in the LiST stillbirth model. Cause-specific mortality data are not available for stillbirths, so stillbirths are modelled in LiST using an attributable-fraction approach by timing of stillbirth (antepartum/intrapartum). Of 35 potential interventions to reduce stillbirths identified, eight are currently modelled in LiST. These include childbirth care, induction for prolonged pregnancy, multiple micronutrient and balanced energy supplementation, malaria prevention, and detection and management of hypertensive disorders of pregnancy, diabetes and syphilis. For three of the interventions (childbirth care, detection and management of hypertensive disorders of pregnancy, and diabetes) the estimate of effectiveness is based on expert opinion through a Delphi process. Only for malaria is coverage information available, with coverage

  9. Careers Education: An Effective Tool for Poverty Reduction

    African Journals Online (AJOL)

    The research was carried out based mainly on secondary sources of data. Among other things, the study ...

  10. Establishing credible emission reduction estimates: GERT's experience

    International Nuclear Information System (INIS)

    Loseth, H.

    2001-01-01

    To address the challenge of reducing greenhouse gas emissions in Canada, the federal and provincial governments are developing strategies and policies to reach that goal. One of the proposed solutions is the establishment of an emission trading system, which it is believed would encourage investment in lower-cost reductions. The Greenhouse Gas Emission Reduction Trading (GERT) pilot was established in 1998 to examine emission trading; it represents the collaborative efforts of government, industry, and non-governmental organizations, and shows that it is possible to establish emission reduction trading outside of a regulated environment. An emission reduction is defined as an action which reduces emissions compared to what they would have been otherwise. The functioning of GERT was described from the initial application by a buyer/seller through the review process. The assessment of projects is based on mandatory criteria: reductions of emissions must be real, measurable, verifiable and surplus. One section of the presentation was devoted to landfill gas recovery project issues, another dealt with fuel substitution project issues, and a third discussed issues concerning emission reductions from off-site electricity projects.

  11. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  12. GIS Tools to Estimate Average Annual Daily Traffic

    Science.gov (United States)

    2012-06-01

    This project presents five tools that were created for a geographical information system to estimate Annual Average Daily Traffic using linear regression. Three of the tools can be used to prepare spatial data for linear regression. One tool can be...

  13. Effective tool wear estimation through multisensory information ...

    African Journals Online (AJOL)

    On-line tool wear monitoring plays a significant role in industrial automation for higher productivity and product quality. In addition, an intelligent system is required to make a timely decision for tool change in machining systems in order to avoid the subsequent consequences on the dimensional accuracy and surface finish ...

  14. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    Science.gov (United States)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Adapting COSTMODL to an organization's particular environment can yield a significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse-sensitive, with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
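
    Of the models listed, Basic COCOMO is compact enough to sketch; the constants below are Boehm's published Basic COCOMO coefficients, while the KISS and Incremental Development models are not reproduced here:

      # Basic COCOMO: effort = a * KLOC^b (person-months), schedule = c * effort^d (months).
      COEFFS = {  # mode: (a, b, c, d)
          "organic":       (2.4, 1.05, 2.5, 0.38),
          "semi-detached": (3.0, 1.12, 2.5, 0.35),
          "embedded":      (3.6, 1.20, 2.5, 0.32),
      }

      def basic_cocomo(kloc: float, mode: str = "organic"):
          a, b, c, d = COEFFS[mode]
          effort = a * kloc**b      # person-months
          schedule = c * effort**d  # calendar months
          return effort, schedule

      effort, months = basic_cocomo(32, "semi-detached")
      print(f"{effort:.0f} person-months over {months:.1f} months")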

  15. Estimation of toxicity using the Toxicity Estimation Software Tool (TEST)

    Science.gov (United States)

    Tens of thousands of chemicals are currently in commerce, and hundreds more are introduced every year. Since experimental measurements of toxicity are extremely time-consuming and expensive, it is imperative that alternative methods to estimate toxicity be developed.

  16. Estimate of Possible CO2 Emission Reduction in Slovenia

    International Nuclear Information System (INIS)

    Plavcak, V.-P.; Jevsek, F.; Tirsek, A.

    1998-01-01

    The first estimate of possible CO2 emission reduction, according to the obligations of the Kyoto Protocol, has been prepared. The results show that the required 8% reduction of greenhouse gases in Slovenia in the period from 2008 to 2012 relative to 1986 will require thorough analytical treatment not only in the electric power sector but also in the transport and industry sectors, which are the main emitters. (author)

  17. TEST (Toxicity Estimation Software Tool) Ver 4.1

    Science.gov (United States)

    The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to allow users to easily estimate toxicity and physical properties using a variety of QSAR methodologies. T.E.S.T. allows a user to estimate toxicity without requiring any external programs. Users can input a chem...

  18. Temporal rainfall estimation using input data reduction and model inversion

    Science.gov (United States)

    Wright, A. J.; Vrugt, J. A.; Walker, J. P.; Pauwels, V. R. N.

    2016-12-01

    Floods are devastating natural hazards. To provide accurate, precise and timely flood forecasts, there is a need to understand the uncertainties associated with temporal rainfall and model parameters. The estimation of temporal rainfall and model parameter distributions from streamflow observations in complex dynamic catchments adds skill to current areal rainfall estimation methods, allows the uncertainty of rainfall input to be considered when estimating model parameters, and provides the ability to estimate rainfall for poorly gauged catchments. Current methods to estimate temporal rainfall distributions from streamflow are unable to adequately explain and invert complex non-linear hydrologic systems. This study uses the Discrete Wavelet Transform (DWT) to reduce rainfall dimensionality for the catchment of Warwick, Queensland, Australia. The reduction of rainfall to DWT coefficients allows the input rainfall time series to be estimated simultaneously with the model parameters. The estimation process is conducted using multi-chain Markov chain Monte Carlo simulation with the DREAM(ZS) algorithm. The use of a likelihood function that considers both rainfall and streamflow error allows model parameter and temporal rainfall distributions to be estimated. Estimating the wavelet approximation coefficients of lower-order decomposition structures produced the most realistic temporal rainfall distributions. These rainfall estimates were all able to simulate streamflow superior to the results of a traditional calibration approach. It is shown that the choice of wavelet has a considerable impact on the robustness of the inversion. The results demonstrate that streamflow data contain sufficient information to estimate temporal rainfall and model parameter distributions. The extent and variance of rainfall time series that are able to simulate streamflow that is superior to that simulated by a traditional calibration approach is a
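
    A minimal sketch of the dimensionality-reduction step using PyWavelets; the wavelet, decomposition level and synthetic data are illustrative choices, not those of the study:

      import numpy as np
      import pywt

      rng = np.random.default_rng(1)
      rain = rng.gamma(0.3, 4.0, size=256)  # synthetic hyetograph

      # Represent the series by low-order approximation coefficients, so a
      # sampler estimates len(approx) values instead of len(rain).
      coeffs = pywt.wavedec(rain, "db4", level=4)
      approx, details = coeffs[0], coeffs[1:]
      print(f"{rain.size} rainfall values -> {approx.size} approximation coefficients")

      # Inverse step inside the likelihood: rebuild a rainfall series from
      # proposed approximation coefficients (detail coefficients zeroed).
      proposed = [approx] + [np.zeros_like(d) for d in details]
      rain_hat = np.clip(pywt.waverec(proposed, "db4")[:rain.size], 0, None)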

  19. Bats: A new tool for AMS data reduction

    International Nuclear Information System (INIS)

    Wacker, L.; Christl, M.; Synal, H.-A.

    2010-01-01

    A data evaluation program was developed at ETH Zurich to meet the requirements of the new compact AMS systems MICADAS and TANDY in addition to the large EN-Tandem accelerator. The program, called 'BATS', is designed to automatically calculate standard- and blank-corrected results for measured samples. After almost one year of routine operation with the MICADAS C-14 system, BATS has proven to be an easy-to-use data reduction tool that requires minimal user input. Here we present the fundamental principle and the algorithms used in BATS for standard-sized radiocarbon measurements.
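
    A simplified sketch of the standard- and blank-correction that such a tool automates; real AMS data reduction also handles fractionation, drift and error propagation, which are omitted here:

      def fraction_modern(r_sample, r_blank, r_standard, f_standard=1.0):
          """Blank-correct a measured 14C ratio and normalize to a standard."""
          return (r_sample - r_blank) / (r_standard - r_blank) * f_standard

      # Hypothetical isotope-ratio readings from three cathodes.
      f = fraction_modern(r_sample=0.62e-12, r_blank=0.02e-12, r_standard=1.18e-12)
      print(f"F14C = {f:.4f}")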

  20. An Overview of Tool for Response Action Cost Estimating (TRACE)

    International Nuclear Information System (INIS)

    Ferries, S.R.; Klink, K.L.; Ostapkowicz, B.

    2012-01-01

    Tools and techniques that provide improved performance and reduced costs are important to government programs, particularly in current times. An opportunity for improvement was identified for the preparation of cost estimates used to support the evaluation of response action alternatives. As a result, CH2M HILL Plateau Remediation Company has developed the Tool for Response Action Cost Estimating (TRACE). TRACE is a multi-page Microsoft Excel® workbook developed to introduce efficiencies into the timely and consistent production of cost estimates for response action alternatives. This tool combines costs derived from extensive site-specific runs of commercially available remediation cost models with site-specific, estimator-researched and derived costs, providing the best estimating sources available. TRACE also provides common quantity and key parameter links across multiple alternatives, maximizing ease of updating estimates and performing sensitivity analyses, and ensuring consistency.

  21. Risk Reduction with a Fuzzy Expert Exploration Tool

    Energy Technology Data Exchange (ETDEWEB)

    Robert S. Balch; Ron Broadhead

    2005-03-01

    Incomplete or sparse data such as geologic or formation characteristics introduce a high level of risk for oil exploration and development projects. "Expert" systems developed and used in several disciplines and industries have demonstrated beneficial results when working with sparse data. State-of-the-art expert exploration tools, relying on a database and computer maps generated by neural networks and user inputs, have been developed through the use of "fuzzy" logic, a mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk has been reduced with the use of these properly verified and validated Fuzzy Expert Exploration (FEE) Tools. Through the course of this project, FEE Tools and supporting software were developed for two producing formations in southeast New Mexico. Tools of this type can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In today's oil industry environment, many smaller exploration companies lack the resources of a pool of expert exploration personnel. Downsizing, volatile oil prices, and scarcity of domestic exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The FEE Tools benefit a diverse group in the U.S., allowing a more efficient use of scarce funds, potentially reducing dependence on foreign oil, and providing lower product prices for consumers.
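
    A toy illustration of the underlying fuzzy-logic idea: imprecise geologic evidence is mapped to membership grades and combined by rules. The membership shapes, variables and rule are invented for illustration, not taken from the FEE Tools:

      def triangular(x, a, b, c):
          """Triangular membership: rises from a, peaks at b, falls to c."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      porosity, curvature = 14.0, 0.7  # hypothetical map values at one grid cell
      good_porosity  = triangular(porosity, 8, 18, 28)
      high_curvature = triangular(curvature, 0.2, 0.8, 1.4)

      # Rule: IF porosity is good AND curvature is high THEN prospectivity is high.
      # min() is the standard fuzzy AND; the grade falls in [0, 1].
      prospectivity = min(good_porosity, high_curvature)
      print(f"Prospectivity grade: {prospectivity:.2f}")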

  22. Reduction of inequalities in health: assessing evidence-based tools

    Directory of Open Access Journals (Sweden)

    Shea Beverley

    2006-09-01

    Background The reduction of health inequalities is a focus of many national and international health organisations. The need for pragmatic evidence-based approaches has led to the development of a number of evidence-based equity initiatives. This paper describes a new program that focuses upon evidence-based tools, which are useful for policy initiatives that reduce inequities. Methods This paper is based on a presentation that was given at the "Regional Consultation on Policy Tools: Equity in Population Health Reports," held in Toronto, Canada in June 2002. Results Five assessment tools were presented. 1. A database of systematic reviews on the effects of educational, legal, social, and health interventions to reduce unfair inequalities is being established through the Cochrane and Campbell Collaborations. 2. Decision aids and shared decision making can be facilitated in disadvantaged groups by 'health coaches' who help people become better decision makers, negotiators, and navigators of the health system; a pilot study in Chile has provided proof of this concept. 3. The CIET Cycle: combining adapted cluster survey techniques with qualitative methods, CIET's population-based applications support evidence-based decision making at local and national levels. The CIET map generates maps directly from survey or routine institutional data, to be used as evidence-based decision aids. Complex data can be displayed attractively, providing an important tool for studying and comparing health indicators among and between different populations. 4. The Ottawa Equity Gauge is applying the Global Equity Gauge Alliance framework to an industrialised country setting. 5. The Needs-Based Health Assessment Toolkit, established to assemble information on which clinical and health policy decisions can be based, is being expanded to ensure a focus on distribution and average health indicators. Conclusion Evidence-based planning tools have much to offer the

  23. Estimation of toxicity using a Java-based software tool

    Science.gov (United States)

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  24. A Tool for Estimating Variability in Wood Preservative Treatment Retention

    Science.gov (United States)

    Patricia K. Lebow; Adam M. Taylor; Timothy M. Young

    2015-01-01

    Composite sampling is standard practice for evaluation of preservative retention levels in preservative-treated wood. Current protocols provide an average retention value but no estimate of uncertainty. Here we describe a statistical method for calculating uncertainty estimates using the standard sampling regime with minimal additional chemical analysis. This tool can...

  25. A MORET tool to assist code bias estimation

    International Nuclear Information System (INIS)

    Fernex, F.; Richet, Y.; Letang, E.

    2003-01-01

    This new Graphical User Interface (GUI), developed in Java, is one of the post-processing tools for the MORET4 code. It aims to help users estimate the importance of the k_eff bias due to the code in order to better define the upper safety limit. Moreover, it allows visualizing the distance between an actual configuration case and evaluated critical experiments. This tool depends on a validated experiments database, on sets of physical parameters, and on various statistical tools allowing interpolation of the calculation bias over the database or display of the projections of experiments onto a reduced base of parameters. The development of this tool is still in progress. (author)

  26. GARDEC, Estimation of dose-rates reduction by garden decontamination

    International Nuclear Information System (INIS)

    Togawa, Orihiko

    2006-01-01

    1 - Description of program or function: GARDEC estimates the reduction of dose rates by garden decontamination. It provides the effect of different decontamination methods, the depth of soil to be considered, the dose rate before and after decontamination, and the reduction factor. 2 - Methods: The code takes into account three methods of decontamination: (i) digging the garden in a special way, (ii) removal of the upper layer of soil, and (iii) covering with a shielding layer of soil. The dose-rate conversion factor is defined as the external dose rate in the air at a given height above the ground from a unit concentration of a specific radionuclide in each soil layer.

  27. Multidimensional Rank Reduction Estimator for Parametric MIMO Channel Models

    Directory of Open Access Journals (Sweden)

    Marius Pesavento

    2004-08-01

    A novel algebraic method for the simultaneous estimation of MIMO channel parameters from channel sounder measurements is developed. We consider a parametric multipath propagation model with P discrete paths where each path is characterized by its complex path gain, its directions of arrival and departure, time delay, and Doppler shift. This problem is treated as a special case of the multidimensional harmonic retrieval problem. While the well-known ESPRIT-type algorithms exploit shift-invariance between specific partitions of the signal matrix, the rank reduction estimator (RARE) algorithm exploits their internal Vandermonde structure. A multidimensional extension of the RARE algorithm is developed, analyzed, and applied to measurement data recorded with the RUSK vector channel sounder in the 2 GHz band.

  28. Estimated emission reductions from California's enhanced Smog Check program.

    Science.gov (United States)

    Singer, Brett C; Wenzel, Thomas P

    2003-06-01

    The U.S. Environmental Protection Agency requires that states evaluate the effectiveness of their vehicle emissions inspection and maintenance (I/M) programs. This study demonstrates an evaluation approach that estimates mass emission reductions over time and includes the effect of I/M on vehicle deterioration. It includes a quantitative assessment of benefits from pre-inspection maintenance and repairs and accounts for the selection bias effect that occurs when intermittent high emitters are tested. We report estimates of one-cycle emission benefits of California's Enhanced Smog Check program, ca. 1999. Program benefits equivalent to metric tons per day of prevented emissions were calculated with a "bottom-up" approach that combined average per-vehicle reductions in mass emission rates (g/gal) with average per-vehicle activity, resolved by model year. Accelerated simulation mode test data from the statewide vehicle information database (VID) and from roadside Smog Check testing were used to determine 2-yr emission profiles of vehicles passing through Smog Check and to infer emission profiles that would occur without Smog Check. The number of vehicles participating in Smog Check was also determined from the VID. We estimate that in 1999 Smog Check reduced tailpipe emissions of HC, CO, and NOx by 97, 1690, and 81 t/d, respectively. These correspond to 26, 34, and 14% of the HC, CO, and NOx that would have been emitted by vehicles in the absence of Smog Check. These estimates are highly sensitive to assumptions about vehicle deterioration in the absence of Smog Check. Considering the estimated uncertainty in these assumptions yields a range for calculated benefits: 46-128 t/d of HC, 860-2200 t/d of CO, and 60-91 t/d of NOx. Repair of vehicles that failed an initial, official Smog Check appears to be the most important mechanism of emission reductions, but pre-inspection maintenance and repair also contributed substantially. Benefits from removal of nonpassing
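
    The "bottom-up" aggregation reduces to a simple sum over model years. A sketch with invented fleet numbers, not the study's data:

      # Per-vehicle g/gal reductions x per-vehicle activity, summed to t/d.
      FLEET = {
          # model year: (vehicles, gal/day per vehicle, HC reduction in g/gal)
          1985: (400_000, 1.6, 9.0),
          1990: (900_000, 1.5, 5.0),
          1995: (1_200_000, 1.4, 2.0),
      }

      hc_tons_per_day = sum(
          n * gal_day * g_gal for n, gal_day, g_gal in FLEET.values()
      ) / 1e6  # grams -> metric tons

      print(f"HC benefit: {hc_tons_per_day:.1f} t/d")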

  29. Risk Reduction with a Fuzzy Expert Exploration Tool

    Energy Technology Data Exchange (ETDEWEB)

    William W. Weiss

    2000-06-30

    Incomplete or sparse information on geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. Expert systems have been developed and used in several disciplines and industries, including medical diagnostics, with favorable results. A state-of-the-art exploration "expert" tool, relying on a computerized database and computer maps generated by neural networks, is proposed through the use of "fuzzy" logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. This project will develop an Artificial Intelligence system that will draw upon a wide variety of information to provide realistic estimates of risk. "Fuzzy logic," a system of integrating large amounts of inexact, incomplete information with modern computational methods to derive usable conclusions, has been demonstrated as a cost-effective computational technology in many industrial applications. During project year 1, 90% of the geologic, geophysical, production and price data were assimilated for installation into the database. Logs provided geologic data consisting of formation tops of the Brushy Canyon, Lower Brushy Canyon, and Bone Springs zones of 700 wells used to construct regional cross sections. Regional structure and isopach maps were constructed using kriging to interpolate between the measured points. One of the structure derivative maps (azimuth of curvature) visually correlates with Brushy Canyon fields on the maximum-change contours. Derivatives of the regional geophysical data also visually correlate with the locations of the fields. The azimuth of maximum dip approximately locates fields on the maximum-change contours. In a similar manner, the second derivative in the x-direction of the gravity map visually correlates with the alignment of the known fields. The visual correlations strongly suggest that neural network architectures will be

  30. Reduction of radiation exposure and image quality using dose reduction tool on computed tomography fluoroscopy

    International Nuclear Information System (INIS)

    Sakabe, Daisuke; Tochihara, Syuichi; Ono, Michiaki; Tokuda, Masaki; Kai, Noriyuki; Nakato, Kengo; Hashida, Masahiro; Funama, Yoshinori; Murazaki, Hiroo

    2012-01-01

    The purpose of our study was to measure the reduction rate of radiation dose and the variability of image noise using angular beam modulation (ABM) on computed tomography (CT) fluoroscopy. The Alderson-Rando phantom and a homemade phantom were used in our study. These phantoms were scanned at the on-center position and at an off-center position of -12 cm along the y-axis, with and without the ABM technique. With ABM, the x-ray tube is turned off over a 100-degree sector centered at the 12, 10, or 2 o'clock position during CT fluoroscopy. CT fluoroscopic images were obtained with tube voltage, 120 kV; tube current-time product per reconstructed image, 30 mAs; rotation time, 0.5 s/rot; slice thickness, 4.8 mm; and reconstruction kernel B30s in each scan. After CT scanning, radiation exposure and image noise were measured, and image artifacts were evaluated with and without the technique. Compared with scanning without ABM, the technique reduced radiation exposure by 75-80% at the on-center position, regardless of angle position, and by 50% at the off-center position of -12 cm. In contrast, image noise remained constant with and without the technique. Visual scores for image artifacts were almost the same with and without the technique, with no statistically significant difference (p>0.05). ABM is an appropriate tool for reducing radiation exposure while maintaining image noise and artifacts during CT fluoroscopy. (author)

  31. Development of pollution reduction strategies for Mexico City: Estimating cost and ozone reduction effectiveness

    International Nuclear Information System (INIS)

    Thayer, G.R.; Hardie, R.W.; Barrera-Roldan, A.

    1993-01-01

    This paper reports on the collection and preparation of data (costs and air quality improvement) for the strategic evaluation portion of the Mexico City Air Quality Research Initiative (MARI). Reports written for the Mexico City government by various international organizations were used to identify proposed options along with estimates of cost and emission reductions. Information on appropriate options identified by the SCAQMD for Southern California was also used in the analysis. A linear optimization method was used to select a group of options, or strategy, to be evaluated by decision analysis. However, the reduction of ozone levels is not a linear function of the reduction of hydrocarbon and NOx emissions, so a more detailed analysis was required for ozone. An equation for a plane on an isopleth calculated with a trajectory model was obtained using two endpoints that bracket the expected total ozone precursor reductions, plus the starting concentrations of hydrocarbons and NOx. The relationship between ozone levels and the hydrocarbon and NOx concentrations was assumed to lie on this plane, and this relationship was used in the linear optimization program to select the options comprising a strategy.
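
    In sketch form, the assumed plane looks like the following (symbols illustrative; the actual coefficients come from the trajectory-model isopleth endpoints):

      \[
        [\mathrm{O_3}] \approx [\mathrm{O_3}]_0
          + a\,\bigl([\mathrm{HC}] - [\mathrm{HC}]_0\bigr)
          + b\,\bigl([\mathrm{NO}_x] - [\mathrm{NO}_x]_0\bigr)
      \]

    with a and b fixed by the two bracketing endpoints, so the ozone response stays linear within the optimization.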

  32. Estimating Tool-Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool.

    Science.gov (United States)

    Zhao, Baoliang; Nelson, Carl A

    2016-10-01

    Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch of the surgical site, surgeons may be prone to exert larger forces and cause tissue damage. To quantify tool-tissue interaction forces, researchers have tried to attach different kinds of sensors on the surgical tools. This sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder the normal function of the tools; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method by estimating tool-tissue interaction forces using driving motors' current, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool-tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may allow a haptic surgical robot which is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool-tissue interaction forces in real time, thereby increasing surgical efficiency and safety.
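
    A minimal sketch of the sensorless idea for a current-controlled DC drive: joint torque is approximately the torque constant times current, corrected for the transmission and friction, then divided by the jaw moment arm. All constants are placeholders, not the paper's identified values:

      def grasp_force(current_a, k_t=0.025, gear=20.0, tau_friction=0.004, arm_m=0.009):
          """Estimate tool-tissue force (N) from motor current (A)."""
          tau_motor = k_t * current_a                  # N*m at the motor shaft
          tau_joint = gear * tau_motor - tau_friction  # after gearbox and friction
          return max(tau_joint, 0.0) / arm_m           # force at the grasper jaw

      print(f"Estimated tip force: {grasp_force(0.12):.2f} N")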

  33. Revised estimates for ozone reduction by shuttle operation

    Science.gov (United States)

    Potter, A. E.

    1978-01-01

    Previous calculations by five different modeling groups of the effect of space shuttle operations on the ozone layer yielded an estimate of 0.2 percent ozone reduction for the Northern Hemisphere at 60 launches per year. Since these calculations were made, the accepted rate constant for the reaction between hydroperoxyl and nitric oxide to yield hydroxyl and nitrogen dioxide, HO2 + NO → OH + NO2, was revised upward by more than an order of magnitude, with a resultant increase in the predicted ozone reduction for chlorofluoromethanes by a factor of approximately 2. New calculations of the shuttle effect were made with use of the new rate constant data, again by five different modeling groups. The new value of the shuttle effect on the ozone layer was found to be 0.25 percent. The increase resulting from the revised rate constant is considerably less for space shuttle operations than for chlorofluoromethane production, because the new rate constant also increases the calculated rate of downward transport of shuttle exhaust products out of the stratosphere.

  34. On Commitments and Other Uncertainty Reduction Tools in Joint Action

    Directory of Open Access Journals (Sweden)

    Michael John

    2015-01-01

    In this paper, we evaluate the proposal that a central function of commitments within joint action is to reduce various kinds of uncertainty, and that this accounts for the prevalence of commitments in joint action. While this idea is prima facie attractive, we argue that it faces two serious problems. First, commitments can only reduce uncertainty if they are credible, and accounting for the credibility of commitments proves not to be straightforward. Second, there are many other ways in which uncertainty is commonly reduced within joint actions, which raises the possibility that commitments may be superfluous. Nevertheless, we argue that the existence of these alternative uncertainty reduction processes does not make commitments superfluous after all but, rather, helps to explain how commitments may contribute in various ways to uncertainty reduction.

  35. Risk Reduction with a Fuzzy Expert Exploration Tool

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, William W.; Broadhead, Ron; Sung, Andrew

    2000-10-24

    This project developed an Artificial Intelligence system that drew upon a wide variety of information to provide realistic estimates of risk. "Fuzzy logic," a system of integrating large amounts of inexact, incomplete information with modern computational methods to derive usable conclusions, was demonstrated as a cost-effective computational technology in many industrial applications.

  36. Emerging Tools to Estimate and to Predict Exposures to ...

    Science.gov (United States)

    The timely assessment of the human and ecological risk posed by thousands of existing and emerging commercial chemicals is a critical challenge facing EPA in its mission to protect public health and the environment. The U.S. EPA has been conducting research to enhance methods used to estimate and forecast exposures for tens of thousands of chemicals. This research is aimed at both assessing risks and supporting life cycle analysis, by developing new models and tools for high-throughput exposure screening and prioritization, as well as databases that support these and other tools, especially regarding consumer products. The models and data address usage, and take advantage of quantitative structure-activity relationships (QSARs) for both inherent chemical properties and function (why the chemical is a product ingredient). To make them more useful and widely available, the new tools, data and models are designed to be: flexible; interoperable; modular (useful to more than one stand-alone application); and open (publicly available software). Presented at the Society for Risk Analysis Forum: Risk Governance for Key Enabling Technologies, Venice, Italy, March 1-3, 2017.

  37. Selection of portable tools for use in a size reduction facility

    International Nuclear Information System (INIS)

    Hawley, L.N.

    1986-07-01

    A range of portable tools is identified for development and eventual use within a remote operations facility for the size reduction of plutonium-contaminated materials. The process of selection defines the work to be performed within the facility and matches this to the general categories of suitable tools. Specific commercial tools are then selected or, where none exists, proposals are made for the development of special tools. (author)

  38. Energy Saving Melting and Revert Reduction Technology (E-SMARRT): Design Support for Tooling Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Dongtao

    2011-09-23

    High pressure die casting is an intrinsically efficient net shape process, and improvements in energy efficiency are strongly dependent on design and process improvements that reduce scrap rates so that more of the total consumed energy goes into acceptable, usable castings. Computer simulation has become widely used within the industry, but use is not universal. Further, many key design decisions must be made before the simulation can be run, and expense in terms of money and time often limits the number of decision iterations that can be explored. This project continues several years of work creating simple, very fast design tools that can assist with early-stage design decisions so that the benefits of simulation can be maximized and, more importantly, so that the chances of first-shot success are maximized. First-shot success and better-running processes contribute to less scrap and significantly better energy utilization by the process. This new technology was predicted to result in an average energy savings of 1.83 trillion BTUs/year over a 10-year period. The current (2011) annual energy saving estimate over a ten-year period, based on commercial introduction in 2012 and a market penetration of 30% by 2015, is 1.89 trillion BTUs/year by 2022. Along with these energy savings, reduction of scrap and improvement in yield will reduce the environmental emissions associated with the melting and pouring of the metal saved as a result of this technology. The average annual estimate of CO2 reduction per year through 2022 is 0.037 million metric tons of carbon equivalent (MMTCE).

  39. Duplicate laboratory test reduction using a clinical decision support tool.

    Science.gov (United States)

    Procop, Gary W; Yerian, Lisa M; Wyllie, Robert; Harrison, A Marc; Kottke-Marchant, Kandice

    2014-05-01

    Unwarranted duplicate laboratory tests increase unnecessary phlebotomy, which contributes to iatrogenic anemia, decreased patient satisfaction, and increased health care costs. We employed a clinical decision support tool (CDST) to block unnecessary duplicate test orders during the computerized physician order entry (CPOE) process. We assessed laboratory cost savings after 2 years and searched for untoward patient events associated with this intervention. The CDST blocked 11,790 unnecessary duplicate test orders in these 2 years, resulting in a cost savings of $183,586. There were no untoward effects reported in association with this intervention. The movement to CPOE affords real-time interaction between the laboratory and the physician through CDSTs that signal duplicate orders. These interactions save health care dollars and should also increase patient satisfaction and well-being.

  40. An Accurate FFPA-PSR Estimator Algorithm and Tool for Software Effort Estimation

    Directory of Open Access Journals (Sweden)

    Senthil Kumar Murugesan

    2015-01-01

    Software companies are now keen to provide secure software with respect to the accuracy and reliability of their products, especially in relation to software effort estimation. Therefore, there is a need to develop a hybrid tool which provides all the necessary features. This paper proposes a hybrid estimator algorithm and model which incorporates quality metrics, a reliability factor, and a security factor with fuzzy-based function point analysis. Initially, the method uses a fuzzy-based estimate to control the uncertainty in the software size with the help of a triangular fuzzy set at the early development stage. Secondly, the function point analysis is extended by the security and reliability factors in the calculation. Finally, performance metrics are added to the effort estimation for accuracy. Experiments were run with different project data sets on the hybrid tool, and the results were compared with existing models. They show that the proposed method not only improves the accuracy but also increases the reliability, as well as the security, of the product.
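
    A sketch of the fuzzy size estimate: software size as a triangular fuzzy number defuzzified by its centroid, then scaled by reliability and security adjustments. The factor values and the productivity constant are assumptions, not the paper's calibrated numbers:

      def centroid(low, likely, high):
          """Centroid defuzzification of a triangular fuzzy number."""
          return (low + likely + high) / 3.0

      size_fp = centroid(low=220, likely=300, high=420)  # function points
      reliability_factor = 1.10   # assumed adjustment
      security_factor    = 1.15   # assumed adjustment
      hours_per_fp       = 6.0    # assumed productivity

      effort_hours = size_fp * reliability_factor * security_factor * hours_per_fp
      print(f"Size: {size_fp:.0f} FP; effort: {effort_hours:.0f} person-hours")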

  41. The Lives Saved Tool (LiST) as a model for diarrhea mortality reduction

    Science.gov (United States)

    2014-01-01

    Background Diarrhea is a leading cause of morbidity and mortality among children under five years of age. The Lives Saved Tool (LiST) is a model used to calculate deaths averted or lives saved by past interventions and for the purposes of program planning when costly and time consuming impact studies are not possible. Discussion LiST models the relationship between coverage of interventions and outputs, such as stunting, diarrhea incidence and diarrhea mortality. Each intervention directly prevents a proportion of diarrhea deaths such that the effect size of the intervention is multiplied by coverage to calculate lives saved. That is, the maximum effect size could be achieved at 100% coverage, but at 50% coverage only 50% of possible deaths are prevented. Diarrhea mortality is one of the most complex causes of death to be modeled. The complexity is driven by the combination of direct prevention and treatment interventions as well as interventions that operate indirectly via the reduction in risk factors, such as stunting and wasting. Published evidence is used to quantify the effect sizes for each direct and indirect relationship. Several studies have compared measured changes in mortality to LiST estimates of mortality change looking at different sets of interventions in different countries. While comparison work has generally found good agreement between the LiST estimates and measured mortality reduction, where data availability is weak, the model is less likely to produce accurate results. LiST can be used as a component of program evaluation, but should be coupled with more complete information on inputs, processes and outputs, not just outcomes and impact. Summary LiST is an effective tool for modeling diarrhea mortality and can be a useful alternative to large and expensive mortality impact studies. Predicting the impact of interventions or comparing the impact of more than one intervention without having to wait for the results of large and expensive
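
    The direct-effect arithmetic described above fits in a few lines; the numbers below are illustrative, not LiST inputs:

      def deaths_averted(cause_deaths, affected_fraction, effectiveness,
                         coverage_new, coverage_old):
          """Deaths averted by scaling one intervention's coverage."""
          return (cause_deaths * affected_fraction * effectiveness
                  * (coverage_new - coverage_old))

      averted = deaths_averted(
          cause_deaths=100_000,    # annual diarrhea deaths in the projection
          affected_fraction=0.9,   # fraction of deaths the intervention can reach
          effectiveness=0.85,      # assumed effect size
          coverage_new=0.75,
          coverage_old=0.50,
      )
      print(f"Deaths averted by the scale-up: {averted:,.0f}")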

  42. Competitive kinetics as a tool to determine rate constants for reduction of ferrylmyoglobin by food components

    DEFF Research Database (Denmark)

    Jongberg, Sisse; Lund, Marianne Nissen; Pattison, David I.

    2016-01-01

    Competitive kinetics were applied as a tool to determine apparent rate constants for the reduction of hypervalent haem pigment ferrylmyoglobin (MbFe(IV)=O) by proteins and phenols in aqueous solution of pH 7.4 and I = 1.0 at 25 °C. Reduction of MbFe(IV)=O by a myofibrillar protein isolate (MPI) f...

  43. Use of GIS in the estimation and development of risk reduction technology

    International Nuclear Information System (INIS)

    Ha, Jae Joo

    1998-03-01

    The probability of a severe accident occurring in a nuclear power plant is very small because the safety of the plant and the public is considered in design and operation. However, if a severe accident does occur, establishing a strategy for reducing the resulting damage is essential because the effects on humans and the environment are very large. The key criterion that determines the severity of an accident is risk, defined as the product of the accident's frequency and its consequence. Establishing countermeasures to estimate and reduce risks quantitatively can be a very powerful means of minimizing the effect of an accident on humans and the environment. Research on establishing a framework which integrates a geographic information system (GIS), a database management system (DBMS), and a decision making support system (DMSS) is being pursued very actively. Based on these systems, we can accomplish the estimation and display of risks and the development of reduction methodologies, which are essential parts of accident management for a nuclear power plant. The GIS helps users systematize and comprehend the spatial relationships of the information necessary for decision making. Through the DBMS, we can establish and manage spatial and attribute data and use them in query and selection. The DMSS is a computer-based information system which supports making the necessary decisions easily. In this study, we reviewed the fundamental concepts of a GIS and examined the methodology for using it in the estimation and display of risks. We also established the fundamental GIS platform of the Yonggwang site and the database systems necessary for the estimation of risks. (author)

  44. Visual tool for estimating the fractal dimension of images

    Science.gov (United States)

    Grossu, I. V.; Besliu, C.; Rusu, M. V.; Jipa, Al.; Bordeianu, C. C.; Felea, D.

    2009-10-01

    This work presents a new Visual Basic 6.0 application for estimating the fractal dimension of images, based on an optimized version of the box-counting algorithm. Following the attempt to separate the real information from "noise", we also considered the family of all band-pass filters with the same band-width (specified as a parameter). The fractal dimension can thus be represented as a function of the pixel color code. The program was used for the study of cracks in paintings, as an additional tool to help the critic decide whether an artistic work is original or not. Program summary: Program title: Fractal Analysis v01. Catalogue identifier: AEEG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 29,690. No. of bytes in distributed program, including test data, etc.: 4,967,319. Distribution format: tar.gz. Programming language: MS Visual Basic 6.0. Computer: PC. Operating system: MS Windows 98 or later. RAM: 30M. Classification: 14. Nature of problem: estimating the fractal dimension of images. Solution method: optimized implementation of the box-counting algorithm; use of a band-pass filter for separating the real information from "noise"; user-friendly graphical interface. Restrictions: although various file types can be used, the application was mainly conceived for the 8-bit grayscale Windows bitmap file format. Running time: in a first approximation, the algorithm is linear.
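
    A minimal box-counting sketch of the kind the program optimizes (NumPy; the band-pass filtering step is omitted): count occupied boxes at dyadic scales on a binarized image and fit the slope of log(count) against log(1/size):

      import numpy as np

      def box_count(img, size):
          """Number of size x size boxes containing at least one foreground pixel."""
          h, w = img.shape
          hb, wb = h // size, w // size
          boxes = img[:hb * size, :wb * size].reshape(hb, size, wb, size)
          return int(boxes.any(axis=(1, 3)).sum())

      def fractal_dimension(img):
          sizes = [2**k for k in range(1, int(np.log2(min(img.shape))))]
          counts = [box_count(img, s) for s in sizes]
          slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
          return slope

      rng = np.random.default_rng(0)
      img = rng.random((256, 256)) > 0.995  # sparse random test points
      print(f"Estimated fractal dimension: {fractal_dimension(img):.2f}")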

  45. Measurement reduction for mutual coupling calibration in DOA estimation

    Science.gov (United States)

    Aksoy, Taylan; Tuncer, T. Engin

    2012-01-01

    Mutual coupling is an important source of error in antenna arrays that should be compensated for in super-resolution direction-of-arrival (DOA) algorithms, such as the Multiple Signal Classification (MUSIC) algorithm. A crucial step in array calibration is the determination of the mutual coupling coefficients for the antenna array. In this paper, a system-theoretic approach is presented for the mutual coupling characterization of antenna arrays. The comprehension and implementation of this approach are simple, leading to further advantages in calibration measurement reduction. In this context, a measurement reduction method for antenna arrays with omni-directional and identical elements is proposed which is based on the symmetry planes in the array geometry. The proposed method significantly decreases the number of measurements during the calibration process. This method is evaluated using different array types whose responses and mutual coupling characteristics are obtained through numerical electromagnetic simulations. It is shown that a single calibration measurement is sufficient for uniform circular arrays. Certain important and interesting characteristics observed during the experiments are outlined.

  46. Estimation and reduction of harmonic currents from power converters

    DEFF Research Database (Denmark)

    Asiminoaei, Lucian

    ... -based method depends very much on the amount and accuracy of collected data in the development stage. The outcome of this investigation is a Harmonic Calculation Software compiled into a Graphical User Interface PC-software application, which can be applied for fast estimations of the harmonic currents ... control of the proposed topologies are given together with laboratory tests. One harmonic current mitigation solution found is to connect (two) smaller-power APFs in parallel, sharing the same ac- and dc-bus. It is proven that parallel APFs may have smaller passive components, although other issues arise, like circulating currents, which are removed here by common-mode coils. Another harmonic solution is to use cascade connection of (two) independent APFs that cooperatively share the task of harmonic mitigation. Two cooperative control methods are proposed, called load-sharing and harmonic-sharing

  47. SBAT: A Tool for Estimating Metal Bioaccessibility in Soils

    Energy Technology Data Exchange (ETDEWEB)

    Heuscher, S.A.

    2004-04-21

    Heavy metals such as chromium and arsenic are widespread in the environment due to their usage in many industrial processes. These metals may pose significant health risks to humans, especially children, due to their mutagenic and carcinogenic properties. Typically, the health risks associated with the ingestion of soil-bound metals are estimated by assuming that the metals are completely absorbed through the human intestinal tract (100% bioavailable). This assumption potentially overestimates the risk since soils are known to strongly sequester metals thereby potentially lowering their bioavailability. Beginning in 2000, researchers at Oak Ridge National Laboratory, with funding from the Strategic Environmental Research and Development Program (SERDP), studied the effect of soil properties on the bioaccessibility of soil-bound arsenic and chromium. Representative A and upper-B horizons from seven major U.S. soil orders were obtained from the U.S. Department of Agriculture's National Resources Conservation Service and the U.S. Department of Energy's Oak Ridge Reservation. The soils were spiked with known concentrations of arsenic (As(III) and As(V)) and chromium (Cr(III) and Cr(VI)), and the bioaccessibility was measured using a physiologically based extraction test that mimics the gastric activity of children. Linear regression models were then developed to relate the bioaccessibility measurements to the soil properties (Yang et al. 2002; Stewart et al. 2003a). Important results from these publications and other studies include: (1) Cr(VI) and As(III) are more toxic and bioavailable than Cr(III) and As(V) respectively. (2) Several favorable processes can occur in soils that promote the oxidation of As(III) to As(V) and the reduction of Cr(VI) to Cr(III), thereby lowering bioaccessibility. Iron and manganese oxides are capable of oxidizing As(III) to As(V), whereas organic matter and Fe(II)-bearing minerals are capable of reducing Cr(VI) to Cr(III). (3

  9. The Influence of Tool Texture on Friction and Lubrication in Strip Reduction Testing

    DEFF Research Database (Denmark)

    Sulaiman, Mohd Hafis Bin; Christiansen, Peter; Bay, Niels Oluf

    2017-01-01

    While texturing of workpiece surfaces to promote lubrication in metal forming has been applied for several decades, tool surface texturing is rather new. In the present paper, tool texturing is studied as a method to prevent galling. A strip reduction test was conducted with tools provided...... with shallow, longitudinal pockets oriented perpendicular to the sliding direction. The pockets had small angles to the workpiece surface and the distance between them was varied. The experiments reveal that the distance between pockets should be larger than the pocket width, thereby creating a topography...... similar to flat table mountains to avoid mechanical interlocking in the valleys; otherwise, an increase in drawing load and pick-up on the tools is observed. The textured tool surface lowers friction and improves lubrication performance, provided that the distance between pockets is 2–4 times larger than...

  10. MURMoT. Design and Application of Microbial Uranium Reduction Monitoring Tools

    Energy Technology Data Exchange (ETDEWEB)

    Loeffler, Frank E. [Univ. of Tennessee, Knoxville, TN (United States)]

    2014-12-31

    Uranium (U) contamination in the subsurface is a major remediation challenge at many DOE sites. Traditional site remedies present enormous costs to DOE; hence, enhanced bioremediation technologies (i.e., biostimulation and bioaugmentation) combined with monitoring efforts are being considered as cost-effective corrective actions to address subsurface contamination. This research effort improved understanding of the microbial U reduction process and developed new tools for monitoring microbial activities. Application of these tools will promote science-based site management decisions that achieve contaminant detoxification, plume control, and long-term stewardship in the most efficient manner. The overarching hypothesis was that the design, validation and application of a suite of new molecular and biogeochemical tools advance process understanding, and improve environmental monitoring regimes to assess and predict in situ U immobilization. Accomplishments: This project (i) advanced nucleic acid-based approaches to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-detoxifying bacteria; (ii) developed proteomics workflows for detection of metal reduction biomarker proteins in laboratory cultures and contaminated site groundwater; (iii) developed and demonstrated the utility of U isotopic fractionation using high precision mass spectrometry to quantify U(VI) reduction for a range of reduction mechanisms and environmental conditions; and (iv) validated the new tools using field samples from U-contaminated IFRC sites, and demonstrated their prognostic and diagnostic capabilities in guiding decision making for environmental remediation and long-term site stewardship.

  11. Consequent use of IT tools as a driver for cost reduction and quality improvements

    Science.gov (United States)

    Hein, Stefan; Rapp, Roberto; Feustel, Andreas

    2013-10-01

    The semiconductor industry invests considerable effort in cost reduction and quality improvement. The consequent use of IT tools is one way to support these goals. With the extension of its 150 mm fab to 200 mm, Robert Bosch increased the systematic use of data analysis and Advanced Process Control (APC).

  12. Aircraft parameter estimation – A tool for development of ...

    Indian Academy of Sciences (India)

    In addition, actuator performance and controller gains may be flight condition dependent. Moreover, this approach may result in open-loop parameter estimates with low accuracy. ... Aerodynamic databases for high-fidelity flight simulators: estimation of a comprehensive aerodynamic model suitable for a flight simulator is an...

  13. High Accuracy Nonlinear Control and Estimation for Machine Tool Systems

    DEFF Research Database (Denmark)

    Papageorgiou, Dimitrios

    Component mass production has been the backbone of industry since the second industrial revolution, and machine tools produce parts of widely varying size and design complexity. The ever-increasing level of automation in modern manufacturing processes necessitates the use of more...... sophisticated machine tool systems that are adaptable to different workspace conditions, while at the same time being able to maintain very narrow workpiece tolerances. The main topic of this thesis is to suggest control methods that can maintain required manufacturing tolerances, despite moderate wear and tear....... The purpose is to ensure that full accuracy is maintained between service intervals and to advise when overhaul is needed. The thesis argues that the quality of manufactured components is directly related to the positioning accuracy of the machine tool axes, and it shows which low-level control architectures

  14. Estimating mortality risk reduction and economic benefits from controlling ozone air pollution

    National Research Council Canada - National Science Library

    Committee on Estimating Mortality Risk Reduction Benefits from Decreasing Tropospheric Ozone Exposure

    2008-01-01

    ... in life expectancy, and to assess methods for estimating the monetary value of the reduced risk of premature death and increased life expectancy in the context of health-benefits analysis. Estimating Mortality Risk Reduction and Economic Benefits from Controlling Ozone Air Pollution details the committee's findings and posits several recommendations to address these issues.

  15. Cost Estimating Cases: Educational Tools for Cost Analysts

    Science.gov (United States)

    1993-09-01

    only appropriate documentation should be provided. In other words, students should not submit all of the documentation possible using ACEIT, only that... case was their lack of understanding of the ACEIT software used to conduct the estimate. Specifically, many students misinterpreted the cost... estimating relationships (CERs) embedded in the software. Additionally, few of the students were able to properly organize the ACEIT documentation output

  16. Artificial Neural Network-Based Clutter Reduction Systems for Ship Size Estimation in Maritime Radars

    Directory of Open Access Journals (Sweden)

    M. P. Jarabo-Amores

    2010-01-01

    Full Text Available The existence of clutter in maritime radars deteriorates the estimation of some physical parameters of the objects detected over the sea surface. For that reason, maritime radars should incorporate efficient clutter reduction techniques. Due to the intrinsically nonlinear dynamics of sea clutter, nonlinear signal processing is needed, which can be achieved by artificial neural networks (ANNs). In this paper, an estimation of the ship size using an ANN-based clutter reduction system followed by a fixed threshold is proposed. High clutter reduction rates are achieved using 1-dimensional (horizontal or vertical) integration modes, although inaccurate ship width estimations are obtained. These estimations are improved using a 2-dimensional (rhombus) integration mode. The proposed system is compared with a CA-CFAR system, showing a great performance improvement and strong robustness against changes in sea clutter conditions and ship parameters, independently of the direction of movement of the ocean waves and ships.
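
    As a rough illustration of the idea (an ANN that suppresses clutter, followed by a fixed threshold to recover target extent), the sketch below trains a small multilayer perceptron on synthetic range profiles. The data model, network size, and threshold are assumptions, not the paper's radar setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy ANN clutter-reduction sketch: learn to recover a rectangular "ship"
# return from clutter-corrupted range profiles, then threshold to estimate
# ship extent. All data generation choices are illustrative assumptions.
rng = np.random.default_rng(1)
L = 64

def sample():
    clean = np.zeros(L)
    start, width = rng.integers(5, 40), rng.integers(5, 15)
    clean[start:start + width] = 1.0                # ship extent
    noisy = clean + 0.5 * rng.standard_normal(L)    # additive "clutter"
    return noisy, clean

X, Y = map(np.array, zip(*(sample() for _ in range(2000))))
net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X, Y)

noisy, clean = sample()
est = net.predict(noisy[None])[0] > 0.5             # fixed threshold
print("true width:", int(clean.sum()), "estimated width:", int(est.sum()))
```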

  17. A generic tool for cost estimating in aircraft design

    NARCIS (Netherlands)

    Castagne, S.; Curran, R.; Rothwell, A.; Price, M.; Benard, E.; Raghunathan, S.

    2008-01-01

    A methodology to estimate the cost implications of design decisions by integrating cost as a design parameter at an early design stage is presented. The model is developed on a hierarchical basis, the manufacturing cost of aircraft fuselage panels being analysed in this paper. The manufacturing cost

  18. Development of a customised design flood estimation tool to ...

    African Journals Online (AJOL)

    The estimation of design flood events, i.e., floods characterised by a specific magnitude-frequency relationship, at a particular site in a specific region is necessary for the planning, design and operation of hydraulic structures. Both the occurrence and frequency of flood events, along with the uncertainty involved in the ...

  19. Brain Volume Estimation Enhancement by Morphological Image Processing Tools

    Directory of Open Access Journals (Sweden)

    Zeinali R.

    2017-12-01

    Full Text Available Background: Volume estimation of the brain is important for many neurological applications. It is necessary for measuring brain growth and changes in the brain of normal and abnormal patients. Thus, accurate brain volume measurement is very important. Magnetic resonance imaging (MRI) is the method of choice for volume quantification due to excellent levels of image resolution and between-tissue contrast. The stereology method is a good method for estimating volume, but it requires segmenting enough MRI slices at a good resolution. In this study, we aim to enhance the stereology method for volume estimation of the brain using fewer MRI slices with lower resolution. Methods: In this study, a program for calculating volume using the stereology method has been introduced. After the morphological operation of dilation was applied, the stereology method was enhanced. For the evaluation of this method, we used T1-weighted MR images from a digital phantom in BrainWeb, which had ground truth. Results: The volumes of 20 normal brains extracted from BrainWeb were calculated. The volumes of white matter, gray matter and cerebrospinal fluid with given dimensions were estimated correctly. Volume calculations with the stereology method were made for different cases. In three cases, the Root Mean Square Error (RMSE) was measured: Case I with T=5, d=5; Case II with T=10, d=10; and Case III with T=20, d=20 (T = slice thickness, d = resolution) as stereology parameters. By comparing the results of the two methods, it is obvious that the RMSE values for the proposed method are smaller than for the stereology method. Conclusion: Using the morphological dilation operation allows the enhancement of the stereology volume estimation method. In cases with fewer MRI slices and fewer test points, the proposed method works much better than the standard stereology method.
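
    The baseline estimator being enhanced is the Cavalieri/stereology estimate, V ≈ T · d² · P, where P counts test points hitting the object across slices of thickness T and in-plane grid spacing d. A minimal sketch on a spherical phantom (standing in for segmented MRI slices) is given below; all dimensions are illustrative.

```python
import numpy as np

# Cavalieri-style volume estimate on a sphere of radius R (mm):
# count grid points inside the object on each slice, then scale.
T, d, R = 5.0, 5.0, 40.0
gx, gy = np.meshgrid(np.arange(-R, R, d), np.arange(-R, R, d))
hits = 0
for zi in np.arange(-R, R, T):
    if abs(zi) < R:
        hits += np.count_nonzero(gx**2 + gy**2 <= R**2 - zi**2)

v_est = T * d * d * hits
v_true = 4 / 3 * np.pi * R**3
print(f"estimated {v_est:.0f} mm^3 vs true {v_true:.0f} mm^3")
```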

  20. Estimating CO2 Emission Reduction of Non-capture CO2 Utilization (NCCU) Technology

    International Nuclear Information System (INIS)

    Lee, Ji Hyun; Lee, Dong Woog; Gyu, Jang Se; Kwak, No-Sang; Lee, In Young; Jang, Kyung Ryoung; Shim, Jae-Goo; Choi, Jong Shin

    2015-01-01

    The potential CO2 emission reduction of non-capture CO2 utilization (NCCU) technology was estimated. NCCU is a sodium bicarbonate production technology based on the carbonation reaction of the CO2 contained in flue gas. To estimate the CO2 emission reduction, a process simulation using a process simulator (PRO/II), based on a chemical plant that could handle 100 tons of CO2 per day, was performed. Also, for the estimation of the indirect CO2 reduction, the Solvay process, which is the conventional technology for the production of sodium carbonate/sodium bicarbonate, was studied. The results of the analysis showed that, in the case of the Solvay process, the overall CO2 emission was estimated at 48,862 tons per year, based on the energy consumption for the production of NaHCO3 (7.4 GJ/tNaHCO3). For the NCCU technology, the direct CO2 reduction through CO2 carbonation was estimated at 36,500 tons per year and the indirect CO2 reduction through lower energy consumption at 46,885 tons per year, leading to 83,385 tons per year in total. From these results, it could be concluded that sodium bicarbonate production through the carbonation reaction of the CO2 contained in flue gas is energy efficient and could be one of the promising low-CO2-emission technologies.
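
    The headline figures combine as a simple sum; the snippet below just re-traces the arithmetic quoted in the abstract.

```python
# Figures quoted in the abstract (tons CO2 per year)
direct = 36_500            # CO2 fixed directly by carbonation
indirect = 46_885          # avoided through lower energy use than the Solvay route
solvay_emission = 48_862   # reference emission of the Solvay process
print(direct + indirect)   # 83385 tons/year total reduction for NCCU
```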

  1. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would be able to provide a needed input for estimating the success rate for any mission. NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and user provided fault tree and modes of failure. This paper will describe briefly, the capabilities of the NESTEM, QRAS and the interface. Also, in this presentation we will describe stepwise process the interface uses using an example.

  3. Mapping grey matter reductions in schizophrenia: an anatomical likelihood estimation analysis of voxel-based morphometry studies.

    Science.gov (United States)

    Fornito, A; Yücel, M; Patti, J; Wood, S J; Pantelis, C

    2009-03-01

    Voxel-based morphometry (VBM) is a popular tool for mapping neuroanatomical changes in schizophrenia patients. Several recent meta-analyses have identified the brain regions in which patients most consistently show grey matter reductions, although they have not examined whether such changes reflect differences in grey matter concentration (GMC) or grey matter volume (GMV). These measures assess different aspects of grey matter integrity, and may therefore reflect different pathological processes. In this study, we used the Anatomical Likelihood Estimation procedure to analyse significant differences reported in 37 VBM studies of schizophrenia patients, incorporating data from 1646 patients and 1690 controls, and compared the findings of studies using either GMC or GMV to index grey matter differences. Analysis of all studies combined indicated that grey matter reductions in a network of frontal, temporal, thalamic and striatal regions are among the most frequently reported in the literature. GMC reductions were generally larger and more consistent than GMV reductions, and were more frequent in the insula, medial prefrontal, medial temporal and striatal regions. GMV reductions were more frequent in the dorso-medial frontal cortex, and lateral and orbital frontal areas. These findings support the primacy of frontal, limbic, and subcortical dysfunction in the pathophysiology of schizophrenia, and suggest that the grey matter changes observed with MRI may not necessarily result from a unitary pathological process.

  4. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    Science.gov (United States)

    1989-09-01

    An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System. Thesis, Caroline L. Hanson, Major, USAF. Report no. AFIT/GCA/LSQ/89S-5, Department of Defense.

  5. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    OpenAIRE

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  6. Estimation of tool wear during CNC milling using neural network-based sensor fusion

    Science.gov (United States)

    Ghosh, N.; Ravi, Y. B.; Patra, A.; Mukhopadhyay, S.; Paul, S.; Mohanty, A. R.; Chattopadhyay, A. B.

    2007-01-01

    Cutting tool wear degrades the product quality in manufacturing processes. Monitoring tool wear value online is therefore needed to prevent degradation in machining quality. Unfortunately there is no direct way of measuring the tool wear online. Therefore one has to adopt an indirect method wherein the tool wear is estimated from several sensors measuring related process variables. In this work, a neural network-based sensor fusion model has been developed for tool condition monitoring (TCM). Features extracted from a number of machining zone signals, namely cutting forces, spindle vibration, spindle current, and sound pressure level have been fused to estimate the average flank wear of the main cutting edge. Novel strategies such as, signal level segmentation for temporal registration, feature space filtering, outlier removal, and estimation space filtering have been proposed. The proposed approach has been validated by both laboratory and industrial implementations.

  7. Transit Boardings Estimation and Simulation Tool (TBEST) calibration for guideway and BRT modes.

    Science.gov (United States)

    2013-06-01

    This research initiative was motivated by a desire of the Florida Department of Transportation and the : Transit Boardings Estimation and Simulation Tool (TBEST) project team to enhance the value of TBEST to : the planning community by improving its ...

  8. An Evaluation of Growth Models as Predictive Tools for Estimates at Completion (EAC)

    National Research Council Canada - National Science Library

    Trahan, Elizabeth N

    2009-01-01

    ...) as the Estimates at Completion (EAC). Our research evaluates the prospect of nonlinear growth modeling as an alternative to the current predictive tools used for calculating EAC, such as the Cost Performance Index (CPI...

  9. Vapor Intrusion Estimation Tool for Unsaturated Zone Contaminant Sources. User’s Guide

    Science.gov (United States)

    2016-08-30

    estimation process when applying the tool. The tool described here is focused on vapor-phase diffusion from the current vadose zone source, and is not... from the current defined vadose zone source). The estimated soil gas contaminant concentration obtained from the pre-modeled scenarios for a building... need a full site-specific numerical model to assess the impacts beyond the current vadose zone source.

  10. A "Carbon Reduction Challenge" as tool for undergraduate engagement on climate change

    Science.gov (United States)

    Cobb, K. M.; Toktay, B.

    2017-12-01

    The Carbon Reduction Challenge represents a solutions-oriented, hands-on, project-based learning tool that has achieved significant pedagogical benefits while delivering real-world carbon reductions and cost savings to community stakeholders.

  11. Real-Time Estimation for Cutting Tool Wear Based on Modal Analysis of Monitored Signals

    Directory of Open Access Journals (Sweden)

    Yongjiao Chi

    2018-05-01

    Full Text Available There is a growing body of literature that recognizes the importance of product safety and quality problems during processing. Cutting tools that break down unexpectedly can cause project delays and cost overruns, and tool wear is crucial to processing precision in mechanical manufacturing; therefore, this study contributes to this growing area of research by monitoring tool condition and estimating wear. In this research, an effective method for tool wear estimation was constructed, in which the signal features of the machining process were extracted by ensemble empirical mode decomposition (EEMD) and used to estimate the tool wear. Based on signal analysis, vibration signals that had a better linear relationship with the tool wearing process were decomposed; then the intrinsic mode functions (IMFs), the frequency spectra of the IMFs, and features relating to amplitude changes of the frequency spectra were obtained. The trend of tool wear as a function of these features was fitted with a Gaussian function to estimate the tool wear. An experimental investigation was used to verify the effectiveness of this method, and the results illustrated the correlation between tool wear and the modal features of the monitored signals.
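
    The final fitting step can be sketched with scipy's curve_fit: fit a Gaussian trend between a signal feature and measured wear, then read wear estimates off the fitted curve. The synthetic feature/wear pairs below are placeholders for the paper's EEMD-derived features.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, a, mu, sigma):
    return a * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

# Synthetic stand-ins for (feature, measured wear) pairs; all values assumed.
rng = np.random.default_rng(1)
feature = np.linspace(0.1, 1.0, 25)                  # e.g., IMF spectral amplitude
wear = gauss(feature, 0.30, 1.1, 0.45) + rng.normal(0, 0.005, 25)

popt, _ = curve_fit(gauss, feature, wear, p0=[0.3, 1.0, 0.5])
print("estimated wear at feature = 0.8:", round(gauss(0.8, *popt), 4))
```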

  12. Missing texture reconstruction method based on error reduction algorithm using Fourier transform magnitude estimation scheme.

    Science.gov (United States)

    Ogawa, Takahiro; Haseyama, Miki

    2013-03-01

    A missing texture reconstruction method based on an error reduction (ER) algorithm, including a novel estimation scheme for Fourier transform magnitudes, is presented in this brief. In our method, the Fourier transform magnitude is estimated for a target patch including missing areas, and the missing intensities are estimated by retrieving its phase based on the ER algorithm. Specifically, by monitoring the errors converged in the ER algorithm, known patches whose Fourier transform magnitudes are similar to that of the target patch are selected from the target image. Then, the Fourier transform magnitude of the target patch is estimated from those of the selected known patches and their corresponding errors. Consequently, by using the ER algorithm, we can estimate both the Fourier transform magnitudes and phases to reconstruct the missing areas.
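
    The ER core, alternately imposing a known Fourier magnitude and the known pixels, can be written in a few lines. The sketch below assumes the magnitude is already known exactly (the paper's contribution is precisely how to estimate it when it is not) and uses a toy 16×16 patch.

```python
import numpy as np

# Minimal error-reduction loop: enforce the Fourier magnitude, then the
# known intensities, and iterate. Patch size, missing region, and the
# iteration count are illustrative assumptions.
rng = np.random.default_rng(0)
patch = rng.random((16, 16))
mask = np.zeros_like(patch, bool)
mask[6:10, 6:10] = True                       # missing area
mag = np.abs(np.fft.fft2(patch))              # magnitude assumed known here

est = np.where(mask, 0.5, patch)              # initial guess for missing pixels
for _ in range(200):
    F = np.fft.fft2(est)
    F = mag * np.exp(1j * np.angle(F))        # impose Fourier magnitude
    est = np.real(np.fft.ifft2(F))
    est[~mask] = patch[~mask]                 # impose known intensities
print("reconstruction RMSE:", np.sqrt(np.mean((est[mask] - patch[mask]) ** 2)))
```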

  13. Cost estimation tools in Germany and the UK. Comparison of cost estimates and actual costs

    International Nuclear Information System (INIS)

    Pfeifer, W.; Gordelier, S.; Drake, V.

    2005-01-01

    Full text: Accurate cost estimation for future decommissioning projects is a matter of considerable importance, especially for ensuring that sufficient funds will be available at the time of project implementation. This paper looks at the experience of cost estimation and real implementation outcomes from two countries, Germany and the UK, and draws lessons for the future. In Germany, cost estimates for the decommissioning of power reactors are updated every two years. For this purpose, the STILLKO program of the NIS Company is used. So far, Forschungszentrum Karlsruhe has successfully decommissioned two prototype reactor facilities. Re-cultivation of the premises has already been completed. At the moment, the activated components of the multi-purpose research reactor (MZFR), the first pressurized water reactor in Germany that was moderated and cooled with heavy water, and of the prototype fast breeder reactor (KNK) are being dismantled remotely. Consequently, vast experience exists in particular for the updating of total costs on the basis of actually incurred expenses. The further the dismantling work proceeds, the more reliable is the total cost estimate. Here, the development of the estimated MZFR decommissioning costs shall be presented and compared with the estimates obtained for a German reference PWR-type power reactor of 1200 MW. In this way: - common features of the prototype reactor and power reactor shall be emphasized, - several parameters leading to an increase in the estimated costs shall be highlighted, - cost risks shall be outlined with the remote dismantling of the reactor pressure vessel serving as an example, - calculation parameters shall be presented, and - recommendations shall be made for a consistent estimation of costs. The United Kingdom Atomic Energy Authority (UKAEA) has a major programme for the environmental remediation of its former research and development sites at Dounreay, Windscale, Harwell and Winfrith together with the need to

  14. Parameter Estimation of the Thermal Network Model of a Machine Tool Spindle by Self-made Bluetooth Temperature Sensor Module

    Directory of Open Access Journals (Sweden)

    Yuan-Chieh Lo

    2018-02-01

    Full Text Available Thermal characteristic analysis is essential for machine tool spindles because sudden failures may occur due to unexpected thermal issues. This article presents a lumped-parameter Thermal Network Model (TNM) and its parameter estimation scheme, including hardware and software, in order to characterize both the steady-state and transient thermal behavior of machine tool spindles. For the hardware, the authors developed a Bluetooth Temperature Sensor Module (BTSM) accompanied by three types of temperature-sensing probes (magnetic, screw, and probe). Its specification, through experimental tests, achieves a precision of ±(0.1 + 0.0029|t|) °C, a resolution of 0.00489 °C, a power consumption of 7 mW, and a size of Ø40 mm × 27 mm. For the software, the heat transfer characteristics of the machine tool spindle as a function of rotating speed are derived based on heat transfer theory and empirical formulae. The predictive TNM of the spindle was developed by grey-box estimation and experimental results. Even under such complicated operating conditions as various speeds and different initial conditions, the experiments validate that the present modeling methodology provides a robust and reliable tool for temperature prediction, with 99.5% agreement in terms of normalized mean square error, and the present approach is transferable to other spindles with a similar structure. To realize edge computing in smart manufacturing, a reduced-order TNM was constructed by the Model Order Reduction (MOR) technique and implemented in a real-time embedded system.
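
    A single node of such a lumped-parameter model reduces to dT/dt = (P − (T − T_amb)/R_th)/C_th. The forward-Euler sketch below uses illustrative constants, not the identified spindle parameters from the paper.

```python
# One-node thermal network: heat load P into a thermal capacitance C_th,
# leaking to ambient through a thermal resistance R_th. Values assumed.
R_th, C_th, T_amb = 0.8, 900.0, 25.0      # K/W, J/K, degC
P = 60.0                                  # W, e.g. speed-dependent bearing loss
dt, T = 1.0, 25.0
for _ in range(7200):                     # two hours of warm-up
    T += dt * (P - (T - T_amb) / R_th) / C_th
print(f"steady-state ~ {T_amb + P * R_th:.1f} degC, simulated {T:.1f} degC")
```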

  15. Estimated reductions in hospitalizations and deaths from childhood diarrhea following implementation of rotavirus vaccination in Africa.

    Science.gov (United States)

    Shah, Minesh P; Tate, Jacqueline E; Mwenda, Jason M; Steele, A Duncan; Parashar, Umesh D

    2017-10-01

    Rotavirus is the leading cause of hospitalizations and deaths from diarrhea. 33 African countries had introduced rotavirus vaccines by 2016. We estimate reductions in rotavirus hospitalizations and deaths for countries using rotavirus vaccination in national immunization programs and the potential of vaccine introduction across the continent. Areas covered: Regional rotavirus burden data were reviewed to calculate hospitalization rates, which were applied to the under-5 population to estimate baseline hospitalizations. Rotavirus mortality was based on 2013 WHO estimates. Regional pre-licensure vaccine efficacy and post-introduction vaccine effectiveness studies were used to estimate summary effectiveness, and vaccine coverage was applied to calculate prevented hospitalizations and deaths. Uncertainties around input parameters were propagated using bootstrapping simulations. In the 29 African countries that introduced rotavirus vaccination before the end of 2014, 134,714 (IQR 112,321-154,654) hospitalizations and 20,986 (IQR 18,924-22,822) deaths were prevented in 2016. If all African countries had introduced rotavirus vaccines at benchmark immunization coverage, 273,619 (47%) (IQR 227,260-318,102) hospitalizations and 47,741 (39%) (IQR 42,822-52,462) deaths would have been prevented. Expert commentary: Rotavirus vaccination has substantially reduced hospitalizations and deaths in Africa; further reductions are anticipated as additional countries implement vaccination. These estimates bolster wider introduction and continued support of rotavirus vaccination programs.
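
    The estimation logic (baseline events × effectiveness × coverage, with uncertainty propagated by resampling) can be sketched as below. All input values and distributions are illustrative placeholders, not the study's region-specific data.

```python
import numpy as np

# Bootstrap-style uncertainty propagation for prevented hospitalizations.
rng = np.random.default_rng(7)
baseline = 300_000                                   # assumed annual hospitalizations
eff = rng.normal(0.55, 0.05, 10_000).clip(0, 1)      # assumed vaccine effectiveness
cov = rng.normal(0.80, 0.06, 10_000).clip(0, 1)      # assumed immunization coverage
prevented = baseline * eff * cov
q25, q50, q75 = np.percentile(prevented, [25, 50, 75])
print(f"median {q50:,.0f} prevented (IQR {q25:,.0f}-{q75:,.0f})")
```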

  16. Reduction of variance in spectral estimates for correction of ultrasonic aberration.

    Science.gov (United States)

    Astheimer, Jeffrey P; Pilkington, Wayne C; Waag, Robert C

    2006-01-01

    A variance reduction factor is defined to describe the rate of convergence and accuracy of spectra estimated from overlapping ultrasonic scattering volumes when the scattering is from a spatially uncorrelated medium. Assuming that the individual volumes are localized by a spherically symmetric Gaussian window and that centers of the volumes are located on orbits of an icosahedral rotation group, the factor is minimized by adjusting the weight and radius of each orbit. Conditions necessary for the application of the variance reduction method, particularly for statistical estimation of aberration, are examined. The smallest possible value of the factor is found by allowing an unlimited number of centers constrained only to be within a ball rather than on icosahedral orbits. Computations using orbits formed by icosahedral vertices, face centers, and edge midpoints with a constraint radius limited to a small multiple of the Gaussian width show that a significant reduction of variance can be achieved from a small number of centers in the confined volume and that this reduction is nearly the maximum obtainable from an unlimited number of centers in the same volume.

  17. Planning Tools For Estimating Radiation Exposure At The National Ignition Facility

    International Nuclear Information System (INIS)

    Verbeke, J.; Young, M.; Brereton, S.; Dauffy, L.; Hall, J.; Hansen, L.; Khater, H.; Kim, S.; Pohl, B.; Sitaraman, S.

    2010-01-01

    A set of computational tools was developed to help estimate and minimize potential radiation exposure to workers from material activation in the National Ignition Facility (NIF). AAMI (Automated ALARA-MCNP Interface) provides an efficient, automated mechanism to perform the series of calculations required to create dose rate maps for the entire facility with minimal manual user input. NEET (NIF Exposure Estimation Tool) is a web application that combines the information computed by AAMI with a given shot schedule to compute and display the dose rate maps as a function of time. AAMI and NEET are currently used as work planning tools to determine stay-out times for workers following a given shot or set of shots, and to help in estimating integrated doses associated with performing various maintenance activities inside the target bay. Dose rate maps of the target bay were generated following a low-yield 10^16 D-T shot and will be presented in this paper.

  18. Estimating the Condition of the Heat Resistant Lining in an Electrical Reduction Furnace

    Directory of Open Access Journals (Sweden)

    Jan G. Waalmann

    1988-01-01

    Full Text Available This paper presents a system for estimating the condition of the heat-resistant lining in an electrical reduction furnace for ferrosilicon. The system uses temperatures measured with thermocouples placed on the outside of the furnace pot. These measurements are used, together with a mathematical model of the temperature distribution in the lining, in a recursive least squares algorithm to estimate the position of 'the transformation front'. The system is part of a monitoring system which is being developed in the AIP project 'Condition monitoring of strongly exposed process equipment in the ferroalloy industry'. The estimator runs on-line, and results are presented in colour graphics on a display unit. The goal is to locate the transformation front with an accuracy of ±5 cm.
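
    Recursive least squares updates a parameter estimate one measurement at a time, which is what makes the estimator above able to run on-line. The scalar sketch below assumes a toy linear measurement model; the furnace's actual temperature model is far richer.

```python
import numpy as np

# Scalar RLS: estimate an unknown parameter (a stand-in for the front
# position) from noisy measurements y = h * theta + v, noise variance
# normalized to 1 in the gain. Model and values are assumptions.
rng = np.random.default_rng(2)
theta_true = 0.35                  # "front position" (m), unknown to the filter
theta, P = 0.0, 100.0              # initial estimate and covariance
for k in range(200):
    h = 1.0 + 0.5 * np.sin(0.05 * k)           # known, time-varying regressor
    y = h * theta_true + rng.normal(0, 0.02)   # noisy measurement
    K = P * h / (1.0 + h * P * h)              # RLS gain
    theta += K * (y - h * theta)               # correct with the residual
    P = (1 - K * h) * P                        # covariance update
print(f"estimated front position: {theta:.3f} m (true {theta_true} m)")
```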

  19. SBML-PET-MPI: a parallel parameter estimation tool for Systems Biology Markup Language based models.

    Science.gov (United States)

    Zi, Zhike

    2011-04-01

    Parameter estimation is crucial for the modeling and dynamic analysis of biological systems. However, implementing parameter estimation is time consuming and computationally demanding. Here, we introduced a parallel parameter estimation tool for Systems Biology Markup Language (SBML)-based models (SBML-PET-MPI). SBML-PET-MPI allows the user to perform parameter estimation and parameter uncertainty analysis by collectively fitting multiple experimental datasets. The tool is developed and parallelized using the message passing interface (MPI) protocol, which provides good scalability with the number of processors. SBML-PET-MPI is freely available for non-commercial use at http://www.bioss.uni-freiburg.de/cms/sbml-pet-mpi.html or http://sites.google.com/site/sbmlpetmpi/.

  20. The cardiovascular event reduction tool (CERT)--a simplified cardiac risk prediction model developed from the West of Scotland Coronary Prevention Study (WOSCOPS).

    Science.gov (United States)

    L'Italien, G; Ford, I; Norrie, J; LaPuerta, P; Ehreth, J; Jackson, J; Shepherd, J

    2000-03-15

    The clinical decision to treat hypercholesterolemia is premised on an awareness of patient risk, and cardiac risk prediction models offer a practical means of determining such risk. However, these models are based on observational cohorts where estimates of the treatment benefit are largely inferred. The West of Scotland Coronary Prevention Study (WOSCOPS) provides an opportunity to develop a risk-benefit prediction model from the actual observed primary event reduction seen in the trial. Five-year Cox model risk estimates were derived from all WOSCOPS subjects (n = 6,595 men, aged 45 to 64 years at baseline) using factors previously shown to be predictive of definite fatal coronary heart disease or nonfatal myocardial infarction. Model risk factors included age, diastolic blood pressure, total cholesterol/high-density lipoprotein ratio (TC/HDL), current smoking, diabetes, family history of fatal coronary heart disease, nitrate use or angina, and treatment (placebo/40-mg pravastatin). All risk factors were expressed as categorical variables to facilitate risk assessment. Risk estimates were incorporated into a simple, hand-held slide rule or risk tool. Risk estimates were identified for 5-year age bands (45 to 65 years), four categories of TC/HDL ratio (the highest being ≥ 7.5), two levels of diastolic blood pressure (< 90 or ≥ 90 mm Hg), from 0 to 3 additional risk factors (current smoking, diabetes, family history of premature fatal coronary heart disease, nitrate use or angina), and pravastatin treatment. Five-year risk estimates ranged from 2% in very low-risk subjects to 61% in very high-risk subjects. Risk reduction due to pravastatin treatment averaged 31%. Thus, the Cardiovascular Event Reduction Tool (CERT) is a risk prediction model derived from the WOSCOPS trial. Its use will help physicians identify patients who will benefit from cholesterol reduction.

  1. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    Science.gov (United States)

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main

  2. A tool for the estimation of the distribution of landslide area in R

    Science.gov (United States)

    Rossi, M.; Cardinali, M.; Fiorucci, F.; Marchesini, I.; Mondini, A. C.; Santangelo, M.; Ghosh, S.; Riguer, D. E. L.; Lahousse, T.; Chang, K. T.; Guzzetti, F.

    2012-04-01

    We have developed a tool in R (the free software environment for statistical computing, http://www.r-project.org/) to estimate the probability density and the frequency density of landslide area. The tool implements parametric and non-parametric approaches to the estimation of the probability density and the frequency density of landslide area, including: (i) Histogram Density Estimation (HDE), (ii) Kernel Density Estimation (KDE), and (iii) Maximum Likelihood Estimation (MLE). The tool is available as a standard Open Geospatial Consortium (OGC) Web Processing Service (WPS), and is accessible through the web using different GIS software clients. We tested the tool to compare Double Pareto and Inverse Gamma models for the probability density of landslide area in different geological, morphological and climatological settings, and to compare landslides shown in inventory maps prepared using different mapping techniques, including (i) field mapping, (ii) visual interpretation of monoscopic and stereoscopic aerial photographs, (iii) visual interpretation of monoscopic and stereoscopic VHR satellite images, and (iv) semi-automatic detection and mapping from VHR satellite images. Results show that both models are applicable in different geomorphological settings. In most cases the two models provided very similar results. Non-parametric estimation methods (i.e., HDE and KDE) provided reasonable results for all the tested landslide datasets. For some of the datasets, MLE failed to provide a result owing to convergence problems. The two tested models (Double Pareto and Inverse Gamma) gave very similar results for large and very large datasets (> 150 samples). Differences in the modeling results were observed for small datasets affected by systematic biases. A distinct rollover was observed in all analyzed landslide datasets, except for a few datasets obtained from landslide inventories prepared through field mapping or by semi-automatic mapping from VHR satellite imagery
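
    Although the tool itself is written in R, two of the estimator families it exposes are easy to sketch; the Python snippet below (chosen for consistency with the other examples in this listing) fits an Inverse Gamma model by maximum likelihood and a kernel density estimate to synthetic landslide areas.

```python
import numpy as np
from scipy import stats

# KDE and MLE fitting of landslide areas (m^2). The synthetic sample is a
# placeholder for a real inventory; parameters are assumptions.
rng = np.random.default_rng(6)
areas = stats.invgamma.rvs(a=1.4, scale=800.0, size=500, random_state=rng)

kde = stats.gaussian_kde(np.log10(areas))            # non-parametric, log space
a, loc, scale = stats.invgamma.fit(areas, floc=0)    # MLE of the parametric model
print(f"MLE Inverse Gamma: shape={a:.2f}, scale={scale:.0f}")
print("KDE density at 10^3 m^2 (log10 units):", kde(3.0)[0])
```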

  3. Tool to estimate optical metrics from summary wave-front analysis data in the human eye

    NARCIS (Netherlands)

    Jansonius, Nomdo M.

    Purpose: Studies in the field of cataract and refractive surgery often report only summary wave-front analysis data: data that are too condensed to allow for a retrospective calculation of metrics relevant to visual perception. The aim of this study was to develop a tool that can be used to estimate

  4. Noise reduction and estimation in multiple micro-electro-mechanical inertial systems

    International Nuclear Information System (INIS)

    Waegli, Adrian; Skaloud, Jan; Guerrier, Stéphane; Parés, Maria Eulàlia; Colomina, Ismael

    2010-01-01

    This research studies the reduction and the estimation of the noise level within a redundant configuration of low-cost (MEMS-type) inertial measurement units (IMUs). First, independent observations between units and sensors are assumed, and the theoretical decrease in the system noise level is analyzed in an experiment with four MEMS-IMU triads. Then, more complex scenarios are presented in which the noise level can vary in time and for each sensor. A statistical method employed for studying the volatility of financial markets (GARCH) is adapted and tested for use with inertial data. This paper demonstrates experimentally and through simulations the benefit of direct noise estimation in redundant IMU setups.
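
    The baseline result for independent sensors is the familiar sqrt(N) reduction: averaging N identical IMUs divides the noise standard deviation by sqrt(N). The simulation below checks this for four units with an assumed white-noise level.

```python
import numpy as np

# Averaging four independent gyros: fused noise std should be sigma/sqrt(4).
rng = np.random.default_rng(3)
sigma, N, n = 0.05, 4, 100_000          # rad/s noise (assumed), units, samples
gyros = rng.normal(0.0, sigma, (N, n))  # all units observe the same true rate (0)
fused = gyros.mean(axis=0)
print(f"single-unit std {gyros[0].std():.4f}, fused std {fused.std():.4f}, "
      f"expected {sigma / np.sqrt(N):.4f}")
```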

  5. Reducing catheter-related thrombosis using a risk reduction tool centered on catheter to vessel ratio.

    Science.gov (United States)

    Spencer, Timothy R; Mahoney, Keegan J

    2017-11-01

    In vascular access practice, the internal vessel size is considered important, and a catheter-to-vessel ratio (CVR) is recommended to assist clinicians in selecting the most appropriately sized device for the vessel. In 2016, new practice recommendations stated that the CVR can increase from 33 to 45% of the vessel's diameter. Recent literature provides evidence linking larger-diameter catheters to increased thrombosis risk, while insufficient information has been established on what relationship to vessel size is appropriate for any intravascular device. Earlier clinical standards and guidelines did not clearly address vessel size in relation to the area consumed or the external catheter diameter. The aim of this manuscript is to present catheter-related thrombosis evidence and develop a standardized process of ultrasound-guided vessel assessment, integrating CVR, Virchow's triad phenomenon and vessel health and preservation strategies, empowering an evidence-based approach to device placement. Through review, calculation and assessment of the 33 and 45% rules, a preliminary clinical tool was developed to assist clinicians in making cognizant decisions when placing intravascular devices relating to target vessel size, focusing on a potential reduction in catheter-related thrombosis. Increasing the understanding and utilization of CVRs will lead to a safer, more consistent approach to device placement, with potential thrombosis reduction strategies. The future of evidence-based data relies on the clinician to capture accurate vessel measurements and device-related outcomes. This will lead to a more dependable data pool, driving the relationship of catheter-related thrombosis and vascular assessment.
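
    By diameter, the ratio is straightforward to compute once the catheter's French size is converted to millimetres (1 Fr = 1/3 mm outer diameter). The worked example below, with a hypothetical 5 Fr catheter in a 4 mm vein, is illustrative only and not clinical guidance.

```python
def cvr(catheter_french: float, vessel_diameter_mm: float) -> float:
    """Catheter-to-vessel ratio by diameter (1 French = 1/3 mm)."""
    return (catheter_french / 3.0) / vessel_diameter_mm

ratio = cvr(5, 4.0)            # hypothetical 5 Fr device in a 4 mm vein
limit = 0.45                   # the 45% recommendation discussed above
print(f"CVR = {ratio:.0%} ->", "within" if ratio <= limit else "exceeds",
      "the 45% guideline")
```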

  6. Technical Note: On the efficiency of variance reduction techniques for Monte Carlo estimates of imaging noise.

    Science.gov (United States)

    Sharma, Diksha; Sempau, Josep; Badano, Aldo

    2018-02-01

    Monte Carlo simulations require a large number of histories to obtain reliable estimates of the quantity of interest and its associated statistical uncertainty. Numerous variance reduction techniques (VRTs) have been employed to increase computational efficiency by reducing the statistical uncertainty. We investigate the effect of two VRTs for optical transport methods on accuracy and computing time for the estimation of variance (noise) in x-ray imaging detectors. We describe two VRTs. In the first, we preferentially alter the direction of the optical photons to increase detection probability. In the second, we follow only a fraction of the total optical photons generated. In both techniques, the statistical weight of photons is altered to maintain the signal mean. We use fastdetect2, an open-source, freely available optical transport routine from the hybridmantis package. We simulate VRTs for a variety of detector models and energy sources. The imaging data from the VRT simulations are then compared to the analog case (no VRT) using pulse height spectra, the Swank factor, and the variance of the Swank estimate. We analyze the effect of VRTs on the statistical uncertainty associated with Swank factors. VRTs increased the relative efficiency by as much as a factor of 9. We demonstrate that we can achieve the same variance of the Swank factor with less computing time. With this approach, the simulations can be stopped when the variance of the variance estimates reaches the desired level of uncertainty. We implemented analytic estimates of the variance of the Swank factor and demonstrated the effect of VRTs on image quality calculations. Our findings indicate that the Swank factor is dominated by the x-ray interaction profile as compared to the additional uncertainty introduced in the optical transport by the use of VRTs. For simulation experiments that aim at reducing the uncertainty in the Swank factor estimate, any of the proposed VRTs can be used for increasing the relative
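
    The first VRT, directional biasing, is a form of importance sampling: sample photon directions from a distribution concentrated near the detector and weight each history by the ratio of the true to the biased density. The toy example below (isotropic emission, a detector cone at cos(theta) > 0.95; all values assumed) shows the mean preserved while the standard error drops.

```python
import numpy as np

rng = np.random.default_rng(4)
n, cut = 100_000, 0.95

# Analog: isotropic emission, cos(theta) uniform on [-1, 1]; true P = 0.025.
mu = rng.uniform(-1, 1, n)
analog = (mu > cut).astype(float)

# Biased: directions drawn near the detector, cos(theta) uniform on [0.9, 1].
# Weight = true pdf / biased pdf = (1/2) / (1/0.1) = 0.05, preserving the mean.
mu_b = rng.uniform(0.9, 1.0, n)
w = (1 / 2) / (1 / 0.1)
vrt = w * (mu_b > cut)

print(f"analog mean {analog.mean():.5f}, std-err {analog.std()/np.sqrt(n):.5f}")
print(f"biased mean {vrt.mean():.5f}, std-err {vrt.std()/np.sqrt(n):.5f}")
```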

  7. Development, Validation, and Verification of a Self-Assessment Tool to Estimate Agnibala (Digestive Strength).

    Science.gov (United States)

    Singh, Aparna; Singh, Girish; Patwardhan, Kishor; Gehlot, Sangeeta

    2017-01-01

    According to Ayurveda, the traditional system of healthcare of Indian origin, Agni is the factor responsible for digestion and metabolism. Four functional states (Agnibala) of Agni have been recognized: regular, irregular, intense, and weak. The objective of the present study was to develop and validate a self-assessment tool to estimate Agnibala. The developed tool was evaluated for its reliability and validity by administering it to 300 healthy volunteers of either gender belonging to the 18- to 40-year age group. Besides confirming the statistical validity and reliability, the practical utility of the newly developed tool was also evaluated by recording the serum lipid parameters of all the volunteers. The results show that the lipid parameters vary significantly according to the status of Agni. The tool, therefore, may be used to screen the normal population to look for possible susceptibility to certain health conditions. © The Author(s) 2016.

  8. Estimating the fiscal effects of public pharmaceutical expenditure reduction in Greece

    Directory of Open Access Journals (Sweden)

    Kyriakos Souliotis

    2015-08-01

    Full Text Available The purpose of the present study is to estimate the impact of pharmaceutical spending reduction on public revenue, based on data from the national health accounts as well as on reports of Greece's organizations. The methodology of the analysis is structured in two basic parts. The first part presents the urgency for rapid cutbacks on public pharmaceutical costs due to the financial crisis and provides a conceptual framework for the contribution of the Greek pharmaceutical branch to the country's economy. In the second part, we perform a quantitative analysis for the estimation of multiplier effects of public pharmaceutical expenditure reduction on main revenue sources such as taxes and social contributions. We also fit projection models with multipliers as regressands for the evaluation of the efficiency of the particular fiscal measure in the short run. According to the results, nearly half of the gains from the measure's application is offset by financially equivalent decreases in the government's revenue, i.e., losses in tax revenues and social security contributions alone, not considering any other direct or indirect costs. The findings of the multipliers' high value and increasing short-term trend imply the measure's inefficiency henceforward and signal the risk of vicious circles that will deprive the economy of useful resources.

  9. Estimating the Fiscal Effects of Public Pharmaceutical Expenditure Reduction in Greece.

    Science.gov (United States)

    Souliotis, Kyriakos; Papageorgiou, Manto; Politi, Anastasia; Frangos, Nikolaos; Tountas, Yiannis

    2015-01-01

    The purpose of the present study is to estimate the impact of pharmaceutical spending reduction on public revenue, based on data from the national health accounts as well as on reports of Greece's organizations. The methodology of the analysis is structured in two basic parts. The first part presents the urgency for rapid cutbacks on public pharmaceutical costs due to the financial crisis and provides a conceptual framework for the contribution of the Greek pharmaceutical branch to the country's economy. In the second part, we perform a quantitative analysis for the estimation of multiplier effects of public pharmaceutical expenditure reduction on main revenue sources, such as taxes and social contributions. We also fit projection models with multipliers as regressands for the evaluation of the efficiency of the particular fiscal measure in the short run. According to the results, nearly half of the gains from the measure's application is offset by financially equivalent decreases in the government's revenue, i.e., losses in tax revenues and social security contributions alone, not considering any other direct or indirect costs. The findings of multipliers' high value and increasing short-term trend imply the measure's inefficiency henceforward and signal the risk of vicious circles that will provoke the economy's deprivation of useful resources.

  10. An adaptive observer for on-line tool wear estimation in turning, Part I: Theory

    Science.gov (United States)

    Danai, Kourosh; Ulsoy, A. Galip

    1987-04-01

    On-line sensing of tool wear has been a long-standing goal of the manufacturing engineering community. In the absence of any reliable on-line tool wear sensors, a new model-based approach for tool wear estimation has been proposed. This approach is an adaptive observer, based on force measurement, which uses both parameter and state estimation techniques. The design of the adaptive observer is based upon a dynamic state model of tool wear in turning. This paper (Part I) presents the model, and explains its use as the basis for the adaptive observer design. This model uses flank wear and crater wear as state variables, feed as the input, and the cutting force as the output. The suitability of the model as the basis for adaptive observation is also verified. The implementation of the adaptive observer requires the design of a state observer and a parameter estimator. To obtain the model parameters for tuning the adaptive observer, procedures for linearisation of the non-linear model are specified. The implementation of the adaptive observer in turning and experimental results are presented in a companion paper (Part II).
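
    In its simplest form, such an observer predicts the wear state from a growth model and corrects it with the force residual. The scalar sketch below assumes a toy linear force-wear relation and a hand-picked gain; the paper's actual turning model has two wear states and estimated parameters.

```python
import numpy as np

# Minimal state-observer sketch: flank wear w drives the cutting force
# F = c*w + F0; the observer predicts wear growth and corrects from the
# force residual. All constants and the gain L are illustrative assumptions.
rng = np.random.default_rng(8)
c, F0, L = 50.0, 200.0, 0.02        # N/mm, N, observer gain
w_true, w_hat = 0.0, 0.0
for _ in range(3000):
    w_true += 1e-4                               # true wear growth per step
    F = c * w_true + F0 + rng.normal(0, 1.0)     # measured cutting force
    w_hat += 1e-4 + L * (F - (c * w_hat + F0))   # predict, then correct
print(f"true wear {w_true:.3f} mm, estimated {w_hat:.3f} mm")
```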

  11. POETICS OF TRANSCENDENCE: STYLISTIC REDUCTION AS A TOOL FOR REPRESENTATION OF SACRED MEANINGS

    Directory of Open Access Journals (Sweden)

    Elena Brazgovskaya

    2016-10-01

    Full Text Available The main direction of the work concerns the representation of abstract (transcendent) objects in music and literature. The article analyses "Cantus in Memoriam Benjamin Britten" by Arvo Pärt and some poems by Czesław Miłosz. The metaphysical dimension of reality involves forms and things existing beyond the boundaries of empirical perception and, at first sight, beyond descriptive practices. Abstract objects are available in intellectual experience, but culture must transform them into a symbolic form. As a rule, this is connected to the practice of artistic minimalism. The essence of minimalism is the reduction of the number of stylistic tools and the "purification" of perception from visual/auditory images (a non-mimetic use of language). For the representation of the sacred, Pärt uses only the mensural canon form, the scale, and the chord. These "characters" are deprived of a descriptive function but have symbolic potential (the canon as a sign of stopped time, of the eternal return). The distinctive feature of Miłosz's style is the pursuit of "cleaning" the signs (indexical and symbolic). There is a reverse side to this language distillation: the rejection of the subjective position and of emotional experience, and the distance between the person and the object of representation.

  12. Environmental isotope balance of Lake Kinneret as a tool in evaporation rate estimation

    International Nuclear Information System (INIS)

    Lewis, S.

    1979-01-01

    The balance of environmental isotopes in Lake Kinneret has been used to obtain an independent estimate of the mean monthly evaporation rate. Direct calculation was precluded by the inadequacy of the isotope data in uniquely representing the system behaviour throughout the annual cycle. The approach adopted uses an automatic algorithm to seek an objective best fit of the isotope balance model to measured oxygen-18 data by optimizing the evaporation rate as a parameter. To this end, evaporation is described as a periodic function with two parameters. The sensitivity of the evaporation rate estimates to parameter uncertainty and data errors is stressed. Error analysis puts confidence limits on the estimates obtained. Projected improvements in data collection and analysis show that a significant reduction in uncertainty can be realized. Relative to energy balance estimates, currently obtainable data result in about 30% uncertainty. The most optimistic scenario would yield about 15% relative uncertainty. (author)

  13. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Science.gov (United States)

    Ralston, Mark E; Myatt, Mark A

    2016-01-01

    A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings, where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. To determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low-to-middle-income countries. This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design, in locations with a high prevalence of acute and chronic malnutrition. A total of 453,990 children met the inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) and exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Weight was estimated using the Broselow Tape, the Hong Kong formula, and, from the database, MUAC alone, height alone, and height and MUAC combined. The mean percentage difference between true and estimated weight, the proportion of estimates accurate to within ± 25% and ± 10% of true weight, the weighted Kappa statistic, and the Bland-Altman bias were reported as measures of tool accuracy. The standard deviation of the mean percentage difference and the Bland-Altman 95% limits of agreement were reported as measures of tool precision. Database height was a more accurate and precise predictor of weight than the Broselow Tape 2007 [B], the Broselow Tape 2011 [A], and MUAC. The mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); the proportion of estimates accurate to within ± 25% of true weight was 97.36% (95% CI 97.40%, 97.46%); and

  14. Carbon Footprint Estimation Tool for Residential Buildings for Non-Specialized Users: OERCO2 Project

    Directory of Open Access Journals (Sweden)

    Jaime Solís-Guzmán

    2018-04-01

    Full Text Available Existing tools for the environmental certification of buildings are failing in their ability to reach the general public and to create social awareness, since they require not only specialized knowledge regarding construction and energy sources, but also environmental knowledge. In this paper, an open-source online tool for the estimation of the carbon footprint of residential buildings by non-specialized users is presented as a product of the OERCO2 Erasmus+ project. The internal calculations, data management and operation of this tool are extensively explained. The ten most common building typologies built in the last decade in Spain are analysed using the OERCO2 tool, and the order of magnitude of the results is analysed by comparing them to the ranges determined by other authors. The OERCO2 tool proves to be reliable, with its results falling within the defined logical value ranges. Moreover, the major simplification of its interface allows non-specialized users to evaluate the sustainability of buildings. Further research is oriented towards its inclusion in other environmental certification tools and in Building Information Modeling (BIM) environments.

  15. Reducing uncertainty of estimated nitrogen load reductions to aquatic systems through spatially targeting agricultural mitigation measures using groundwater nitrogen reduction

    DEFF Research Database (Denmark)

    Hashemi, Fatemeh; Olesen, Jørgen Eivind; Jabloun, Mohamed

    2018-01-01

    The need to further abate agricultural nitrate (N)-loadings to coastal waters in Denmark represents the main driver for development of a new spatially targeted regulation that focuses on locating N-mitigation measures in agricultural areas with high N-load. This targeting makes use of the spatial variation across the landscape in natural N-reduction (denitrification) of leached nitrate in the groundwater and surface water systems. A critical basis for including spatial targeting in regulation of N-load in Denmark is the uncertainty associated with the effect of spatially targeted measures, since the effect will be critically affected by uncertainty in the quantification of the spatial variation in N-reduction. In this study, we used 30 equally plausible N-reduction maps, at 100 m grid and sub-catchment resolutions, for the 85-km2 groundwater-dominated Norsminde catchment in Denmark, applying set...

  16. Regression tools for CO2 inversions: application of a shrinkage estimator to process attribution

    International Nuclear Information System (INIS)

    Shaby, Benjamin A.; Field, Christopher B.

    2006-01-01

    In this study we perform an atmospheric inversion based on a shrinkage estimator. This method is used to estimate surface fluxes of CO2, first partitioned according to constituent geographic regions, and then according to constituent processes that are responsible for the total flux. Our approach differs from previous approaches in two important ways. The first is that the technique of linear Bayesian inversion is recast as a regression problem. Seen as such, standard regression tools are employed to analyse and reduce errors in the resultant estimates. A shrinkage estimator, which combines standard ridge regression with the linear 'Bayesian inversion' model, is introduced. This method introduces additional bias into the model with the aim of reducing variance such that errors are decreased overall. Compared with standard linear Bayesian inversion, the ridge technique seems to reduce both flux estimation errors and prediction errors. The second divergence from previous studies is that instead of dividing the world into geographically distinct regions and estimating the CO2 flux in each region, the flux space is divided conceptually into processes that contribute to the total global flux. Formulating the problem in this manner adds to the interpretability of the resultant estimates and attempts to shed light on the problem of attributing sources and sinks to their underlying mechanisms.
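
    A minimal sketch (not the paper's code) of the shrinkage idea described above: the ridge estimate solves (X'X + lambda*I) beta = X'y, accepting some bias to reduce variance; lambda = 0 recovers ordinary least squares. The toy data below, with two nearly collinear "source" basis columns, are made up.

      import numpy as np

      def ridge_estimate(X, y, lam):
          """Shrinkage (ridge) estimate: beta = (X'X + lam*I)^-1 X'y."""
          return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

      rng = np.random.default_rng(0)
      X = rng.normal(size=(50, 2))
      X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=50)     # nearly collinear columns
      y = X @ np.array([1.0, -1.0]) + 0.1 * rng.normal(size=50)

      print(ridge_estimate(X, y, 0.0))   # OLS: unstable under collinearity
      print(ridge_estimate(X, y, 1.0))   # ridge: shrunken, lower-variance estimate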

  17. A Cost Simulation Tool for Estimating the Cost of Operating Government Owned and Operated Ships

    Science.gov (United States)

    1994-09-01

    [No abstract is recoverable from this record; the text consists of report cover-page residue and reference fragments, e.g. Horngren, C.T., Foster, G., Datar, S.M., Cost Accounting: A Managerial Emphasis, Prentice-Hall, Englewood Cliffs, NJ, 1994.]

  18. Comparing Fatigue Life Estimations of Composite Wind Turbine Blades using different Fatigue Analysis Tools

    DEFF Research Database (Denmark)

    Ardila, Oscar Gerardo Castro; Lennie, Matthew; Branner, Kim

    2015-01-01

    In this paper, the fatigue lifetime prediction of the NREL 5 MW reference wind turbine is presented. The fatigue response of materials used in selected blade cross sections was obtained by applying macroscopic fatigue approaches and assuming uniaxial stress states. Power production and parked load cases suggested by the IEC 61400-1 standard were studied employing different load time intervals and by using two novel fatigue tools called ALBdeS and BECAS+F. The aeroelastic loads were defined through aeroelastic simulations performed with both the FAST and HAWC2 tools. The stress spectra at each layer were calculated employing laminated composite theory and beam cross-section methods. The Palmgren-Miner linear damage rule was used to calculate the accumulated damage. The theoretical results produced by both fatigue tools showed a prominent effect of the analysed design load conditions on the estimated lifetime...
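
    A minimal sketch (not from the paper, with made-up cycle counts) of the Palmgren-Miner linear damage rule cited above: the accumulated damage is D = sum_i n_i / N_i, where n_i is the number of applied cycles in stress-range bin i and N_i the number of cycles to failure at that range; failure is predicted at D >= 1.

      def miner_damage(applied_cycles, cycles_to_failure):
          """Palmgren-Miner linear damage rule: D = sum(n_i / N_i)."""
          return sum(n / N for n, N in zip(applied_cycles, cycles_to_failure))

      # Hypothetical load spectrum (three stress-range bins):
      n_i = [1e6, 5e5, 1e4]      # applied cycles per bin
      N_i = [1e8, 1e7, 1e5]      # allowable cycles per bin (from an S-N curve)
      D = miner_damage(n_i, N_i)
      print(f"damage D = {D:.3f}, predicted lifetimes = {1 / D:.1f}")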

  19. Estimation and reduction of CO2 emissions from crude oil distillation units

    International Nuclear Information System (INIS)

    Gadalla, M.; Olujic, Z.; Jobson, M.; Smith, R.

    2006-01-01

    Distillation systems are energy-intensive processes and consequently contribute significantly to greenhouse gas emissions (e.g., carbon dioxide, CO2). A simple model for the estimation of the CO2 emissions associated with the operation of heat-integrated distillation systems, as encountered in refineries, is introduced. In conjunction with a shortcut distillation model, this model has been used to optimize the process conditions of an existing crude oil atmospheric tower unit, aiming at the minimization of CO2 emissions. Simulation results indicate that the total CO2 emissions of the existing crude oil unit can be cut by 22% just by changing the process conditions accordingly, and that the gain can be doubled by integrating a gas turbine. In addition, the emissions reduction is accompanied by a substantial profit increase due to utility savings and/or export.

  20. Estimating the benefits of greenhouse gas emission reduction from agricultural policy reform

    International Nuclear Information System (INIS)

    Adger, W.N.; Moran, D.C.

    1993-01-01

    Land use and agricultural activities contribute directly to the increased concentrations of atmospheric greenhouse gases. Economic support in industrialized countries generally increases agriculture's contribution to global greenhouse gas concentrations through fluxes associated with land use change and other sources. Changes in economic support offer opportunities to reduce net emissions, though this has so far gone unaccounted. Estimates are presented here of emissions of methane from livestock in the UK; they show that, in monetary terms, greenhouse gases are a significant factor when compared to the costs of reducing support. As signatory parties to the Climate Change Convention are required to stabilize emissions of all greenhouse gases, options for the reduction of emissions of methane and other trace gases from the agricultural sector should form part of these strategies.

  1. Estimating effectiveness of crop management for reduction of soil erosion and runoff

    Science.gov (United States)

    Hlavcova, K.; Studvova, Z.; Kohnova, S.; Szolgay, J.

    2017-10-01

    The paper focuses on erosion processes in the Svacenický Creek catchment, a small sub-catchment of the Myjava River basin. To simulate soil loss and sediment transport, the USLE/SDR and WaTEM/SEDEM models were applied. The models were validated by comparing the simulated results with the actual bathymetry of a polder at the catchment outlet. Crop management methods based on crop rotation and strip cropping were applied to reduce soil loss and sediment transport. The comparison shows that the greatest soil loss intensities occurred on bare soil without vegetation and under maize for corn; the lowest values occurred under winter wheat. Finally, the effectiveness of row crops and strip cropping for decreasing design floods from the catchment was estimated.
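
    A minimal sketch (not the paper's models, with hypothetical factor values) of the core USLE relation behind the USLE/SDR approach, A = R*K*LS*C*P, showing how the cover-management factor C drives the crop differences reported above:

      def usle_soil_loss(R, K, LS, C, P):
          """Universal Soil Loss Equation: A = R*K*LS*C*P (t/ha/yr)."""
          return R * K * LS * C * P

      # Hypothetical factors; only the cover-management factor C varies by crop.
      R, K, LS, P = 600.0, 0.03, 2.5, 1.0
      for crop, C in {"bare soil": 1.0, "maize": 0.38, "winter wheat": 0.12}.items():
          print(f"{crop:12s} A = {usle_soil_loss(R, K, LS, C, P):6.2f} t/ha/yr")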

  2. Development and application of a decision support tool for reduction of product losses in the food-processing industry

    NARCIS (Netherlands)

    Akkerman, Renzo; van Donk, Dirk Pieter

    2008-01-01

    In food-processing industries, reduction of product losses is important for improving profitability and sustainability. This paper presents a decision support tool for analyzing the effects of planning decisions on the amount of product losses in the food-processing industry. We created a research

  3. Statistical tools applied for the reduction of the defect rate of coffee degassing valves

    Directory of Open Access Journals (Sweden)

    Giorgio Olmi

    2015-04-01

    Full Text Available Coffee is a very common beverage exported all over the world: just after roasting, coffee beans are packed in plastic or paper bags, which then experience long transfers and long storage times. Freshly roasted coffee emits large amounts of CO2 for several weeks. This gas must be gradually released to prevent package over-inflation and to preserve the aroma; moreover, the beans must be protected from oxygen coming from outside. Therefore, one-way degassing valves are applied to each package: their correct functionality is strictly related to the interference coupling between their bodies and covers and to the correct assembly of the other involved parts. This work takes inspiration from an industrial problem: a company that assembles valve components, supplied by different manufacturers, observed a high defect rate affecting its valve production. An integrated approach, consisting of the adoption of quality charts, an experimental campaign for the dimensional analysis of the mating parts, and statistical processing of the data, was necessary to tackle the question. In particular, a simple statistical tool was made available to predict the defect rate and to identify the best strategy for its reduction. The outcome was that requiring a strict protocol regarding the combinations of parts from different manufacturers for assembly would have been almost ineffective. Conversely, this study led to the identification of the weak point in the manufacturing process of the mating components and to the suggestion of a slight improvement to be performed, with the final result of a significant (one order of magnitude) decrease of the defect rate.
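
    A minimal sketch (not the paper's tool, with made-up dimensions) of how a defect rate can be predicted from the dimensional statistics of the mating parts: assuming independent, normally distributed body and cover diameters, the interference is also normal, and the predicted defect rate is the probability mass outside the specification window.

      from statistics import NormalDist

      def defect_rate(mu_body, sd_body, mu_cover, sd_cover, lo, hi):
          """Fraction of couplings whose interference (body - cover) falls
          outside the [lo, hi] specification window."""
          interference = NormalDist(mu_body - mu_cover,
                                    (sd_body**2 + sd_cover**2) ** 0.5)
          return 1.0 - (interference.cdf(hi) - interference.cdf(lo))

      # Hypothetical diameters (mm) for parts from two manufacturers:
      print(f"{defect_rate(20.10, 0.02, 20.00, 0.03, 0.05, 0.15):.1%}")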

  4. A Web-Based Tool to Estimate Pollutant Loading Using LOADEST

    Directory of Open Access Journals (Sweden)

    Youn Shik Park

    2015-09-01

    Full Text Available Collecting and analyzing water quality samples is costly and typically requires significant effort compared to collecting streamflow data; thus, water quality data are typically collected at a low frequency. Regression models, identifying a relationship between streamflow and water quality data, are often used to estimate pollutant loads. A web-based tool using LOAD ESTimator (LOADEST) as a core engine, with four modules, was developed to provide user-friendly interfaces and input data collection via web access. The first module requests and receives streamflow and water quality data from the U.S. Geological Survey. The second module retrieves the watershed area for computation of pollutant loads per unit area. The third module examines potential errors in the input datasets for LOADEST runs, and the last module computes estimated and allowable annual average pollutant loads and provides tabular and graphical LOADEST outputs. The web-based tool was applied to two watersheds in this study, one agriculturally dominated and one urban dominated. It was found that the annual sediment load at the urban-dominated watershed exceeded the target load; therefore, the web-based tool correctly identified the watershed requiring best management practices to reduce pollutant loads.
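
    A minimal sketch (a simplification, not LOADEST itself, which fits multi-term models by AMLE and applies retransformation bias corrections) of the basic rating-curve idea: regress log load on log streamflow from sparse samples, then estimate loads for unsampled days. All values are made up.

      import numpy as np

      # Hypothetical paired observations: streamflow Q (m3/s), load L (kg/day)
      Q = np.array([1.2, 3.4, 5.1, 8.0, 12.5, 20.0])
      L = np.array([15.0, 60.0, 110.0, 210.0, 380.0, 700.0])

      # Simplest rating curve: ln(L) = a + b ln(Q), fitted by least squares
      b, a = np.polyfit(np.log(Q), np.log(L), 1)

      # Estimate loads for daily flows on unsampled days (bias correction omitted)
      Q_daily = np.array([2.0, 6.0, 15.0])
      print(np.exp(a + b * np.log(Q_daily)))   # estimated loads, kg/day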

  5. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    International Nuclear Information System (INIS)

    Boe, Timothy; Lemieux, Paul; Schultheisz, Daniel; Peake, Tom; Hayes, Colin

    2013-01-01

    Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri ArcGIS scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shapefiles defining contaminated areas and extent of contamination. Building stock information, including square footage, building counts, and building composition estimates, is then generated using the Federal Emergency Management Agency's (FEMA's) Hazus-MH software. WEST then identifies outdoor surfaces based on the application of pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)

  6. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    Energy Technology Data Exchange (ETDEWEB)

    Boe, Timothy [Oak Ridge Institute for Science and Education, Research Triangle Park, NC 27711 (United States); Lemieux, Paul [U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Schultheisz, Daniel; Peake, Tom [U.S. Environmental Protection Agency, Washington, DC 20460 (United States); Hayes, Colin [Eastern Research Group, Inc, Morrisville, NC 26560 (United States)

    2013-07-01

    Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri ArcGIS scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shapefiles defining contaminated areas and extent of contamination. Building stock information, including square footage, building counts, and building composition estimates, is then generated using the Federal Emergency Management Agency's (FEMA's) Hazus-MH software. WEST then identifies outdoor surfaces based on the application of pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)

  7. An error reduction algorithm to improve lidar turbulence estimates for wind energy

    Directory of Open Access Journals (Sweden)

    J. F. Newman

    2017-02-01

    Full Text Available Remote-sensing devices such as lidars are currently being investigated as alternatives to cup anemometers on meteorological towers for the measurement of wind speed and direction. Although lidars can measure mean wind speeds at heights spanning an entire turbine rotor disk and can be easily moved from one location to another, they measure different values of turbulence than an instrument on a tower. Current methods for improving lidar turbulence estimates include the use of analytical turbulence models and expensive scanning lidars. While these methods provide accurate results in a research setting, they cannot be easily applied to smaller, vertically profiling lidars in locations where high-resolution sonic anemometer data are not available. Thus, there is clearly a need for a turbulence error reduction model that is simpler and more easily applicable to lidars that are used in the wind energy industry. In this work, a new turbulence error reduction algorithm for lidars is described. The Lidar Turbulence Error Reduction Algorithm, L-TERRA, can be applied using only data from a stand-alone vertically profiling lidar and requires minimal training with meteorological tower data. The basis of L-TERRA is a series of physics-based corrections that are applied to the lidar data to mitigate errors from instrument noise, volume averaging, and variance contamination. These corrections are applied in conjunction with a trained machine-learning model to improve turbulence estimates from a vertically profiling WINDCUBE v2 lidar. The lessons learned from creating the L-TERRA model for a WINDCUBE v2 lidar can also be applied to other lidar devices. L-TERRA was tested on data from two sites in the Southern Plains region of the United States. The physics-based corrections in L-TERRA brought regression line slopes much closer to 1 at both sites and significantly reduced the sensitivity of lidar turbulence errors to atmospheric stability. The accuracy of machine
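
    A minimal sketch (not L-TERRA itself, with synthetic data) of the final step described above: after physics-based corrections, a trained machine-learning regressor maps lidar-derived features to the turbulence value a reference sonic anemometer would report.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(1)

      # Synthetic training set: lidar features (e.g. lidar TI, mean wind,
      # a stability proxy) paired with sonic-anemometer "truth".
      X_train = rng.normal(size=(500, 3))
      ti_sonic = (0.10 + 0.8 * X_train[:, 0] + 0.05 * X_train[:, 2]
                  + 0.02 * rng.normal(size=500))

      model = RandomForestRegressor(n_estimators=200, random_state=0)
      model.fit(X_train, ti_sonic)

      # Apply to new lidar observations after the physics-based corrections:
      X_new = rng.normal(size=(5, 3))
      print(model.predict(X_new))    # corrected turbulence estimates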

  8. PREMIM and EMIM: tools for estimation of maternal, imprinting and interaction effects using multinomial modelling

    Directory of Open Access Journals (Sweden)

    Howey Richard

    2012-06-01

    Full Text Available Background: Here we present two new computer tools, PREMIM and EMIM, for the estimation of parental and child genetic effects, based on genotype data from a variety of different child-parent configurations. PREMIM allows the extraction of child-parent genotype data from standard-format pedigree data files, while EMIM uses the extracted genotype data to perform the subsequent statistical analysis. The use of genotype data from the parents as well as from the child in question allows the estimation of complex genetic effects such as maternal genotype effects, maternal-foetal interactions and parent-of-origin (imprinting) effects. These effects are estimated by EMIM, incorporating chosen assumptions such as Hardy-Weinberg equilibrium or exchangeability of parental matings as required. Results: In application to simulated data, we show that the inference provided by EMIM is essentially equivalent to that provided by alternative (competing) software packages such as MENDEL and LEM. However, PREMIM and EMIM (used in combination) considerably outperform MENDEL and LEM in terms of speed and ease of execution. Conclusions: Together, EMIM and PREMIM provide easy-to-use command-line tools for the analysis of pedigree data, giving unbiased estimates of parental and child genotype relative risks.

  9. Comparing the Advanced REACH Tool's (ART) Estimates With Switzerland's Occupational Exposure Data.

    Science.gov (United States)

    Savic, Nenad; Gasic, Bojan; Schinkel, Jody; Vernez, David

    2017-10-01

    The Advanced REACH Tool (ART) is the most sophisticated tool used for evaluating exposure levels under the European Union's Registration, Evaluation, Authorisation and restriction of CHemicals (REACH) regulations. ART provides estimates at different percentiles of exposure and within different confidence intervals (CIs). However, its performance has only been tested on a limited number of exposure datasets. The present study compares ART's estimates with exposure measurements collected over many years in Switzerland. Measurements from 584 cases of exposure to vapours, mists, powders, and abrasive dusts (wood/stone and metal) were extracted from a Swiss database. The corresponding exposures at the 50th and 90th percentiles were calculated in ART. To characterize the model's performance, the 90% CI of the estimates was considered. ART's performance at the 50th percentile was found to be insufficiently conservative only for exposure to wood/stone dusts, whereas the 90th percentile showed sufficient conservatism for all the types of exposure processed. However, a trend was observed in the residuals: ART overestimated lower exposures and underestimated higher ones. The median was more precise, however, and the majority (≥60%) of real-world measurements were within a factor of 10 of ART's estimates. We provide recommendations based on the results and suggest further, more comprehensive investigations. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
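
    The two checks described above reduce to simple ratios of measured to estimated exposure. A minimal sketch (not from the study, with made-up numbers): the fraction of measurements at or below the estimate gauges conservatism, and the fraction within a factor of 10 gauges precision.

      import numpy as np

      measured = np.array([0.8, 2.0, 5.5, 12.0, 40.0])   # hypothetical measurements
      art_p50 = np.array([1.5, 2.2, 4.0, 15.0, 20.0])    # hypothetical ART medians

      ratio = measured / art_p50
      print(np.mean(ratio <= 1.0))                    # conservatism at the median
      print(np.mean(np.abs(np.log10(ratio)) <= 1.0))  # within a factor of 10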

  10. Effect of tube current modulation for dose estimation using a simulation tool on body CT examination

    International Nuclear Information System (INIS)

    Kawaguchi, Ai; Matsunaga, Yuta; Kobayashi, Masanao; Suzuki, Shoichi; Matsubara, Kosuke; Chida, Koichi

    2015-01-01

    The purpose of this study was to evaluate the effect of tube current modulation on dose estimation for a body computed tomography (CT) examination using a simulation tool. The authors also compared longitudinal variations in tube current values between iterative reconstruction (IR) and filtered back-projection (FBP) reconstruction algorithms. One hundred patients underwent body CT examinations. The tube current values around 10 organ regions were recorded longitudinally from the tube current information. The organ and effective doses were simulated using average tube current values and longitudinally modulated tube current values. The organ doses for the bladder and breast estimated using longitudinally modulated tube current values were 20% higher and 25% lower, respectively, than those estimated using the average tube current values. The differences in effective doses were small (mean, 0.7 mSv). The longitudinal variations in tube current values were almost the same for the IR and FBP algorithms. (authors)

  11. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    Science.gov (United States)

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomography (SPECT) images without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using the conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemispheres of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). During quantitative estimation using ACZ, the mean CBF values of the 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. While comparing the 24 3DSRT regions of 12 patients, the squares of the correlation

  12. Mathematical simulation for estimating reduction of breast cancer mortality in mass screening using mammography

    International Nuclear Information System (INIS)

    Iinuma, Takeshi; Matsumoto, Tohru; Tateno, Yukio

    1999-01-01

    In Japan it is considered that mammography should be introduced together with physical examination for the mass screening of breast cancer, instead of the physical examination alone that is performed at present. Before the introduction of mammography, a mathematical simulation should be performed to show the reduction in breast cancer mortality by mass screening compared with an unscreened population. A mathematical model of cancer screening devised by the authors was used to estimate the number of deaths due to breast cancer (A) in the screened group and those (B) in the unscreened group within the same population. The relative risk (RR) and attributable risk (RD) were then calculated as A/B and B-A, respectively. Three methods of mass screening were compared: (1) physical examination (1-year interval), (2) mammography with physical examination (1-year interval), and (3) mammography with physical examination (2-year interval). The calculated RR values were 0.85 for (1), 0.60 for (2) and 0.69 for (3). Assuming that the incidence of breast cancer was 100/10^5 person-years, the calculated RD values were 3.0, 8.1 and 6.2 persons/10^5 person-years for (1), (2) and (3), respectively. For all three methods the 95% confidence interval of RR extended above 1.0, and thus the reduction of breast cancer mortality was not statistically significant in the present population. In conclusion, mammography with physical examination may reduce breast cancer mortality in comparison with physical examination alone, but a larger number of women must be screened in order to obtain a significant RR value. (author)
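
    A minimal sketch (not the authors' simulation model) of the two effect measures defined above, with death rates chosen to reproduce the record's method (2) results (RR = 0.60, RD = 8.1 per 10^5 person-years):

      def screening_effect(a_screened, b_unscreened):
          """RR = A/B and RD = B - A, per the definitions in the record."""
          return a_screened / b_unscreened, b_unscreened - a_screened

      # Deaths per 10^5 person-years, chosen to match RR = 0.60 and RD = 8.1:
      rr, rd = screening_effect(12.15, 20.25)
      print(f"RR = {rr:.2f}, RD = {rd:.1f} per 10^5 person-years")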

  13. CoCoa: a software tool for estimating the coefficient of coancestry from multilocus genotype data.

    Science.gov (United States)

    Maenhout, Steven; De Baets, Bernard; Haesaert, Geert

    2009-10-15

    Phenotypic data collected in breeding programs and marker-trait association studies are often analyzed by means of linear mixed models. In these models, the covariance between the genetic background effects of all genotypes under study is modeled by means of pairwise coefficients of coancestry. Several marker-based coancestry estimation procedures allow this covariance matrix to be estimated, but generally introduce a certain amount of bias when the examined genotypes are part of a breeding program. CoCoa implements the most commonly used marker-based coancestry estimation procedures and, as such, allows selection of the best-fitting covariance structure for the phenotypic data at hand. This better model fit translates into increased power and improved type I error control in association studies and improved accuracy in phenotypic prediction studies. The presented software package also provides an implementation of the new Weighted Alikeness in State (WAIS) estimator for use in hybrid breeding programs. Besides several matrix manipulation tools, CoCoa implements two different bending heuristics, in case the inverse of an ill-conditioned coancestry matrix estimate is needed. The software package CoCoa is freely available at http://webs.hogent.be/cocoa. Source code, manual, binaries for 32- and 64-bit Linux systems and an installer for Microsoft Windows are provided. The core components of CoCoa are written in C++, while the graphical user interface is written in Java.

  14. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    Full Text Available This paper discusses topics related to automating the parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, before implementing any state estimator, the system should be analyzed for structural observability, and the structural observability analysis can be automated using Modelica and Python. As a result of the structural observability analysis, the system may be decomposed into subsystems where some of them may be observable --- with respect to parameters, disturbances, and states --- while some may not. The state estimation process is carried out for those observable subsystems, and the optimum number of additional measurements is prescribed for unobservable subsystems to make them observable. In this paper, an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway. The copper production process is a large-scale complex system. It is shown how to implement various state estimators, in Python, to estimate parameters and disturbances, in addition to states, based on available measurements.

  15. S-bases as a tool to solve reduction problems for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, A.V.; Smirnov, V.A.

    2006-01-01

    We suggest a mathematical definition of the notion of master integrals and present a brief review of algorithmic methods to solve reduction problems for Feynman integrals based on integration by parts relations. In particular, we discuss a recently suggested reduction algorithm which uses Groebner bases. New results obtained with its help for a family of three-loop Feynman integrals are outlined

  16. S-bases as a tool to solve reduction problems for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, A.V. [Scientific Research Computing Center of Moscow State University, Moscow 119992 (Russian Federation); Smirnov, V.A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-10-15

    We suggest a mathematical definition of the notion of master integrals and present a brief review of algorithmic methods to solve reduction problems for Feynman integrals based on integration by parts relations. In particular, we discuss a recently suggested reduction algorithm which uses Groebner bases. New results obtained with its help for a family of three-loop Feynman integrals are outlined.

  17. Decision support tools to improve the effectiveness of hazardous fuel reduction treatments in the New Jersey Pine Barrens

    Science.gov (United States)

    Kenneth L. Clark; Nicholas Skowronski; John Hom; Matthew Duveneck; Yude Pan; Stephen Van Tuyl; Jason Cole; Matthew Patterson; Stephen Maurer

    2009-01-01

    Our goal is to assist the New Jersey Forest Fire Service and federal wildland fire managers in the New Jersey Pine Barrens evaluate where and when to conduct hazardous fuel reduction treatments. We used remotely sensed LIDAR (Light Detection and Ranging System) data and field sampling to estimate fuel loads and consumption during prescribed fire treatments. This...

  18. Design Tool for Estimating Chemical Hydrogen Storage System Characteristics for Light-Duty Fuel Cell Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Matthew J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sprik, Samuel [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Brooks, Kriston P. [Pacific Northwest National Laboratory; Tamburello, David A. [Savannah River National Laboratory

    2018-04-07

    The U.S. Department of Energy (DOE) developed a vehicle Framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to Technical Targets established by DOE for four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be estimated easily. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates system parameters required to run the storage system model. Additionally, the design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the Framework model. These models will be explained and exercised with the representative hydrogen storage materials exothermic ammonia borane (NH3BH3) and endothermic alane (AlH3).

  19. Design Tool for Estimating Chemical Hydrogen Storage System Characteristics for Light-Duty Fuel Cell Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, Kriston P.; Sprik, Sam; Tamburello, David; Thornton, Matthew

    2018-05-03

    The U.S. Department of Energy (DOE) has developed a vehicle framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to DOE’s Technical Targets using four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework model for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be easily estimated. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates the system parameters required to run the storage system model. Additionally, this design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the framework model and compare them to the DOE Technical Targets. These models will be explained and exercised with existing hydrogen storage materials.

  20. Recov'Heat: An estimation tool of urban waste heat recovery potential in sustainable cities

    Science.gov (United States)

    Goumba, Alain; Chiche, Samuel; Guo, Xiaofeng; Colombert, Morgane; Bonneau, Patricia

    2017-02-01

    Waste heat recovery is considered an efficient way to increase carbon-free green energy utilization and to reduce greenhouse gas emissions. Especially in urban areas, several sources, such as sewage water, industrial processes, and waste incineration plants, are still rarely explored. Their integration into a district heating system providing heating and/or domestic hot water could be beneficial for both energy companies and local governments. EFFICACITY, a French research institute focused on urban energy transition, has developed an estimation tool for the different waste heat sources potentially exploitable in a sustainable city. This article presents the development method of such a decision-making tool which, by giving both an energetic and an economic analysis, helps local communities and energy service companies carry out preliminary studies of heat recovery projects.

  1. Qualitative: Python Tool for MT Quality Estimation Supporting Server Mode and Hybrid MT

    Directory of Open Access Journals (Sweden)

    Avramidis Eleftherios

    2016-10-01

    Full Text Available We present the development contributions of the last two years to our open-source Python Quality Estimation tool, a tool that can function in both experiment mode and online web-service mode. The latest version provides a new MT interface, which communicates with SMT and rule-based translation engines and supports on-the-fly sentence selection. Additionally, we present an improved machine learning interface allowing more efficient communication with several state-of-the-art toolkits. Additions also include a more informative training process, a Python re-implementation of the QuEst baseline features, a new LM toolkit integration, an additional PCFG parser and alignments of syntactic nodes.

  2. Observation-based estimation of aerosol-induced reduction of planetary boundary layer height

    Science.gov (United States)

    Zou, Jun; Sun, Jianning; Ding, Aijun; Wang, Minghuai; Guo, Weidong; Fu, Congbin

    2017-09-01

    Radiative aerosols are known to influence the surface energy budget and hence the evolution of the planetary boundary layer. In this study, we develop a method to estimate the aerosol-induced reduction in the planetary boundary layer height (PBLH) based on two years of ground-based measurements at the Station for Observing Regional Processes of the Earth System (SORPES) at Nanjing University, China, and on radiosonde data from the meteorological station of Nanjing. The observations show that increased aerosol loads lead to a mean decrease of 67.1 W m-2 in downward shortwave radiation (DSR) and a mean increase of 19.2 W m-2 in downward longwave radiation (DLR), as well as a mean decrease of 9.6 W m-2 in the surface sensible heat flux (SHF) in the daytime. The relative variations of DSR, DLR and SHF are shown as a function of the increment of the column mass concentration of particulate matter (PM2.5). High aerosol loading can significantly increase the atmospheric stability in the planetary boundary layer during both daytime and nighttime. Based on the statistical relationship between SHF and PM2.5 column mass concentrations, the SHF under clean atmospheric conditions (the same as on background days) is derived. The derived SHF, together with the observed SHF, is then used to estimate aerosol-related changes in the PBLH. Our results suggest that the PBLH decreases more rapidly with increasing aerosol loading at high aerosol loading. When the daytime mean column mass concentration of PM2.5 reaches 200 mg m-2, the decrease in the PBLH at 1600 LST (local standard time) is about 450 m.

  3. Optimal Wavelength Selection in Ultraviolet Spectroscopy for the Estimation of Toxin Reduction Ratio during Hemodialysis

    Directory of Open Access Journals (Sweden)

    Amir Ghanifar

    2016-06-01

    Full Text Available Introduction: The concentration of substances, including urea, creatinine, and uric acid, can be used as an index to measure toxic uremic solutes in the blood during dialysis and interdialytic intervals. The on-line monitoring of toxin concentration allows for the clearance measurement of some low-molecular-weight solutes at any time during hemodialysis. The aim of this study was to determine the optimal wavelengths for estimating the changes in urea, creatinine, and uric acid in dialysate, using ultraviolet (UV) spectroscopy. Materials and Methods: In this study, nine uremic patients were investigated using on-line spectrophotometry. The on-line absorption measurements (UV radiation) were performed with a spectrophotometer module connected to the fluid outlet of the dialysis machine. Dialysate samples were obtained and analyzed using standard biochemical methods. Optimal wavelengths for both creatinine and uric acid were selected by using a combination of genetic algorithms (GAs), i.e., GA-partial least squares (GA-PLS) and interval partial least squares (iPLS). Results: An artificial neural network (ANN) sensitivity analysis determined the wavelengths of the UV band most suitable for estimating the concentrations of creatinine and uric acid. The two optimal wavelengths were 242 and 252 nm for creatinine, and 295 and 298 nm for uric acid. Conclusion: It can be concluded that the reduction ratio of creatinine and uric acid (dialysis efficiency) could be continuously monitored during hemodialysis by UV spectroscopy. Compared to the conventional method, which is particularly sensitive to the sampling technique and involves post-dialysis blood sampling, iterative measurements throughout the dialysis session can yield more reliable data.

  4. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Directory of Open Access Journals (Sweden)

    Christoph Heesen

    Full Text Available BACKGROUND: Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. OBJECTIVE: The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-time prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. RESULTS: Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. CONCLUSION: While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic

  5. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Science.gov (United States)

    Heesen, Christoph; Gaissmaier, Wolfgang; Nguyen, Franziska; Stellmann, Jan-Patrick; Kasper, Jürgen; Köpke, Sascha; Lederer, Christian; Neuhaus, Anneke; Daumer, Martin

    2013-01-01

    Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-time prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic estimates and clarify its usefulness for patients and physicians

  6. Effective dysphonia detection using feature dimension reduction and kernel density estimation for patients with Parkinson's disease.

    Directory of Open Access Journals (Sweden)

    Shanshan Yang

    Full Text Available Detection of dysphonia is useful for monitoring the progression of phonatory impairment for patients with Parkinson's disease (PD), and also helps assess the disease severity. This paper describes the statistical pattern analysis methods used to study different vocal measurements of sustained phonations. The feature dimension reduction procedure was implemented by using the sequential forward selection (SFS) and kernel principal component analysis (KPCA) methods. Four selected vocal measures were projected by the KPCA onto the bivariate feature space, in which the class-conditional feature densities can be approximated with the nonparametric kernel density estimation technique. In the vocal pattern classification experiments, Fisher's linear discriminant analysis (FLDA) was applied to perform the linear classification of voice records for healthy control subjects and PD patients, and the maximum a posteriori (MAP) decision rule and support vector machine (SVM) with radial basis function kernels were employed for the nonlinear classification tasks. Based on the KPCA-mapped feature densities, the MAP classifier successfully distinguished 91.8% of the voice records, with a sensitivity rate of 0.986, a specificity rate of 0.708, and an area value of 0.94 under the receiver operating characteristic (ROC) curve. The diagnostic performance provided by the MAP classifier was superior to those of the FLDA and SVM classifiers. In addition, the classification results indicated that gender is insensitive to dysphonia detection, and the sustained phonations of PD patients with minimal functional disability are more difficult to be correctly identified.
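
    A minimal sketch (not the paper's code, with synthetic features) of the pipeline named above: project features to a bivariate space with KPCA, estimate class-conditional densities with kernel density estimation, and classify with the MAP rule argmax_c [log p(z|c) + log p(c)].

      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.neighbors import KernelDensity

      rng = np.random.default_rng(2)
      X = np.vstack([rng.normal(0.0, 1.0, (100, 4)),    # synthetic "healthy" voices
                     rng.normal(1.5, 1.0, (100, 4))])   # synthetic "PD" voices
      y = np.array([0] * 100 + [1] * 100)

      Z = KernelPCA(n_components=2, kernel="rbf", gamma=0.1).fit_transform(X)

      # Class-conditional densities p(z|c) and priors p(c):
      kdes = {c: KernelDensity(bandwidth=0.5).fit(Z[y == c]) for c in (0, 1)}
      priors = {c: float(np.mean(y == c)) for c in (0, 1)}

      def map_classify(z):
          """MAP decision rule over the KPCA-mapped feature densities."""
          scores = {c: kdes[c].score_samples(z.reshape(1, -1))[0] + np.log(priors[c])
                    for c in (0, 1)}
          return max(scores, key=scores.get)

      print(sum(map_classify(z) == t for z, t in zip(Z, y)) / len(y))  # training accuracy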

  7. Phase-processing as a tool for speckle reduction in pulse-echo images

    DEFF Research Database (Denmark)

    Healey, AJ; Leeman, S; Forsberg, F

    1991-01-01

    Due to the coherent nature of conventional ultrasound medical imaging systems, interference artefacts occur in pulse-echo images. These artefacts are generically termed 'speckle'. The phenomenon may severely limit low contrast resolution, with clinically relevant information being obscured. Traditional speckle reduction procedures regard speckle correction as a stochastic process and trade image smoothing (resolution loss) for speckle reduction. Recently, a new phase acknowledging technique has been proposed that is unique in its ability to correct for speckle interference with no image...

  8. MURMoT: Design and Application of Microbial Uranium Reduction Monitoring Tools

    Energy Technology Data Exchange (ETDEWEB)

    Pennell, Kurt [Tufts Univ., Medford, MA (United States)

    2014-12-31

    The overarching project goal of the MURMoT project was the design of tools to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-transforming bacteria. To accomplish these objectives, an integrated approach that combined nucleic acid-based tools, proteomic workflows, uranium isotope measurements, and U(IV) speciation and structure analyses using the Advanced Photon Source (APS) at Argonne National Laboratory was developed.

  9. MURMoT: Design and Application of Microbial Uranium Reduction Monitoring Tools

    International Nuclear Information System (INIS)

    Pennell, Kurt

    2014-01-01

    The overarching project goal of the MURMoT project was the design of tools to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-transforming bacteria. To accomplish these objectives, an integrated approach that combined nucleic acid-based tools, proteomic workflows, uranium isotope measurements, and U(IV) speciation and structure analyses using the Advanced Photon Source (APS) at Argonne National Laboratory was developed.

  10. Treatment Cost Analysis Tool (TCAT) for estimating costs of outpatient treatment services.

    Science.gov (United States)

    Flynn, Patrick M; Broome, Kirk M; Beaston-Blaakman, Aaron; Knight, Danica K; Horgan, Constance M; Shepard, Donald S

    2009-02-01

    A Microsoft Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment were $882, $1310, and $1381, respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 in intensive, and $86 in mixed treatment. Group counseling hourly costs per client were $8, $11, and $10, respectively, for regular, intensive, and mixed. Future directions include use of a web-based interview version, much like some of the commercially available tax preparation software tools, and extensions for use in other modalities of treatment.

  11. A correction in the CDM methodological tool for estimating methane emissions from solid waste disposal sites.

    Science.gov (United States)

    Santos, M M O; van Elk, A G P; Romanel, C

    2015-12-01

    Solid waste disposal sites (SWDS) - especially landfills - are a significant source of methane, a greenhouse gas. Although it has the potential to be captured and used as a fuel, most of the methane formed in SWDS is emitted to the atmosphere, mainly in developing countries. Methane emissions have to be estimated in national inventories, and to help this task the Intergovernmental Panel on Climate Change (IPCC) has published three sets of guidelines. In addition, the Kyoto Protocol established the Clean Development Mechanism (CDM) to assist developed countries in offsetting their own greenhouse gas emissions by assisting other countries to achieve sustainable development while reducing emissions. Based on methodologies provided by the IPCC regarding SWDS, the CDM Executive Board has issued a tool to be used by project developers for estimating baseline methane emissions in their project activities - whether burning biogas from landfills or preventing biomass from being landfilled and so avoiding methane emissions. Some inconsistencies in the first two sets of IPCC guidelines have already been pointed out in an annex of the latest IPCC edition, although with the details hidden. The CDM tool uses a model for methane estimation that takes on board parameters, factors and assumptions provided in the latest IPCC guidelines, while using as its core equation the one from the second IPCC edition, with its shortcoming, as well as allowing a misunderstanding of the time variable. Consequences of wrong ex-ante estimation of baseline emissions in CDM project activities can be economic or environmental. An example of the first type is the overestimation of 18% in an actual project on biogas from landfill in Brazil, which harms its developers; of the second type, the overestimation of 35% in a project preventing municipal solid waste from being landfilled in China, which harms the environment, not because of the project per se but because of the unduly generated carbon credits. In a simulated landfill - the same
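
    A minimal sketch (a simplified IPCC-style first-order decay model, not the CDM tool's exact equation, which is what the correction discussed above concerns) of how methane generation is typically estimated: waste W_x deposited in year x contributes k*L0*W_x*exp(-k*(t-x)) to the methane generated in year t. Parameter values are hypothetical.

      import math

      def fod_methane(deposits, k=0.17, L0=0.08):
          """First-order decay: CH4(t) = sum_x k * L0 * W_x * exp(-k*(t-x)).
          deposits[x] = tonnes landfilled in year x; L0 = generation potential
          (t CH4 per t waste); k = decay rate constant (1/yr)."""
          return [sum(k * L0 * w * math.exp(-k * (t - x))
                      for x, w in enumerate(deposits) if x <= t)
                  for t in range(len(deposits))]

      # Five years of constant disposal (10,000 t/yr):
      print([round(m, 1) for m in fod_methane([10_000] * 5)])  # t CH4 per year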

  12. Estimation of snow albedo reduction by light absorbing impurities using Monte Carlo radiative transfer model

    Science.gov (United States)

    Sengupta, D.; Gao, L.; Wilcox, E. M.; Beres, N. D.; Moosmüller, H.; Khlystov, A.

    2017-12-01

    Radiative forcing and climate change depend greatly on the earth's surface albedo and its temporal and spatial variation. The surface albedo varies widely depending on the surface characteristics, ranging from 5-10% for calm ocean waters to 80% for some snow-covered areas. Clean and fresh snow surfaces have the highest albedo and are most sensitive to contamination with light-absorbing impurities, which can greatly reduce surface albedo and change overall radiative forcing estimates. Accurate estimation of snow albedo, as well as understanding of feedbacks on climate from changes in snow-covered areas, is important for radiative forcing, snow energy balance, and predicting seasonal snowmelt and runoff rates. Such information is essential to inform timely decision making by stakeholders and policy makers. Light-absorbing particles deposited onto the snow surface can greatly alter snow albedo and have been identified as a major contributor to regional climate forcing if seasonal snow cover is involved. However, the uncertainty associated with quantification of albedo reduction by these light-absorbing particles is high. Here, we use Mie theory (under the assumption of spherical snow grains) to reconstruct the single scattering parameters of snow (i.e., single scattering albedo ῶ and asymmetry parameter g) from observation-based size distribution information and retrieved refractive index values. The single scattering parameters of impurities are extracted with the same approach from datasets obtained during laboratory combustion of biomass samples. Instead of using plane-parallel approximation methods to account for multiple scattering, we have used the simple "Monte Carlo ray/photon tracing approach" to calculate the snow albedo. This simple approach treats multiple scattering as a "collection" of single scattering events. Using this approach, we vary the effective snow grain size and impurity concentrations to explore the evolution of snow albedo over a wide
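
    A minimal sketch (not the authors' code) of the photon-tracing idea described above: in a semi-infinite, homogeneous snowpack, each scattering event lets the photon survive with probability equal to the single scattering albedo and deflects it through a Henyey-Greenstein angle with asymmetry parameter g (assumed nonzero); the albedo is the fraction of photons escaping upward. Parameter values are illustrative.

      import math, random

      def snow_albedo(ssa, g, n_photons=20000, seed=0):
          """Monte Carlo photon tracing in a semi-infinite snowpack."""
          rnd = random.Random(seed)
          escaped = 0
          for _ in range(n_photons):
              tau, mu = 0.0, 1.0                          # optical depth; downward cosine
              while True:
                  tau += mu * -math.log(rnd.random())     # exponential free path
                  if tau < 0.0:                           # crossed surface going up
                      escaped += 1
                      break
                  if rnd.random() > ssa:                  # absorbed at this event
                      break
                  u = rnd.random()                        # Henyey-Greenstein deflection
                  c = (1 + g * g - ((1 - g * g) / (1 - g + 2 * g * u)) ** 2) / (2 * g)
                  phi = 2 * math.pi * rnd.random()
                  mu = mu * c + math.sqrt(max(0.0, (1 - mu * mu) * (1 - c * c))) * math.cos(phi)
          return escaped / n_photons

      # Clean snow vs. snow darkened by light-absorbing impurities (lower ssa):
      print(snow_albedo(0.9999, 0.89), snow_albedo(0.999, 0.89))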

  13. Precipitation areal-reduction factor estimation using an annual-maxima centered approach

    Science.gov (United States)

    Asquith, W.H.; Famiglietti, J.S.

    2000-01-01

    The adjustment of the precipitation depth of a point storm to an effective (mean) depth over a watershed is important for characterizing rainfall-runoff relations and for cost-effective designs of hydraulic structures when design storms are considered. A design storm is the precipitation point depth having a specified duration and frequency (recurrence interval). Effective depths are often computed by multiplying point depths by areal-reduction factors (ARF). ARF range from 0 to 1; vary according to storm characteristics, such as recurrence interval; and are a function of watershed characteristics, such as watershed size, shape, and geographic location. This paper presents a new approach for estimating ARF and includes applications for the 1-day design storm in Austin, Dallas, and Houston, Texas. The approach, termed 'annual-maxima centered', specifically considers the distribution of concurrent precipitation surrounding an annual-precipitation maximum, which is a feature not seen in other approaches. The approach does not require the prior spatial averaging of precipitation, explicit determination of spatial correlation coefficients, or explicit definition of a representative area of a particular storm in the analysis. The annual-maxima centered approach was designed to exploit the wide availability of dense precipitation gauge data in many regions of the world. The approach produces ARF that decrease more rapidly than those from TP-29. Furthermore, the ARF from the approach decay rapidly with increasing recurrence interval of the annual-precipitation maxima. (C) 2000 Elsevier Science B.V.
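
    A minimal sketch (a one-event simplification of the annual-maxima centered idea, not the paper's full method) with synthetic gauge data: locate an annual precipitation maximum at one gauge, then compare the concurrent depths across the network to that point maximum.

      import numpy as np

      rng = np.random.default_rng(3)
      depths = rng.gamma(0.3, 8.0, size=(365, 12))   # daily depths at 12 gauges (synthetic)

      # Day and gauge of the annual-precipitation maximum:
      day, gauge = np.unravel_index(depths.argmax(), depths.shape)

      # ARF for this event: mean concurrent depth over the area / point maximum
      arf = depths[day].mean() / depths[day, gauge]
      print(round(float(arf), 2))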

  14. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    Science.gov (United States)

    Sirirojvisuth, Apinut

    In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages, since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet there is a lack of a structured methodology that quantifies how changes in design decisions impact these metrics. As a result, a new set of integrated cost analysis tools is proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into existing design methodologies without sacrificing the agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop a software architecture that represents the Integrated Product and Process Development (IPPD) methodology used in several aerospace system designs. The environment seamlessly integrates product and process analysis tools and makes an effective transition from one design phase to the other while retaining knowledge gained a priori. Then, an advanced cost estimating tool called the Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high-fidelity cost tools like process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. Therefore, with this
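
    The weight-based layer of such a tool rests on cost estimating relationships (CERs) of power-law form. The sketch below is hypothetical: the coefficients are placeholders rather than Tailored Cost Model values, and the process-based adjustment factor is invented for illustration.

```python
def weight_based_cost(weight_kg, a=2500.0, b=0.85):
    """Generic weight-based CER: cost = a * weight**b (coefficients are
    illustrative placeholders, not TCM values)."""
    return a * weight_kg ** b

baseline = weight_based_cost(1200.0)   # component weighing 1200 kg
adjusted = baseline * 0.92             # hypothetical process-based correction
print(baseline, adjusted)
```

    Higher-fidelity process- or activity-based estimates then play the role of the 0.92 factor here, refining the weight-based baseline as design knowledge accumulates.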

  15. Binomial Distribution Sample Confidence Intervals Estimation 7. Absolute Risk Reduction and ARR-like Expressions

    Directory of Open Access Journals (Sweden)

    Andrei ACHIMAŞ CADARIU

    2004-08-01

    Assessment of a controlled clinical trial involves interpreting key parameters such as the control event rate, experimental event rate, relative risk, absolute risk reduction, relative risk reduction, and number needed to treat when the effects of the treatment are dichotomous variables. Defined as the difference in the event rate between treatment and control groups, the absolute risk reduction is the parameter that allows computing the number needed to treat. The absolute risk reduction is computed when the experimental treatment reduces the risk of an undesirable outcome/event. In the medical literature, when the absolute risk reduction is reported with its confidence intervals, the method used is the asymptotic one, even though it is well known that it may be inadequate. The aim of this paper is to introduce and assess nine methods of computing confidence intervals for the absolute risk reduction and absolute risk reduction-like functions. Computer implementations of the methods use the PHP language. The comparison of methods uses the experimental errors, the standard deviations, and the deviation relative to the imposed significance level for specified sample sizes. Six methods of computing confidence intervals for the absolute risk reduction and absolute risk reduction-like functions were assessed using random binomial variables and random sample sizes. The experiments show that the ADAC and ADAC1 methods obtain the best overall performance in computing confidence intervals for the absolute risk reduction.
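
    A minimal sketch of the quantities involved, assuming hypothetical trial counts; it implements only the asymptotic (Wald) interval that the paper criticises, since the better-performing ADAC variants are defined in the paper itself.

```python
import math

def arr_wald_ci(events_ctrl, n_ctrl, events_trt, n_trt, z=1.96):
    """Absolute risk reduction (ARR) with the asymptotic (Wald) 95% CI."""
    cer = events_ctrl / n_ctrl          # control event rate
    eer = events_trt / n_trt            # experimental event rate
    arr = cer - eer
    se = math.sqrt(cer * (1 - cer) / n_ctrl + eer * (1 - eer) / n_trt)
    return arr, (arr - z * se, arr + z * se)

arr, ci = arr_wald_ci(30, 100, 18, 100)   # hypothetical counts: ARR = 0.12
print(arr, ci, 1 / arr)                   # number needed to treat = 1/ARR ≈ 8.3
```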

  16. On-Line Flutter Prediction Tool for Wind Tunnel Flutter Testing using Parameter Varying Estimation Methodology, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology, Inc. (ZONA) proposes to develop an on-line flutter prediction tool for wind tunnel models using the parameter varying estimation (PVE) technique to...

  17. Randomized Comparison of Mobile and Web-Tools to Provide Dementia Risk Reduction Education: Use, Engagement and Participant Satisfaction.

    Science.gov (United States)

    O'Connor, Elodie; Farrow, Maree; Hatherly, Chris

    2014-01-01

    Encouraging middle-aged adults to maintain their physical and cognitive health may have a significant impact on reducing the prevalence of dementia in the future. Mobile phone apps and interactive websites may be one effective way to target this age group. However, to date there has been little research investigating the user experience of dementia risk reduction tools delivered in this way. The aim of this study was to explore participant engagement and evaluations of three different targeted smartphone and Web-based dementia risk reduction tools following a four-week intervention. Participants completed a Web-based screening questionnaire to collect eligibility information. Eligible participants were asked to complete a Web-based baseline questionnaire and were then randomly assigned to use one of the three dementia risk reduction tools for a period of four weeks: (1) a mobile phone application; (2) an information-based website; and (3) an interactive website. User evaluations were obtained via a Web-based follow-up questionnaire after completion of the intervention. Of 415 eligible participants, 370 (89.16%) completed the baseline questionnaire and were assigned to an intervention group; 200 (54.05%) completed the post-intervention questionnaire. The average age of participants was 52 years, and 149 (75%) were female. Findings indicated that participants from all three intervention groups reported a generally positive impression of the tools across a range of domains. Participants using the information-based website reported higher ratings of their overall impression of the tool, F(2,191)=4.12, P=.02; how interesting the information was, F(2,189)=3.53, P=.03; how helpful the information was, F(2,192)=4.15, P=.02; and how much they learned, F(2,188)=3.86, P=.02. Group differences were significant between the mobile phone app and information-based website users, but not between the interactive website users and the other two groups. Additionally, participants using the

  18. Randomized Comparison of Mobile and Web-Tools to Provide Dementia Risk Reduction Education: Use, Engagement and Participant Satisfaction

    Science.gov (United States)

    O'Connor, Elodie; Hatherly, Chris

    2014-01-01

    Background Encouraging middle-aged adults to maintain their physical and cognitive health may have a significant impact on reducing the prevalence of dementia in the future. Mobile phone apps and interactive websites may be one effective way to target this age group. However, to date there has been little research investigating the user experience of dementia risk reduction tools delivered in this way. Objective The aim of this study was to explore participant engagement and evaluations of three different targeted smartphone and Web-based dementia risk reduction tools following a four-week intervention. Methods Participants completed a Web-based screening questionnaire to collect eligibility information. Eligible participants were asked to complete a Web-based baseline questionnaire and were then randomly assigned to use one of the three dementia risk reduction tools for a period of four weeks: (1) a mobile phone application; (2) an information-based website; and (3) an interactive website. User evaluations were obtained via a Web-based follow-up questionnaire after completion of the intervention. Results Of 415 eligible participants, 370 (89.16%) completed the baseline questionnaire and were assigned to an intervention group; 200 (54.05%) completed the post-intervention questionnaire. The average age of participants was 52 years, and 149 (75%) were female. Findings indicated that participants from all three intervention groups reported a generally positive impression of the tools across a range of domains. Participants using the information-based website reported higher ratings of their overall impression of the tool, F(2,191)=4.12, P=.02; how interesting the information was, F(2,189)=3.53, P=.03; how helpful the information was, F(2,192)=4.15, P=.02; and how much they learned, F(2,188)=3.86, P=.02. Group differences were significant between the mobile phone app and information-based website users, but not between the interactive website users and the other two groups

  19. Large biases in regression-based constituent flux estimates: causes and diagnostic tools

    Science.gov (United States)

    Hirsch, Robert M.

    2014-01-01

    It has been documented in the literature that, in some cases, widely used regression-based models can produce severely biased estimates of long-term mean river fluxes of various constituents. These models, estimated using sample values of concentration, discharge, and date, are used to compute estimated fluxes for a multiyear period at a daily time step. This study compares results of the LOADEST seven-parameter model, the LOADEST five-parameter model, and the Weighted Regressions on Time, Discharge, and Season (WRTDS) model using subsampling of six very large datasets to better understand this bias problem. This analysis considers sample datasets for dissolved nitrate and total phosphorus. The results show that LOADEST-7 and LOADEST-5, although they often produce very nearly unbiased results, can produce highly biased results. This study identifies three conditions that can give rise to these severe biases: (1) lack of fit of the log of concentration vs. log discharge relationship, (2) substantial differences in the shape of this relationship across seasons, and (3) severely heteroscedastic residuals. The WRTDS model is more resistant to the bias problem than the LOADEST models but is not immune to it. Understanding the causes of the bias problem is crucial to selecting an appropriate method for flux computations. Diagnostic tools for identifying the potential for bias problems are introduced, and strategies for resolving bias problems are described.
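
    One classic mechanism behind such biases is retransformation: exponentiating an unbiased log-space prediction underestimates the arithmetic mean. The synthetic sketch below reproduces only that effect (LOADEST itself applies bias-correction factors; the severe biases documented in the paper arise when the fitted model is misspecified, e.g. through lack of fit or heteroscedastic residuals).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic log-log concentration-discharge relation with lognormal scatter.
sigma = 0.8
log_q = rng.normal(0.0, 1.0, 5000)
log_c = 0.5 * log_q + rng.normal(0.0, sigma, 5000)

coef = np.polyfit(log_q, log_c, 1)                  # regression in log space
naive = np.exp(np.polyval(coef, log_q)).mean()      # biased back-transformation
corrected = naive * np.exp(sigma ** 2 / 2.0)        # lognormal bias correction
print(naive, corrected, np.exp(log_c).mean())       # corrected ~ true mean
```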

  20. Reduction of potassium content of green bean pods and chard by culinary processing. Tools for chronic kidney disease.

    Science.gov (United States)

    Martínez-Pineda, Montserrat; Yagüe-Ruiz, Cristina; Caverni-Muñoz, Alberto; Vercet-Tormo, Antonio

    2016-01-01

    In order to prevent possible hyperkalemia, chronic renal patients, especially in advanced stages, must follow a low-potassium diet. Dietary guidelines for chronic kidney disease therefore recommend limiting the consumption of many vegetables, as well as applying laborious culinary techniques to maximize the reduction of potassium. The aim of this work is to analyze the potassium content of several vegetables (fresh, frozen and preserved products), as well as to check and compare the effectiveness in potassium reduction of different culinary processes, some of them recommended in dietary guidelines, such as soaking or double cooking. Sample potassium content was analyzed in triplicate using flame photometry. The results showed significant reductions in potassium content in all culinary processes studied. The degree of loss varied depending on the type of vegetable and the processing applied. Frozen products achieved greater reductions than fresh ones, in some cases with losses greater than 90%. In addition, it was observed that in many cases the application of normal cooking alone reduced potassium to levels acceptable for inclusion in the renal patient's diet. The results shown in this study are very positive because they provide tools for the professionals who deal with these patients, allowing them to adapt more easily to the needs and preferences of their patients and to increase dietary variety. Copyright © 2016 Sociedad Española de Nefrología. Published by Elsevier España, S.L.U. All rights reserved.

  1. Children's estimates of food portion size: the development and evaluation of three portion size assessment tools for use with children.

    Science.gov (United States)

    Foster, E; Matthews, J N S; Lloyd, J; Marshall, L; Mathers, J C; Nelson, M; Barton, K L; Wrieden, W L; Cornelissen, P; Harris, J; Adamson, A J

    2008-01-01

    A number of methods have been developed to assist subjects in providing an estimate of portion size, but their application in improving portion size estimation by children has not been investigated systematically. The aim was to develop portion size assessment tools for use with children and to assess the accuracy of children's estimates of portion size using the tools. The tools were food photographs, food models and an interactive portion size assessment system (IPSAS). Children (n=201), aged 4-16 years, were supplied with known quantities of food to eat, in school. Food leftovers were weighed. Children estimated the amount of each food using each tool, 24 h after consuming the food. The age-specific portion sizes represented were based on portion sizes consumed by children in a national survey. Significant differences were found between the accuracy of estimates using the three tools. Children of all ages performed well using the IPSAS and food photographs. The accuracy and precision of estimates made using the food models were poor. For all tools, estimates of the amount of food served were more accurate than estimates of the amount consumed. Issues relating to the reporting of leftover foods, which impact estimates of the amounts of foods actually consumed, require further study. The IPSAS has shown potential for the assessment of dietary intake with children. Before practical application in the assessment of children's dietary intake, the tool would need to be expanded to cover a wider range of foods and to be validated in a 'real-life' situation.

  2. A Unified tool to estimate Distances, Ages, and Masses (UniDAM) from spectrophotometric data

    Science.gov (United States)

    Mints, Alexey; Hekker, Saskia

    2017-08-01

    Context. Galactic archaeology, the study of the formation and evolution of the Milky Way by reconstructing its past from its current constituents, requires precise and accurate knowledge of stellar parameters for as many stars as possible. To achieve this, a number of large spectroscopic surveys have been undertaken and are still ongoing. Aims: So far consortia carrying out the different spectroscopic surveys have used different tools to determine stellar parameters of stars from their derived effective temperatures (Teff), surface gravities (log g), and metallicities ([Fe/H]); the parameters can be combined with photometric, astrometric, interferometric, or asteroseismic information. Here we aim to homogenise the stellar characterisation by applying a unified tool to a large set of publicly available spectrophotometric data. Methods: We used spectroscopic data from a variety of large surveys combined with infrared photometry from 2MASS and AllWISE and compared these in a Bayesian manner with PARSEC isochrones to derive probability density functions (PDFs) for stellar masses, ages, and distances. We treated PDFs of pre-helium-core burning, helium-core burning, and post-helium-core burning solutions as well as different peaks in multimodal PDFs (i.e., each unimodal sub-PDF) of the different evolutionary phases separately. Results: For over 2.5 million stars we report mass, age, and distance estimates for each evolutionary phase and unimodal sub-PDF. We report Gaussian, skewed Gaussian, truncated Gaussian, modified truncated exponential, or truncated Student's t-distribution functions to represent each sub-PDF, allowing us to reconstruct detailed PDFs. Comparisons with stellar parameter estimates from the literature show good agreement within uncertainties. Conclusions: We present UniDAM, the unified tool applicable to spectrophotometric data of different surveys, to obtain a homogenised set of stellar parameters. The unified tool and the tables with
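
    The Bayesian comparison step can be sketched as likelihood-weighting of model grid points. The three-row "grid" below is a made-up stand-in for PARSEC isochrones and a flat prior is assumed; UniDAM additionally separates evolutionary phases and fits parametric shapes to each sub-PDF.

```python
import numpy as np

# Toy model grid: Teff [K], log g, [Fe/H], mass [Msun], age [Gyr] (values invented).
grid = np.array([[5700, 4.40,  0.0, 1.00, 4.5],
                 [5800, 4.40,  0.0, 1.02, 4.0],
                 [5900, 4.30, -0.1, 1.05, 3.5]])
obs = np.array([5777, 4.44, 0.0])      # observed spectroscopic parameters
err = np.array([80, 0.10, 0.10])       # observational uncertainties

chi2 = (((grid[:, :3] - obs) / err) ** 2).sum(axis=1)
w = np.exp(-0.5 * chi2)
w /= w.sum()                            # posterior weights over grid points
print((w * grid[:, 3]).sum())           # posterior mean mass
print((w * grid[:, 4]).sum())           # posterior mean age
```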

  3. RealCalc: a real time Java calculation tool. Application to HVSR estimation

    Science.gov (United States)

    Hloupis, G.; Vallianatos, F.

    2009-04-01

    The Java computation platform is not a newcomer to the seismology field. It is mainly used for applications concerned with collecting, requesting, spreading and visualizing seismological data, because it is productive, safe and has low maintenance costs. Although it has very attractive characteristics for engineers, Java has not been used frequently in real-time applications where predictability and reliability are required as a reaction to real-world events. The main reasons for this are the absence of priority support (such as priority-ceiling or priority-inheritance protocols to counter priority inversion) and the use of automated memory management (the garbage collector). To overcome these problems a number of extensions have been proposed, with the Real Time Specification for Java (RTSJ) being the most promising and most used one. In the current study we used the RTSJ to build an application that receives data continuously and provides estimations in real time. The application consists of four main modules: incoming data, preprocessing, estimation and publication. As an application example we present real-time HVSR estimation. Microtremor recordings are collected continuously by the incoming-data module. The preprocessing module consists of a wavelet-based window selector tool, applied to the incoming data stream in order to derive its most stationary parts. The estimation module provides all the necessary calculations according to user specifications. Finally, the publication module, besides presenting results, also calculates attributes and relevant statistics for each site (temporal variations, HVSR stability). Acknowledgements This work is partially supported by the Greek General Secretariat of Research and Technology in the frame of Crete Regional Project 2000-2006 (M1.2): "TALOS: An integrated system of seismic hazard monitoring and management in the front of the Hellenic Arc", CRETE PEP7 (KP_7).
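
    The record does not show RealCalc's internals, so here is a language-neutral sketch of the core estimation step in Python rather than RTSJ Java: the horizontal-to-vertical spectral ratio under the usual convention of merging the two horizontal components geometrically. Windowing, stationarity selection and smoothing are deliberately reduced to a single taper.

```python
import numpy as np

def hvsr(north, east, vertical, fs):
    """HVSR estimate for one microtremor window: merged horizontal amplitude
    spectrum divided by the vertical one."""
    taper = np.hanning(len(north))
    spec = lambda x: np.abs(np.fft.rfft(x * taper))
    h = np.sqrt(spec(north) * spec(east))      # geometric mean of horizontals
    v = spec(vertical)
    freqs = np.fft.rfftfreq(len(north), d=1.0 / fs)
    return freqs[1:], h[1:] / v[1:]            # drop the DC bin

# Synthetic 60 s three-component window sampled at 100 Hz.
rng = np.random.default_rng(0)
n, e, z = rng.normal(size=(3, 6000))
f, ratio = hvsr(n, e, z, fs=100.0)
```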

  4. An innovative multivariate tool for fuel consumption and costs estimation of agricultural operations

    Directory of Open Access Journals (Sweden)

    Mirko Guerrieri

    2016-12-01

    The estimation of the operating costs of agricultural and forestry machinery is a key factor in both planning agricultural policies and farm management. Few works have tried to estimate operating costs, and the resulting models are normally based on deterministic approaches. Conversely, in a statistical model randomness is present and variable states are described not by unique values, but rather by probability distributions. In this study, for the first time, a multivariate statistical model based on Partial Least Squares (PLS) was adopted to predict the fuel consumption and costs of six agricultural operations: ploughing, harrowing, fertilization, sowing, weed control and shredding. The prediction was conducted in two steps: first, a few initially selected parameters (time per surface-area unit, maximum engine power, purchase price of the tractor and purchase price of the operating machinery) were used to estimate the fuel consumption; then the predicted fuel consumption, together with the initial parameters, was used to estimate the operational costs. Since the obtained models were based on a very heterogeneous input dataset, they proved to be extremely efficient and thus generalizable and robust. In detail, the results show test predictions with r always ≥ 0.91. Thus, the approach may prove extremely useful both for farmers (in terms of economic advantages) and at the institutional level (representing an innovative and efficient tool for planning future Rural Development Programmes and the Common Agricultural Policy). In light of these advantages, the proposed approach could also be implemented on a web platform and made available to all stakeholders.
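
    A minimal two-step sketch using scikit-learn's PLS implementation on synthetic data; the four input columns mirror the parameters named in the abstract, but every value and coefficient is invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# Columns: time per surface-area unit [h/ha], max engine power [kW],
# tractor purchase price [EUR], operating machinery purchase price [EUR].
X = rng.uniform([0.5, 50, 30e3, 5e3], [3.0, 200, 150e3, 40e3], (120, 4))
fuel = 0.04 * X[:, 0] * X[:, 1] + rng.normal(0, 0.5, 120)        # synthetic [l/ha]
cost = 1.2 * fuel + 1e-4 * (X[:, 2] + X[:, 3]) + rng.normal(0, 1, 120)

pls_fuel = PLSRegression(n_components=2).fit(X, fuel)            # step 1: fuel
X2 = np.column_stack([X, pls_fuel.predict(X)])                   # add predicted fuel
pls_cost = PLSRegression(n_components=3).fit(X2, cost)           # step 2: costs
print(np.corrcoef(cost, pls_cost.predict(X2).ravel())[0, 1])     # fit quality r
```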

  5. An innovative multivariate tool for fuel consumption and costs estimation of agricultural operations

    International Nuclear Information System (INIS)

    Guerrieri, M.; Fedrizzi, M.; Antonucci, F.; Pallottino, F.; Sperandio, G.; Pagano, M.; Figorilli, S.; Menesatti, P.; Costa, C.

    2016-01-01

    The estimation of the operating costs of agricultural and forestry machinery is a key factor in both planning agricultural policies and farm management. Few works have tried to estimate operating costs, and the resulting models are normally based on deterministic approaches. Conversely, in a statistical model randomness is present and variable states are described not by unique values, but rather by probability distributions. In this study, for the first time, a multivariate statistical model based on Partial Least Squares (PLS) was adopted to predict the fuel consumption and costs of six agricultural operations: ploughing, harrowing, fertilization, sowing, weed control and shredding. The prediction was conducted in two steps: first, a few initially selected parameters (time per surface-area unit, maximum engine power, purchase price of the tractor and purchase price of the operating machinery) were used to estimate the fuel consumption; then the predicted fuel consumption, together with the initial parameters, was used to estimate the operational costs. Since the obtained models were based on a very heterogeneous input dataset, they proved to be extremely efficient and thus generalizable and robust. In detail, the results show test predictions with r always ≥ 0.91. Thus, the approach may prove extremely useful both for farmers (in terms of economic advantages) and at the institutional level (representing an innovative and efficient tool for planning future Rural Development Programmes and the Common Agricultural Policy). In light of these advantages, the proposed approach could also be implemented on a web platform and made available to all stakeholders.

  6. An innovative multivariate tool for fuel consumption and costs estimation of agricultural operations

    Energy Technology Data Exchange (ETDEWEB)

    Guerrieri, M.; Fedrizzi, M.; Antonucci, F.; Pallottino, F.; Sperandio, G.; Pagano, M.; Figorilli, S.; Menesatti, P.; Costa, C.

    2016-07-01

    The estimation of the operating costs of agricultural and forestry machinery is a key factor in both planning agricultural policies and farm management. Few works have tried to estimate operating costs, and the resulting models are normally based on deterministic approaches. Conversely, in a statistical model randomness is present and variable states are described not by unique values, but rather by probability distributions. In this study, for the first time, a multivariate statistical model based on Partial Least Squares (PLS) was adopted to predict the fuel consumption and costs of six agricultural operations: ploughing, harrowing, fertilization, sowing, weed control and shredding. The prediction was conducted in two steps: first, a few initially selected parameters (time per surface-area unit, maximum engine power, purchase price of the tractor and purchase price of the operating machinery) were used to estimate the fuel consumption; then the predicted fuel consumption, together with the initial parameters, was used to estimate the operational costs. Since the obtained models were based on a very heterogeneous input dataset, they proved to be extremely efficient and thus generalizable and robust. In detail, the results show test predictions with r always ≥ 0.91. Thus, the approach may prove extremely useful both for farmers (in terms of economic advantages) and at the institutional level (representing an innovative and efficient tool for planning future Rural Development Programmes and the Common Agricultural Policy). In light of these advantages, the proposed approach could also be implemented on a web platform and made available to all stakeholders.

  7. Estimating CO₂ Emission Reduction of Non-capture CO₂ Utilization (NCCU) Technology

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ji Hyun; Lee, Dong Woog; Gyu, Jang Se; Kwak, No-Sang; Lee, In Young; Jang, Kyung Ryoung; Shim, Jae-Goo [KEPCO Research Institute, Daejon (Korea, Republic of); Choi, Jong Shin [Korea East-West Power Co., LTD(ETP), Ulsan (Korea, Republic of)

    2015-10-15

    The potential CO₂ emission reduction of non-capture CO₂ utilization (NCCU) technology was evaluated. NCCU is a technology that produces sodium bicarbonate through the carbonation reaction of CO₂ contained in flue gas. To estimate the CO₂ emission reduction, a process simulation (PRO/II) of a chemical plant able to handle 100 tons of CO₂ per day was performed. For the estimation of the indirect CO₂ reduction, the Solvay process, a conventional technology for the production of sodium carbonate/sodium bicarbonate, was also studied. The results of the analysis showed that in the case of the Solvay process, overall CO₂ emission was estimated at 48,862 tons per year based on the energy consumption for the production of NaHCO₃ (7.4 GJ/tNaHCO₃). For the NCCU technology, the direct CO₂ reduction through CO₂ carbonation was estimated at 36,500 tons per year and the indirect CO₂ reduction through the lower energy consumption at 46,885 tons per year, giving 83,385 tons per year in total. From these results, it can be concluded that sodium bicarbonate production through the carbonation reaction of CO₂ contained in flue gas is energy efficient and could be one of the promising technologies for low CO₂ emission.
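
    The reported total follows directly from the two components, using the abstract's own figures:

```python
direct = 36_500      # t CO2/yr fixed directly by carbonation of flue-gas CO2
indirect = 46_885    # t CO2/yr avoided through lower energy consumption
print(direct + indirect)   # 83385, matching the reported total
```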

  8. Single Tree Vegetation Depth Estimation Tool for Satellite Services Link Design

    Directory of Open Access Journals (Sweden)

    Z. Hasirci

    2016-04-01

    Attenuation caused by tree shadowing is an important factor in describing the propagation channel of satellite services; thus, vegetation effects should be determined by experimental studies or empirical formulations. In this study, tree types in the Black Sea Region of Turkey are classified based on their geometrical shapes into four groups: conic, ellipsoid, spherical and hemispherical. The variations of the vegetation depth according to different tree shapes are calculated with a ray tracing method. It is shown that different geometrical shapes have different vegetation depths, even if they have the same foliage volume, for different elevation angles. The proposed method is validated against the related literature in terms of average single tree attenuation. On the other hand, to reduce the system requirements (speed, memory usage, etc.) of the ray tracing method, an artificial neural network is proposed as an alternative. A graphical user interface for the above processes is created in the MATLAB environment, named the vegetation depth estimation tool (VdET).
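
    The claim that equal-volume crowns of different shapes give different vegetation depths is easy to verify geometrically for the simplest ray, the one passing through the crown centre; the crown dimensions below are invented.

```python
import numpy as np

def through_centre_depth(a, c, elevation_deg):
    """Chord length [m] of a ray through the centre of a spheroidal crown
    with horizontal semi-axis a and vertical semi-axis c."""
    th = np.radians(elevation_deg)
    return 2.0 / np.sqrt((np.cos(th) / a) ** 2 + (np.sin(th) / c) ** 2)

# Equal-volume crowns: sphere (r = 25**(1/3) m) vs. ellipsoid (a = 2.5, c = 4.0 m).
r = 25.0 ** (1.0 / 3.0)
for el in (10, 30, 60):
    print(el, through_centre_depth(r, r, el), through_centre_depth(2.5, 4.0, el))
# The sphere's depth is constant (2r); the ellipsoid's grows with elevation angle.
```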

  9. ResilSIM—A Decision Support Tool for Estimating Resilience of Urban Systems

    Directory of Open Access Journals (Sweden)

    Sarah Irwin

    2016-09-01

    Damages to urban systems as a result of water-related natural disasters have escalated in recent years. The observed trend is expected to increase in the future as the impacts of population growth, rapid urbanization and climate change persist. To alleviate the damages associated with these impacts, it is recommended to integrate disaster management methods into planning, design and operational policies under all levels of government. This manuscript proposes the concept of ResilSIM: a decision support tool that rapidly estimates the resilience (a modern disaster management measure that is dynamic in time and space) of an urban system to the consequences of natural disasters. The web-based tool (with mobile access) operates in near real-time. It is designed to assist decision makers in selecting the best options for integrating adaptive capacity into their communities to protect against the negative impacts of a hazard. ResilSIM is developed for application in Toronto and London, Ontario, Canada; however, it is only demonstrated for use in the city of London, which is susceptible to riverine flooding. It is observed how the incorporation of different combinations of adaptation options maintains or strengthens London's basic structures and functions in the event of a flood.

  10. Application of the Streamflow Prediction Tool to Estimate Sediment Dredging Volumes in Texas Coastal Waterways

    Science.gov (United States)

    Yeates, E.; Dreaper, G.; Afshari, S.; Tavakoly, A. A.

    2017-12-01

    Over the past six fiscal years, the United States Army Corps of Engineers (USACE) has contracted an average of about a billion dollars per year for navigation channel dredging. To execute these funds effectively, USACE Districts must determine which navigation channels need to be dredged in a given year. Improving this prioritization process results in more efficient waterway maintenance. This study uses the Streamflow Prediction Tool, a runoff routing model based on global weather forecast ensembles, to estimate dredged volumes. This study establishes regional linear relationships between cumulative flow and dredged volumes over a long-term simulation covering 30 years (1985-2015), using drainage area and shoaling parameters. The study framework integrates the National Hydrography Dataset (NHDPlus Dataset) with parameters from the Corps Shoaling Analysis Tool (CSAT) and dredging record data from USACE District records. Results in the test cases of the Houston Ship Channel and the Sabine and Port Arthur Harbor waterways in Texas indicate positive correlation between the simulated streamflows and actual dredging records.

  11. Heuristic and probabilistic wind power availability estimation procedures: Improved tools for technology and site selection

    Energy Technology Data Exchange (ETDEWEB)

    Nigim, K.A. [University of Waterloo, Waterloo, Ont. (Canada). Department of Electrical and Computer Engineering; Parker, Paul [University of Waterloo, Waterloo, Ont. (Canada). Department of Geography, Environmental Studies

    2007-04-15

    The paper describes two investigative procedures to estimate wind power from measured wind velocities. Wind velocity data are manipulated to visualize the site potential by investigating the probable wind power availability and its capacity to meet a targeted demand. The first procedure is an availability procedure that looks at the wind characteristics and their probable energy-capturing profile. This profile enables the probable maximum operating wind velocity profile for a selected wind turbine design to be predicted. The structured procedures allow for a consequent adjustment, sorting and grouping of the measured wind velocity data taken at different time intervals and hub heights. The second procedure is the adequacy procedure, which investigates the probable degree of availability and the application consequences. Both procedures are programmed using MathCAD symbolic mathematical software. The math tool is used to generate a visual interpolation of the data as well as numerical results from extensive data sets that exceed the capacity of conventional spreadsheet tools. Two sites located in Southern Ontario, Canada are investigated using the procedures. Successful implementation of the procedures supports informed decision making: a hill site is shown to have much higher wind potential than that measured at the local airport. The process is suitable for a wide spectrum of users who are considering the energy potential of either a grid-tied or off-grid wind energy system. (author)
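
    The availability idea, applying a turbine's operating envelope to the measured velocity record and averaging, can be sketched as follows; the idealised power curve and all turbine parameters are illustrative, not taken from the paper.

```python
import numpy as np

def expected_power(wind_speeds, cut_in=3.5, rated=12.0, cut_out=25.0, p_rated=2000.0):
    """Average power [kW] over a measured wind velocity record under an
    idealised turbine power curve (cubic ramp between cut-in and rated)."""
    v = np.asarray(wind_speeds, dtype=float)
    ramp = p_rated * ((v - cut_in) / (rated - cut_in)) ** 3
    p = np.where((v >= cut_in) & (v < rated), ramp, 0.0)
    p = np.where((v >= rated) & (v < cut_out), p_rated, p)
    return p.mean()

# Hypothetical record: one year of hourly Weibull-distributed hub-height speeds.
rng = np.random.default_rng(0)
speeds = 8.0 * rng.weibull(2.0, 8760)
print(expected_power(speeds))   # probable long-run average power [kW]
```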

  12. A Useful Tool for Atmospheric Correction and Surface Temperature Estimation of Landsat Infrared Thermal Data

    Science.gov (United States)

    Rivalland, Vincent; Tardy, Benjamin; Huc, Mireille; Hagolle, Olivier; Marcq, Sébastien; Boulet, Gilles

    2016-04-01

    Land Surface Temperature (LST) is a critical variable for studying the energy and water budgets at the Earth surface, and is a key component of many aspects of climate research and services. The Landsat program, jointly carried out by NASA and USGS, has been providing thermal infrared data for 40 years, but no associated LST product has yet been routinely provided to the community. To derive LST values, radiances measured at sensor level need to be corrected for the atmospheric absorption, the atmospheric emission and the surface emissivity effect. Until now, existing LST products have been generated with multi-channel methods such as the Temperature/Emissivity Separation (TES) adapted to ASTER data or the generalized split-window algorithm adapted to MODIS multispectral data. Those approaches are ill-adapted to the single-channel specificity of Landsat thermal data. The atmospheric correction methodology usually used for Landsat data requires detailed information about the state of the atmosphere. This information may be obtained from radio-sounding or model atmospheric reanalysis and is supplied to a radiative transfer model in order to estimate atmospheric parameters for a given coordinate. In this work, we present a new automatic tool dedicated to Landsat thermal data correction which improves the common atmospheric correction methodology by introducing the spatial dimension into the process. The Python tool developed during this study, named LANDARTs for LANDsat Automatic Retrieval of surface Temperature, is fully automatic and provides atmospheric corrections for a whole Landsat tile. Vertical atmospheric conditions are downloaded from ECMWF's ERA-Interim dataset, which provides them at 0.125° resolution, at a global scale and with a 6-hour time step. The atmospheric correction parameters are estimated on the atmospheric grid using the commercial software MODTRAN, then interpolated to 30 m resolution. We detail the processing steps
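
    The correction the tool automates can be summarised in one radiative transfer identity, L_sensor = τ·ε·B(T) + L_up + τ·(1-ε)·L_down, inverted for B(T) and then for T via the Planck function. The sketch below uses illustrative values for the parameters that LANDARTs derives from ERA-Interim and MODTRAN.

```python
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23
lam = 10.9e-6                   # effective wavelength of a Landsat TIR band [m]

def planck_inv(B):
    """Temperature [K] from spectral radiance B [W m-2 sr-1 m-1] at lam."""
    return (h * c / (lam * k)) / np.log(1.0 + 2 * h * c**2 / (lam**5 * B))

L_sensor = 9.0e6                # at-sensor radiance (illustrative)
tau, L_up, L_down = 0.85, 0.8e6, 1.3e6   # atmospheric parameters (illustrative)
eps = 0.97                      # surface emissivity

B_surf = (L_sensor - L_up - tau * (1 - eps) * L_down) / (tau * eps)
print(planck_inv(B_surf))       # land surface temperature, ~302 K here
```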

  13. Examination of an indicative tool for rapidly estimating viable organism abundance in ballast water

    Science.gov (United States)

    Vanden Byllaardt, Julie; Adams, Jennifer K.; Casas-Monroy, Oscar; Bailey, Sarah A.

    2018-03-01

    Regulatory discharge standards stipulating a maximum allowable number of viable organisms in ballast water have led to a need for rapid, easy and accurate compliance assessment tools and protocols. Some potential tools presume that organisms present in ballast water samples display the same characteristics of life as the native community (e.g. rates of fluorescence). This presumption may not prove true, particularly when ships' ballast tanks present a harsh environment and long transit times, negatively impacting organism health. Here, we test the accuracy of a handheld pulse amplitude modulated (PAM) fluorometer, the Hach BW680, for detecting photosynthetic protists at concentrations above or below the discharge standard (< 10 cells·ml⁻¹) in comparison to microscopic counts using fluorescein diacetate as a viability probe. Testing was conducted on serial dilutions of freshwater harbour samples in the lab and in situ untreated ballast water samples originating from marine, freshwater and brackish sources utilizing three preprocessing techniques to target organisms in the size range of ≥ 10 and < 50 μm. The BW680 numeric estimates were in agreement with microscopic counts when analyzing freshly collected harbour water at all but the lowest concentrations (< 38 cells·ml⁻¹). Chi-square tests determined that error is not independent of preprocessing methods: using the filtrate method or unfiltered water, in addition to refining the conversion factor of raw fluorescence to cell size, can decrease the grey area where exceedance of the discharge standard cannot be measured with certainty (at least for the studied populations). When examining in situ ballast water, the BW680 detected significantly fewer viable organisms than microscopy, possibly due to factors such as organism size or ballast water age. Assuming both the BW680 and microscopy with FDA stain were measuring fluorescence and enzymatic activity/membrane integrity correctly, the observed discrepancy

  14. ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures

    Science.gov (United States)

    Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2008-08-01

    This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectral analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to line databases and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows, and to deploy and run them on the ESAC Grid. RISA makes full use of the interoperability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can be used directly by general VO tools.

  15. An estimate of the cost of burnout on early retirement and reduction in clinical hours of practicing physicians in Canada

    Science.gov (United States)

    2014-01-01

    Background Interest in the impact of burnout on physicians has been growing because of the possible burden this may have on health care systems. The objective of this study is to estimate the cost of burnout on early retirement and reduction in clinical hours of practicing physicians in Canada. Methods Using an economic model, the costs related to early retirement and reduction in clinical hours of physicians were compared for those who were experiencing burnout against a scenario in which they did not experience burnout. The January 2012 Canadian Medical Association Masterfile was used to determine the number of practicing physicians. Transition probabilities were estimated using 2007–2008 Canadian Physician Health Survey and 2007 National Physician Survey data. Adjustments were also applied to outcome estimates based on ratio of actual to planned retirement and reduction in clinical hours. Results The total cost of burnout for all physicians practicing in Canada is estimated to be $213.1 million ($185.2 million due to early retirement and $27.9 million due to reduced clinical hours). Family physicians accounted for 58.8% of the burnout costs, followed by surgeons for 24.6% and other specialists for 16.6%. Conclusion The cost of burnout associated with early retirement and reduction in clinical hours is substantial and a significant proportion of practicing physicians experience symptoms of burnout. As health systems struggle with human resource shortages and expanding waiting times, this estimate sheds light on the extent to which the burden could be potentially decreased through prevention and promotion activities to address burnout among physicians. PMID:24927847

  16. Continuous improvement process and waste reduction through a QFD tool: the case of a metallurgic plant

    Directory of Open Access Journals (Sweden)

    Leoni Pentiado Godoy

    2013-05-01

    This paper proposes the use of QFD for the continuous improvement of production processes and for waste reduction actions. To collect the information, we used simple observation and a questionnaire with closed questions applied to employees, representing 88.75% of the population working in the production processes of a metal-mechanic industry located in Rio Grande do Sul. QFD is an effective method of quality planning because it provides a diagnosis that underpins the definition of improvement actions aimed at combating waste. Actions were defined to improve communication between sectors, enabling the delivery of products with specifications that meet customer requirements, on time and in the right amounts, at minimum cost and with the satisfaction of those involved with the company. The implementation of these actions reduces waste, minimizes extra work, maximizes effective labor and increases profitability.

  17. Organohalide Respiring Bacteria and Reductive Dehalogenases: Key Tools in Organohalide Bioremediation

    Directory of Open Access Journals (Sweden)

    Bat-Erdene Jugder

    2016-03-01

    Organohalides are recalcitrant pollutants that have been responsible for substantial contamination of soils and groundwater. Organohalide-respiring bacteria (ORB) provide a potential solution to remediate contaminated sites, through their ability to use organohalides as terminal electron acceptors to yield energy for growth (i.e. organohalide respiration). Ideally, this process results in non- or lesser-halogenated compounds that are mostly less toxic to the environment or more easily degraded. At the heart of these processes are reductive dehalogenases (RDases), which are membrane-bound enzymes coupled with other components that facilitate dehalogenation of organohalides to generate cellular energy. This review focuses on RDases, concentrating on those which have been purified (partially or wholly) and functionally characterized. Further, the paper reviews the major bacteria involved in organohalide breakdown and the evidence for microbial evolution of RDases. Finally, the capacity for using ORB in bioremediation and bioaugmentation is discussed.

  18. Cancellation of elective surgeries in a Brazilian public hospital: reasons and estimated reduction.

    Science.gov (United States)

    Santos, Gisele Aparecida Alves Corral Dos; Bocchi, Silvia Cristina Mangini

    2017-01-01

    To characterize cancellations of elective surgeries according to clinical and non-clinical reasons, as well as to verify seasonal influence and determine the estimated possible reduction of the index. Quantitative, descriptive and retrospective study with secondary data extracted from the database of a public hospital of the State of São Paulo. Out of the 8,443 (100%) elective surgeries scheduled, 7,870 (93.21%) were performed and 573 (6.79%) were canceled. Of these 573 (100%) cancellations, 48.33% were for clinical reasons and 46.40% for non-clinical reasons. Among the non-clinical reasons for surgery cancellations, those related to medical decisions stood out: at the request of the surgeon/change of approach (17.93%), followed by patient not admitted (8.96%). There was no indication of seasonality in the reasons for cancellation over the assessed period. Although the rate of elective surgery cancellations is lower than that of other hospitals with similar characteristics, it is still possible to reduce it from 6.79% to 1.36%, considering that 80% of the reasons for cancellation are avoidable.

  19. Data Reduction with Quantization Constraints for Decentralized Estimation in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yang Weng

    2014-01-01

    The unknown vector estimation problem in bandwidth-constrained wireless sensor networks is considered. In such networks, sensor nodes make distributed observations on the unknown vector and collaborate with a fusion center to generate a final estimate. Due to power and communication bandwidth limitations, each sensor node must compress its data and transmit it to the fusion center. In this paper, both centralized and decentralized estimation frameworks are developed. A closed-form solution for the centralized estimation framework is proposed. The decentralized estimation problem is proven to be NP-hard, and a Gauss-Seidel algorithm to search for an optimal solution is proposed. Simulation results show the good performance of the proposed algorithms.

  20. Estimating the Value of Price Risk Reduction in Energy Efficiency Investments in Buildings

    Directory of Open Access Journals (Sweden)

    Pekka Tuominen

    2017-10-01

    This paper presents a method for calculating the value to a consumer of the price risk reduction that can be achieved with investments in energy efficiency. The value of price risk reduction is discussed at some length in general terms in the literature reviewed but, so far, no methodology for calculating this value has been presented. Here we suggest such a method. The problem of valuing price risk reduction is approached using a variation of the Black–Scholes model, by considering a hypothetical financial instrument that a consumer would purchase to insure herself against unexpected price hikes. This hypothetical instrument is then compared with an actual energy efficiency investment that reaches the same level of price risk reduction. To demonstrate the usability of the method, case examples are calculated for typical single-family houses in Finland. The results show that the price risk entailed in household energy consumption can be reduced by a meaningful amount with energy efficiency investments, and that the monetary value of this reduction can be calculated. It is argued that this often-overlooked benefit of energy efficiency investments merits more consideration in future studies.
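
    To give the flavour of the valuation, here is the plain Black-Scholes price of a European call, the kind of hypothetical insurance instrument the paper describes; the paper uses a variation of the model, and every input below is invented rather than taken from the Finnish case examples.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Energy at 100 EUR/MWh today; insure against it exceeding 120 EUR/MWh in 5 years.
print(bs_call(S=100.0, K=120.0, T=5.0, r=0.02, sigma=0.25))
```

    An energy efficiency investment that removes the same exposure is then worth at least this option premium per insured unit of consumption, which is the comparison the paper formalises.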

  1. A tool to estimate bar patterns and flow conditions in estuaries when limited data is available

    Science.gov (United States)

    Leuven, J.; Verhoeve, S.; Bruijns, A. J.; Selakovic, S.; van Dijk, W. M.; Kleinhans, M. G.

    2017-12-01

    The effects of human interventions, the natural evolution of estuaries and rising sea level on food security and flood safety are largely unknown. In addition, ecologists require quantified habitat area to study the future evolution of estuaries, but they lack predictive capability for bathymetry and hydrodynamics. For example, crucial inputs required for ecological models are values of intertidal area, inundation time, peak flow velocities and salinity. While numerical models can reproduce these spatial patterns, their computational times are long and for each case a new model must be developed. Therefore, we developed a comprehensive set of relations that accurately predict the hydrodynamics and the patterns of channels and bars, using a combination of empirical relations derived from approximately 50 estuaries and theory for bars and estuaries. The first step is to predict the local tidal prism, i.e. the tidal prism that flows through a given cross-section. Second, the channel geometry is predicted from the tidal prism and hydraulic geometry relations. Subsequently, typical flow velocities can be estimated from the channel geometry and tidal prism. Then, an ideal estuary shape is fitted to the measured planform: the deviation from the ideal shape, defined as the excess width, gives a measure of the locations where tidal bars form and their summed width (Leuven et al., 2017). From the excess width, typical hypsometries can be predicted per cross-section. In the last step, flow velocities are calculated for the full range of occurring depths, and salinity is calculated based on the estuary shape. Here, we will present a prototype tool that predicts equilibrium bar patterns and typical flow conditions. The tool is easy to use because the only input required is the estuary outline and the tidal amplitude; therefore it can be used by policy makers and researchers from multiple disciplines, such as ecologists, geologists and hydrologists, for example for paleogeographic

  2. Neural networks for the dimensionality reduction of GOME measurement vector in the estimation of ozone profiles

    International Nuclear Information System (INIS)

    Del Frate, F.; Iapaolo, M.; Casadio, S.; Godin-Beekmann, S.; Petitdidier, M.

    2005-01-01

    Dimensionality reduction can be of crucial importance in the application of inversion schemes to atmospheric remote sensing data. In this study the problem of dimensionality reduction in the retrieval of ozone concentration profiles from the radiance measurements provided by the instrument Global Ozone Monitoring Experiment (GOME) on board the ESA satellite ERS-2 is considered. By means of radiative transfer modelling, neural networks and pruning algorithms, a complete procedure has been designed to extract the GOME spectral ranges most crucial for the inversion. The quality of the resulting retrieval algorithm has been evaluated by comparing its performance to that yielded by other schemes and with co-located profiles obtained from lidar measurements.

  3. PROMAB-GIS: A GIS based Tool for Estimating Runoff and Sediment Yield in running Waters

    Science.gov (United States)

    Jenewein, S.; Rinderer, M.; Ploner, A.; Sönser, T.

    2003-04-01

    In recent times, settlements have expanded and traffic and tourist activities have increased in most alpine regions. As a consequence, on the one hand humans and goods are affected by natural hazard processes more often, while on the other hand the demand for protection through both technical constructions and planning measures carried out by public authorities is growing. This situation results in an ever stronger need for reproducibility, comparability and transparency in all methods applied in modern natural hazard management. As a contribution to a new way of coping with this situation, Promab-GIS Version 1.0 has been developed. Promab-GIS has been designed as a model for the time- and space-dependent determination of both runoff and bedload transport in rivers of small alpine catchment areas. The estimation of the unit hydrograph relies upon the "rational formula" and the time-area curves of the watershed. The time-area diagram is a graph of cumulative drainage area contributing to discharge at the watershed outlet within a specified time of travel. The sediment yield is estimated for each cell of the channel network by determining the actual process type (erosion, transport or accumulation). Two types of transport processes are considered: sediment transport and debris flows. All functions of Promab-GIS are integrated into the graphical user interface of ArcView as pull-up menus and tool buttons. Hence the application of Promab-GIS does not rely on sophisticated knowledge of GIS in general or of the ArcView software in particular. However, despite the use of computer assistance, Promab-GIS is still an expert support system: in order to obtain plausible results, the users must be familiar with all the relevant processes controlling runoff and sediment yield in torrent catchments.
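
    The rational formula at the core of the unit-hydrograph estimation is simple enough to state exactly; only the unit conversion needs care. Values are illustrative.

```python
def rational_peak_flow(c, i_mm_per_h, area_km2):
    """Rational formula Q = c * i * A in SI units: runoff coefficient c [-],
    rainfall intensity i [mm/h], catchment area A [km^2]; returns Q [m^3/s]."""
    i_m_per_s = i_mm_per_h / 1000.0 / 3600.0
    return c * i_m_per_s * (area_km2 * 1e6)

print(rational_peak_flow(0.4, 30.0, 2.5))   # ~8.3 m^3/s
```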

  4. Chemiluminescence analyzer of NOx as a high-throughput screening tool in selective catalytic reduction of NO

    International Nuclear Information System (INIS)

    Oh, Kwang Seok; Woo, Seong Ihl

    2011-01-01

    A chemiluminescence-based analyzer of NOx gas species has been applied for high-throughput screening of a library of catalytic materials. The applicability of the commercial NOx analyzer as a rapid screening tool was evaluated using selective catalytic reduction of NO gas. A library of 60 binary alloys composed of Pt and Co, Zr, La, Ce, Fe or W on an Al2O3 substrate was tested for NOx removal efficiency using a home-built 64-channel parallel and sequential tubular reactor. The NOx concentrations measured by the NOx analyzer agreed well with the results obtained using micro gas chromatography for a reference catalyst consisting of 1 wt% Pt on γ-Al2O3. Most alloys showed high efficiency at 275 °C, which is typical of Pt-based catalysts for selective catalytic reduction of NO. Screening with the NOx analyzer allowed the selection of Pt-Ce (X) (X=1–3) and Pt–Fe (2) as the optimal catalysts for NOx removal: 73% NOx conversion was achieved with the Pt–Fe (2) alloy, which was much better than the results for the reference catalyst and the other library alloys. This study demonstrates a sequential high-throughput method for the practical evaluation of catalysts for the selective reduction of NO.

  5. Microfinance As A Tool For Poverty Reduction: A Study Of Jordan

    Directory of Open Access Journals (Sweden)

    Žiaková M.

    2015-12-01

    The aim of this study is to evaluate the impact of microfinance on the poor, particularly in the specific areas of the economic and social development of people and their employment. The research was carried out in Jordan, a country with a well-developed microfinance sector. The results have shown that microfinance has led to an improvement in the financial and social situation of the poor, especially for female clients of microfinance institutions. Interestingly, the higher income of clients has not caused higher expenditure on their basic needs; rather, people have generated savings for their future and used the additional money for education. According to the results of the microfinance impact assessment, it can be assumed that people, particularly females, prefer to improve the social situation of future generations. Based on this finding, we consider microfinance an effective tool for breaking the vicious circle of poverty, especially in Jordan. Furthermore, microcredits have facilitated increased employment among the poor, mainly through self-employment. It is believed that further expansion of microcredit will lead to the development of small businesses, with a promising impact on employability across the population.

  6. Stress Reduction in Postcardiac Surgery Family Members: Implementation of a Postcardiac Surgery Tool Kit.

    Science.gov (United States)

    Breisinger, Lauren; Macci Bires, Angela; Cline, Thomas W

    The intensive care unit (ICU) can be a place of stress, anxiety, and emotional instability for both patients and families. Medical and nursing care during this acute time is patient focused, and family members are often left in the dark. Unintentional exclusion from information results in high levels of stress, anxiety, and uncertainty for families. Due to the acuity of illness, family members of cardiac surgery patients experience the highest levels of stress. Spouses may experience intense psychosomatic symptoms such as depression, anxiety, and fear for several months after the surgery. The purpose of this study was to decrease these feelings of anxiety in family members of postcardiac surgery patients through the use of a cardiac surgery tool kit. The study was a quality improvement project utilizing a convenience sample of 83 participants aged 18 years and older. Participants were asked to use the State Trait Anxiety Inventory (STAI) Form Y-1 (state anxiety) to rate their anxiety level preintervention and then again postintervention. Data were collected over a 6-month period. Descriptive data including age, education level, ethnicity, relationship, experience in the ICU, and active diagnoses of mental disorders did not affect the changes in the pre- and posttest data. A paired t test was conducted on the sample to assess changes in state anxiety, using the STAI Form Y-1. The results were statistically significant (t = 11.97, df = 81), indicating a reduction in state anxiety among family members of postcardiac surgery patients.

  7. Iterative PSF Estimation and Its Application to Shift Invariant and Variant Blur Reduction

    OpenAIRE

    Seung-Won Jung; Byeong-Doo Choi; Sung-Jea Ko

    2009-01-01

    Among image restoration approaches, image deconvolution has been considered a powerful solution. In image deconvolution, a point spread function (PSF), which describes the blur of the image, needs to be determined. Therefore, in this paper, we propose an iterative PSF estimation algorithm which is able to estimate an accurate PSF. In real-world motion-blurred images, a simple parametric model of the PSF fails when a camera moves in an arbitrary direction with an inconsistent speed during an e...

  8. Exploring the effects of dimensionality reduction in deep networks for force estimation in robotic-assisted surgery

    Science.gov (United States)

    Aviles, Angelica I.; Alsaleh, Samar; Sobrevilla, Pilar; Casals, Alicia

    2016-03-01

    The Robotic-Assisted Surgery approach overcomes the limitations of traditional laparoscopic and open surgeries. However, one of its major limitations is the lack of force feedback. Since there is no direct interaction between the surgeon and the tissue, there is no way of knowing how much force the surgeon is applying, which can result in irreversible injuries. The use of force sensors is not practical since they impose additional constraints. Thus, we make use of a neuro-visual approach to estimate the applied forces, in which the 3D shape recovery together with the geometry of motion are used as input to a deep network based on an LSTM-RNN architecture. When deep networks are used in real time, pre-processing of data is a key factor in reducing complexity and improving network performance. A common pre-processing step is dimensionality reduction, which attempts to eliminate redundant and insignificant information by selecting a subset of relevant features to use in model construction. In this work, we show the effects of dimensionality reduction in a real-time application: estimating the applied force in Robotic-Assisted Surgeries. According to the results, we demonstrate positive effects of dimensionality reduction on deep networks, including faster training, improved network performance, and prevention of overfitting. We also show a significant accuracy improvement, ranging from about 33% to 86%, over existing approaches to force estimation.
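
    As a generic illustration of the pre-processing step described here (not the authors' pipeline), PCA can shrink a correlated feature matrix before it is fed to a network; the synthetic data stand in for the 3D-shape and motion features:

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(1000, 10))       # 10 underlying factors
        X = latent @ rng.normal(size=(10, 120))    # 120 correlated input features

        pca = PCA(n_components=0.99)               # keep 99% of the variance
        X_reduced = pca.fit_transform(X)
        print(X.shape, "->", X_reduced.shape)      # (1000, 120) -> (1000, ~10)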

  9. Mobile Health Devices as Tools for Worldwide Cardiovascular Risk Reduction and Disease Management

    Science.gov (United States)

    Piette, John D.; List, Justin; Rana, Gurpreet K.; Townsend, Whitney; Striplin, Dana; Heisler, Michele

    2016-01-01

    We examined evidence on whether mobile health (mHealth) tools, including Interactive Voice Response (IVR) calls, short message service (SMS) or text messaging, and smartphones, can improve lifestyle behaviors and management related to cardiovascular diseases throughout the world. We conducted a state-of-the-art review and literature synthesis of peer-reviewed and grey literature published since 2004. The review prioritized randomized trials and studies focused on cardiovascular diseases and risk factors, but included other reports when they represented the best available evidence. The search emphasized reports on the potential benefits of mHealth interventions implemented in low- and middle-income countries (LMICs). IVR and SMS interventions can improve cardiovascular preventive care in developed countries by addressing risk factors including weight, smoking, and physical activity. IVR and SMS-based interventions for cardiovascular disease management also have shown benefits with respect to hypertension management, hospital readmissions, and diabetic glycemic control. Multi-modal interventions including web-based communication with clinicians and mHealth-enabled clinical monitoring with feedback also have shown benefits. The evidence regarding the potential benefits of interventions using smartphones and social media is still developing. Studies of mHealth interventions have been conducted in more than 30 LMICs, and evidence to date suggests that programs are feasible and may improve medication adherence and disease outcomes. Emerging evidence suggests that mHealth interventions may improve cardiovascular-related lifestyle behaviors and disease management. Next generation mHealth programs developed worldwide should be based on evidence-based behavioral theories and incorporate advances in artificial intelligence for adapting systems automatically to patients’ unique and changing needs. PMID:26596977

  10. Mobile Health Devices as Tools for Worldwide Cardiovascular Risk Reduction and Disease Management.

    Science.gov (United States)

    Piette, John D; List, Justin; Rana, Gurpreet K; Townsend, Whitney; Striplin, Dana; Heisler, Michele

    2015-11-24

    We examined evidence on whether mobile health (mHealth) tools, including interactive voice response calls, short message service, or text messaging, and smartphones, can improve lifestyle behaviors and management related to cardiovascular diseases throughout the world. We conducted a state-of-the-art review and literature synthesis of peer-reviewed and gray literature published since 2004. The review prioritized randomized trials and studies focused on cardiovascular diseases and risk factors, but included other reports when they represented the best available evidence. The search emphasized reports on the potential benefits of mHealth interventions implemented in low- and middle-income countries. Interactive voice response and short message service interventions can improve cardiovascular preventive care in developed countries by addressing risk factors including weight, smoking, and physical activity. Interactive voice response and short message service-based interventions for cardiovascular disease management also have shown benefits with respect to hypertension management, hospital readmissions, and diabetic glycemic control. Multimodal interventions including Web-based communication with clinicians and mHealth-enabled clinical monitoring with feedback also have shown benefits. The evidence regarding the potential benefits of interventions using smartphones and social media is still developing. Studies of mHealth interventions have been conducted in >30 low- and middle-income countries, and evidence to date suggests that programs are feasible and may improve medication adherence and disease outcomes. Emerging evidence suggests that mHealth interventions may improve cardiovascular-related lifestyle behaviors and disease management. Next-generation mHealth programs developed worldwide should be based on evidence-based behavioral theories and incorporate advances in artificial intelligence for adapting systems automatically to patients' unique and changing needs

  11. Transit boardings estimation and simulation tool (TBEST) calibration for guideway and BRT modes : [summary].

    Science.gov (United States)

    2013-06-01

    As demand for public transportation grows, planning tools used by Florida Department of Transportation (FDOT) and other transit agencies must evolve to effectively predict changing patterns of ridership. A key tool for this purpose is the Transit Boa...

  12. Pedestrians' estimates of their own nighttime conspicuity are unaffected by severe reductions in headlight illumination.

    Science.gov (United States)

    Whetsel Borzendowski, Stephanie A; Rosenberg, Rachel L; Sewall, Ashley Stafford; Tyrrell, Richard A

    2013-12-01

    At night pedestrians tend to overestimate their conspicuity to oncoming drivers, but little is known about factors affecting pedestrians' conspicuity estimates. This study examines how headlamp intensity and pedestrians' clothing influence judgments of their own conspicuity. Forty-eight undergraduate students estimated their own conspicuity on an unilluminated closed road by walking in front of a stationary vehicle to the point at which they judged that they were just recognizable to the driver. Unknown to the participants, high beam intensity was manipulated between subjects by placing neutral density filters on the headlamps. Estimated conspicuity distances did not significantly vary with changes in headlamp intensity even when only 3% of the illumination from the headlamps was present. These findings underscore the need to educate pedestrians about the visual challenges that drivers face at night and the need to minimize pedestrians' exposure to traffic flow at night. © 2013.

  13. Filtering Methods for Error Reduction in Spacecraft Attitude Estimation Using Quaternion Star Trackers

    Science.gov (United States)

    Calhoun, Philip C.; Sedlak, Joseph E.; Superfin, Emil

    2011-01-01

    Precision attitude determination for recent and planned space missions typically includes quaternion star trackers (ST) and a three-axis inertial reference unit (IRU). Sensor selection is based on estimates of knowledge accuracy attainable from a Kalman filter (KF), which provides the optimal solution for the case of linear dynamics with measurement and process errors characterized by random Gaussian noise with white spectrum. Non-Gaussian systematic errors in quaternion STs are often quite large and have an unpredictable time-varying nature, particularly when used in non-inertial pointing applications. Two filtering methods are proposed to reduce the attitude estimation error resulting from ST systematic errors, 1) extended Kalman filter (EKF) augmented with Markov states, 2) Unscented Kalman filter (UKF) with a periodic measurement model. Realistic assessments of the attitude estimation performance gains are demonstrated with both simulation and flight telemetry data from the Lunar Reconnaissance Orbiter.
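
    Both proposed filters build on the standard linear Kalman measurement update; the Markov-state augmentation and the periodic measurement model are refinements of it. A generic sketch of that update step (not the flight software):

        import numpy as np

        def kf_update(x, P, z, H, R):
            """One linear Kalman measurement update: state estimate x,
            covariance P, measurement z with model H and noise covariance R."""
            y = z - H @ x                       # innovation
            S = H @ P @ H.T + R                 # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
            x_new = x + K @ y
            P_new = (np.eye(len(x)) - K @ H) @ P
            return x_new, P_new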

  14. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    Science.gov (United States)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  15. Effect of large weight reductions on measured and estimated kidney function

    DEFF Research Database (Denmark)

    von Scholten, Bernt Johan; Persson, Frederik; Svane, Maria S

    2017-01-01

    … GFR (creatinine-based equations), whereas measured GFR (mGFR) and cystatin C-based eGFR would be unaffected if adjusted for body surface area. METHODS: Prospective intervention study including 19 patients. All attended a baseline visit before gastric bypass surgery, followed by a visit six months post-surgery. […] mGFR adjusted for body surface area was unchanged. Estimates of GFR based on creatinine overestimate renal function, likely due to changes in muscle mass, whereas cystatin C-based estimates are unaffected. TRIAL REGISTRATION: ClinicalTrials.gov, NCT02138565. Date of registration: March 24, 2014.

  16. Clinical evaluation of a commercial orthopedic metal artifact reduction tool for CT simulations in radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Li Hua; Noel, Camille; Chen, Haijian; Harold Li, H.; Low, Daniel; Moore, Kevin; Klahr, Paul; Michalski, Jeff; Gay, Hiram A.; Thorstad, Wade; Mutic, Sasa [Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States); Department of Radiation Oncology, University of California Los Angeles, Los Angeles, California 90095 (United States); Department of Radiation Oncology, University of California San Diego, San Diego, California 92093 (United States); Philips Healthcare System, Cleveland, Ohio 44143 (United States); Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States)

    2012-12-15

    Purpose: Severe artifacts in kilovoltage-CT simulation images caused by large metallic implants can significantly degrade the conspicuity and apparent CT Hounsfield number of targets and anatomic structures, jeopardize the confidence of anatomical segmentation, and introduce inaccuracies into the radiation therapy treatment planning process. This study evaluated the performance of the first commercial orthopedic metal artifact reduction function (O-MAR) for radiation therapy, and investigated its clinical applications in treatment planning. Methods: Both phantom and clinical data were used for the evaluation. The CIRS electron density phantom with known physical (and electron) density plugs and removable titanium implants was scanned on a Philips Brilliance Big Bore 16-slice CT simulator. The CT Hounsfield numbers of density plugs on both uncorrected and O-MAR corrected images were compared. Treatment planning accuracy was evaluated by comparing simulated dose distributions computed using the true density images, uncorrected images, and O-MAR corrected images. Ten CT image sets of patients with large hip implants were processed with the O-MAR function and evaluated by two radiation oncologists using a five-point score for overall image quality, anatomical conspicuity, and CT Hounsfield number accuracy. By utilizing the same structure contours delineated from the O-MAR corrected images, clinical IMRT treatment plans for five patients were computed on the uncorrected and O-MAR corrected images, respectively, and compared. Results: Results of the phantom study indicated that CT Hounsfield number accuracy and noise were improved on the O-MAR corrected images, especially for images with bilateral metal implants. The γ pass rates of the simulated dose distributions computed on the uncorrected and O-MAR corrected images referenced to those of the true densities were higher than 99.9% (even when using 1% and 3 mm distance-to-agreement criterion), suggesting that dose

  17. Using the soil and water assessment tool to estimate achievable water quality targets through implementation of beneficial management practices in an agricultural watershed.

    Science.gov (United States)

    Yang, Qi; Benoy, Glenn A; Chow, Thien Lien; Daigle, Jean-Louis; Bourque, Charles P-A; Meng, Fan-Rui

    2012-01-01

    Runoff from crop production in agricultural watersheds can cause widespread soil loss and degradation of surface water quality. Beneficial management practices (BMPs) for soil conservation are often implemented as remedial measures because BMPs can reduce soil erosion and improve water quality. However, the efficacy of BMPs may be unknown because it can be affected by many factors, such as farming practices, land-use, soil type, topography, and climatic conditions. As such, it is difficult to estimate the impacts of BMPs on water quality through field experiments alone. In this research, the Soil and Water Assessment Tool was used to estimate achievable performance targets of water quality indicators (sediment and soluble P loadings) after implementation of combinations of selected BMPs in the Black Brook Watershed in northwestern New Brunswick, Canada. Four commonly used BMPs (flow diversion terraces [FDTs], fertilizer reductions, tillage methods, and crop rotations), were considered individually and in different combinations. At the watershed level, the best achievable sediment loading was 1.9 t ha(-1) yr(-1) (89% reduction compared with default scenario), with a BMP combination of crop rotation, FDT, and no-till. The best achievable soluble P loading was 0.5 kg ha(-1) yr(-1) (62% reduction), with a BMP combination of crop rotation and FDT and fertilizer reduction. Targets estimated through nonpoint source water quality modeling can be used to evaluate BMP implementation initiatives and provide milestones for the rehabilitation of streams and rivers in agricultural regions. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  18. Reduction of air pollutants - a tool for control of atmospheric corrosion

    Directory of Open Access Journals (Sweden)

    Kucera, V.

    2003-12-01

    Full Text Available In most urban areas in Europe and North America, serious corrosion impacts on buildings and cultural monuments have been caused by emissions of pollutants. The rapidly increasing pollution levels in many developing countries also pose a serious threat to materials. Besides the very important role of SO2, the direct or synergistic effects of NOx and O3, particulates, and rain acidity may contribute substantially to materials degradation. Results from extensive international field exposure programs, e.g. within the UN/ECE, have enabled the development of dose-response relations which describe the effect of dry and wet deposition of pollutants on the corrosion of different material groups. In most industrialized countries, decreasing trends of sulphur and nitrogen pollutants and of precipitation acidity have resulted in decreased corrosion rates. The concept of acceptable levels of pollutants is a useful tool for planning abatement strategies and for defining conditions for the sustainable performance of structures exposed to the atmosphere.


  19. An automated A-value measurement tool for accurate cochlear duct length estimation.

    Science.gov (United States)

    Iyaniwura, John E; Elfarnawany, Mai; Ladak, Hanif M; Agrawal, Sumit K

    2018-01-22

    There has been renewed interest in the cochlear duct length (CDL) for preoperative cochlear implant electrode selection and postoperative generation of patient-specific frequency maps. The CDL can be estimated by measuring the A-value, which is defined as the length between the round window and the furthest point on the basal turn. Unfortunately, there is significant intra- and inter-observer variability when these measurements are made clinically. The objective of this study was to develop an automated A-value measurement algorithm to improve accuracy and eliminate observer variability. Clinical and micro-CT images of 20 cadaveric cochleae specimens were acquired. The micro-CT of one sample was chosen as the atlas, and A-value fiducials were placed onto that image. Image registration (rigid affine and non-rigid B-spline) was applied between the atlas and the 19 remaining clinical CT images. The registration transform was applied to the A-value fiducials, and the A-value was then automatically calculated for each specimen. High resolution micro-CT images of the same 19 specimens were used to measure the gold standard A-values for comparison against the manual and automated methods. The registration algorithm had excellent qualitative overlap between the atlas and target images. The automated method eliminated the observer variability and the systematic underestimation by experts. Manual measurement of the A-value on clinical CT had a mean error of 9.5 ± 4.3% compared to micro-CT, and this improved to an error of 2.7 ± 2.1% using the automated algorithm. Both the automated and manual methods correlated significantly with the gold standard micro-CT A-values (r = 0.70, p < 0.05). An automated A-value measurement tool using atlas-based registration methods was successfully developed and validated. The automated method eliminated the observer variability and improved accuracy as compared to manual measurements by experts. This open-source tool has the potential to benefit
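
    The error metric used above is a simple percentage deviation from the micro-CT gold standard; a sketch with invented A-values (mm):

        import numpy as np

        gold = np.array([9.1, 9.4, 8.8, 9.0])      # micro-CT gold standard
        manual = np.array([8.2, 8.6, 7.9, 8.3])    # expert measurements on clinical CT
        auto = np.array([9.0, 9.2, 8.6, 8.8])      # atlas-registration estimates

        def pct_error(est, ref):
            return 100.0 * np.abs(est - ref) / ref

        for name, est in [("manual", manual), ("auto", auto)]:
            e = pct_error(est, gold)
            print(f"{name}: {e.mean():.1f} +/- {e.std():.1f} %")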

  20. Reduction of Topographic Effect for Curve Number Estimated from Remotely Sensed Imagery

    Science.gov (United States)

    Zhang, Wen-Yan; Lin, Chao-Yuan

    2016-04-01

    The Soil Conservation Service Curve Number (SCS-CN) method is commonly used in hydrology to estimate direct runoff volume. The CN is an empirical parameter corresponding to land use/land cover, hydrologic soil group, and antecedent soil moisture condition. In large watersheds with complex topography, satellite remote sensing is an appropriate approach for acquiring land use change information. However, topographic effects are common in remotely sensed imagery and distort land use classification. This research selected summer and winter scenes of Landsat-5 TM from 2008 to classify land use in the Chen-You-Lan Watershed, Taiwan. The b-correction, an empirical topographic correction method, was applied to the Landsat-5 TM data. Land use was categorized with K-means classification into 4 groups, i.e. forest, grassland, agriculture, and river. Accuracy assessment of the image classification was performed against the national land use map. The results showed that after topographic correction, the overall accuracy of the classification increased from 68.0% to 74.5%. The average CN estimated from remotely sensed imagery decreased from 48.69 to 45.35, whereas the average CN estimated from the national LULC map was 44.11. Therefore, the topographic correction method is recommended for normalizing the topographic effect in satellite remote sensing data before estimating the CN.
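
    For reference, the standard SCS-CN runoff equations behind these CN values (with the usual Ia = 0.2S assumption and depths in inches) are:

        def scs_runoff(P, CN):
            """Direct runoff depth (inches) from rainfall depth P (inches)."""
            S = 1000.0 / CN - 10.0      # potential maximum retention
            Ia = 0.2 * S                # initial abstraction
            if P <= Ia:
                return 0.0
            return (P - Ia) ** 2 / (P - Ia + S)

        # Effect of the topographic correction reported above on a 4-inch storm:
        print(scs_runoff(4.0, 48.69), scs_runoff(4.0, 45.35))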

  1. Iterative PSF Estimation and Its Application to Shift Invariant and Variant Blur Reduction

    Directory of Open Access Journals (Sweden)

    Seung-Won Jung

    2009-01-01

    Full Text Available Among image restoration approaches, image deconvolution has been considered a powerful solution. In image deconvolution, a point spread function (PSF), which describes the blur of the image, needs to be determined. Therefore, in this paper, we propose an iterative PSF estimation algorithm which is able to estimate an accurate PSF. In real-world motion-blurred images, a simple parametric model of the PSF fails when a camera moves in an arbitrary direction with an inconsistent speed during an exposure time. Moreover, the PSF normally changes with spatial location. In order to accurately estimate the complex PSF of a real motion blurred image, we iteratively update the PSF by using a directional spreading operator. The directional spreading is applied to the PSF when it reduces the amount of the blur and the restoration artifacts. Then, to generalize the proposed technique to the linear shift variant (LSV) model, a piecewise invariant approach is adopted by the proposed image segmentation method. Experimental results show that the proposed method effectively estimates the PSF and restores the degraded images.
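
    The directional-spreading PSF update itself is not reproduced here. As a rough stand-in for the deconvolution side of the pipeline, a standard Richardson-Lucy restoration with a known PSF (scikit-image) looks like this; the test image and box PSF are invented:

        import numpy as np
        from scipy.signal import convolve2d
        from skimage import restoration

        rng = np.random.default_rng(0)
        image = rng.random((64, 64))
        psf = np.ones((5, 5)) / 25.0                   # known 5x5 box blur
        blurred = convolve2d(image, psf, mode="same")

        restored = restoration.richardson_lucy(blurred, psf, num_iter=30)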

  2. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end, a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), turning it into a user-friendly tool for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models including: the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple. A normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average
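
    The figure of merit such models predict is, at its simplest, the ratio of the peak defect signal to the RMS grain noise; a generic sketch of that definition:

        import numpy as np

        def snr_db(signal_trace, noise_trace):
            """Peak-signal to RMS-grain-noise ratio in decibels."""
            peak = np.max(np.abs(signal_trace))
            rms = np.sqrt(np.mean(np.square(noise_trace)))
            return 20.0 * np.log10(peak / rms)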

  3. Upscaling Our Approach to Peatland Carbon Sequestration: Remote Sensing as a Tool for Carbon Flux Estimation.

    Science.gov (United States)

    Lees, K.; Khomik, M.; Clark, J. M.; Quaife, T. L.; Artz, R.

    2017-12-01

    Peatlands are an important part of the Earth's carbon cycle, comprising approximately a third of the global terrestrial carbon store. However, peatlands are sensitive to climatic change and human mismanagement, and many are now degraded and acting as carbon sources. Restoration work is being undertaken at many sites around the world, but monitoring the success of these schemes can be difficult and costly using traditional methods. A landscape-scale alternative is to use satellite data in order to assess the condition of peatlands and estimate carbon fluxes. This work focuses on study sites in Northern Scotland, where parts of the largest blanket bog in Europe are being restored from forest plantations. A combination of laboratory and fieldwork has been used to assess the Net Ecosystem Exchange (NEE), Gross Primary Productivity (GPP) and respiration of peatland sites in different conditions, and the climatic vulnerability of key peat-forming Sphagnum species. The results from these studies have been compared with spectral data in order to evaluate the extent to which remote sensing can function as a source of information for peatland health and carbon flux models. This work considers particularly the effects of scale in calculating peatland carbon flux. Flux data includes chamber and eddy covariance measurements of carbon dioxide, and radiometric observations include both handheld spectroradiometer results and satellite images. Results suggest that despite the small-scale heterogeneity and unique ecosystem factors in blanket bogs, remote sensing can be a useful tool in monitoring peatland health and carbon sequestration. In particular, this study gives unique insights into the relationships between peatland vegetation, carbon flux and spectral reflectance.

  4. Binaural noise reduction via cue-preserving MMSE filter and adaptive-blocking-based noise PSD estimation

    Science.gov (United States)

    Azarpour, Masoumeh; Enzner, Gerald

    2017-12-01

    Binaural noise reduction, with applications for instance in hearing aids, has been a very significant challenge. This task relates to the optimal utilization of the available microphone signals for the estimation of the ambient noise characteristics and for the optimal filtering algorithm to separate the desired speech from the noise. The additional requirements of low computational complexity and low latency further complicate the design. A particular challenge results from the desired reconstruction of binaural speech input with spatial cue preservation. The latter essentially diminishes the utility of multiple-input/single-output filter-and-sum techniques such as beamforming. In this paper, we propose a comprehensive and effective signal processing configuration with which most of the aforementioned criteria can be met suitably. This relates especially to the requirement of efficient online adaptive processing for noise estimation and optimal filtering while preserving the binaural cues. Regarding noise estimation, we consider three different architectures: interaural (ITF), cross-relation (CR), and principal-component (PCA) target blocking. An objective comparison with two other noise PSD estimation algorithms demonstrates the superiority of the blocking-based noise estimators, especially the CR-based and ITF-based blocking architectures. Moreover, we present a new noise reduction filter based on minimum mean-square error (MMSE), which belongs to the class of common gain filters, hence being rigorous in terms of spatial cue preservation but also efficient and competitive for the acoustic noise reduction task. A formal real-time subjective listening test procedure is also developed in this paper. The proposed listening test enables a real-time assessment of the proposed computationally efficient noise reduction algorithms in a realistic acoustic environment, e.g., considering time-varying room impulse responses and the Lombard effect. The listening test outcome
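
    The defining property of a common-gain filter is that one real-valued spectral gain is applied to the left and right channels alike, so interaural level and phase differences survive. The sketch below illustrates only that property, with a simplified spectral-subtraction-style gain rather than the paper's MMSE estimator or its blocking-based noise PSD estimators; all names are illustrative:

        import numpy as np

        def common_gain(L, R, noise_psd, eps=1e-12):
            """L, R: complex STFT frames; noise_psd: estimated noise PSD per bin.
            The same gain multiplies both ears, preserving the binaural cues."""
            psd = 0.5 * (np.abs(L) ** 2 + np.abs(R) ** 2)
            gain = np.maximum(psd - noise_psd, 0.0) / (psd + eps)
            return gain * L, gain * R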

  5. Something from nothing: Estimating consumption rates using propensity scores, with application to emissions reduction policies.

    Directory of Open Access Journals (Sweden)

    Nicholas Bardsley

    Full Text Available Consumption surveys often record zero purchases of a good because of a short observation window. Measures of distribution are then precluded and only mean consumption rates can be inferred. We show that Propensity Score Matching can be applied to recover the distribution of consumption rates. We demonstrate the method using the UK National Travel Survey, in which c.40% of motorist households purchase no fuel. Estimated consumption rates are plausible judging by households' annual mileages, and highly skewed. We apply the same approach to estimate CO2 emissions and outcomes of a carbon cap or tax. Reliance on means apparently distorts analysis of such policies because of skewness of the underlying distributions. The regressiveness of a simple tax or cap is overstated, and redistributive features of a revenue-neutral policy are understated.
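
    The matching idea can be sketched as follows: fit a purchase-propensity model, then impute a consumption rate for each zero-purchase household from its nearest purchasing neighbour in propensity-score space. A generic illustration with synthetic data, not the authors' code:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import NearestNeighbors

        rng = np.random.default_rng(1)
        X = rng.normal(size=(500, 4))                    # household covariates
        buy = (rng.random(500) < 0.6).astype(int)        # fuel bought in window?
        amount = np.where(buy, np.exp(rng.normal(3.0, 0.5, 500)), 0.0)

        ps = LogisticRegression().fit(X, buy).predict_proba(X)[:, 1]
        buyers, zeros = np.where(buy == 1)[0], np.where(buy == 0)[0]
        nn = NearestNeighbors(n_neighbors=1).fit(ps[buyers].reshape(-1, 1))
        _, idx = nn.kneighbors(ps[zeros].reshape(-1, 1))
        imputed = amount[buyers[idx.ravel()]]            # rates for zero-purchase households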

  6. Improvement of Bragg peak shift estimation using dimensionality reduction techniques and predictive linear modeling

    Science.gov (United States)

    Xing, Yafei; Macq, Benoit

    2017-11-01

    With the emergence of clinical prototypes and first patient acquisitions for proton therapy, research on prompt gamma imaging aims to make the most of prompt gamma data for in vivo estimation of any shift from the expected Bragg peak (BP). The simple problem of matching the measured prompt gamma profile of each pencil beam with a reference simulation from the treatment plan is actually made complex by uncertainties which can translate into distortions during treatment. We illustrate this challenge and demonstrate the robustness of a predictive linear model we proposed for BP shift estimation based on the principal component analysis (PCA) method. It considered the first clinical knife-edge slit camera design in use with anthropomorphic phantom CT data. In particular, 4115 error scenarios were simulated for the learning model. PCA was applied to the training input, randomly chosen from 500 scenarios, to eliminate data collinearities. A total variance of 99.95% was used to represent the testing input from 3615 scenarios. This model improved the BP shift estimation by an average of 63 ± 19% (range: -2.5% to 86%) compared to our previous profile shift (PS) method. The robustness of our method was demonstrated by a comparative study in which Poisson noise was applied 1000 times to each profile. 67% of the cases obtained by the learning model had lower prediction errors than those obtained by the PS method. The estimation accuracy ranged between 0.31 ± 0.22 mm and 1.84 ± 8.98 mm for the learning model, while for the PS method it ranged between 0.3 ± 0.25 mm and 20.71 ± 8.38 mm.
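
    The pipeline described here, PCA on the training inputs followed by a linear predictor on the retained components, amounts to principal-component regression. A compact NumPy sketch of that generic construction (not the clinical code):

        import numpy as np

        def pcr_fit(X, y, var_kept=0.9995):
            """Fit y ~ leading principal components of X (99.95% variance kept,
            as in the record). Returns column means, components and weights."""
            mu = X.mean(axis=0)
            U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
            k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), var_kept)) + 1
            Z = (X - mu) @ Vt[:k].T
            w, *_ = np.linalg.lstsq(np.c_[np.ones(len(Z)), Z], y, rcond=None)
            return mu, Vt[:k], w   # predict: w[0] + ((Xn - mu) @ Vt_k.T) @ w[1:]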

  7. The ISO 50001 Impact Estimator Tool (IET 50001 V1.1.4) - User Guide and Introduction to the ISO 50001 Impacts Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Therkelsen, Peter L. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Rao, Prakash [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee T. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-08-01

    ISO 50001, Energy management systems – Requirements with guidance for use, is an internationally developed standard that provides organizations with a flexible framework for implementing an energy management system (EnMS) with the goal of continual energy performance improvement. The ISO 50001 standard was first published in 2011 and has since seen growth in the number of certificates issued around the world, primarily in the industrial (agriculture, manufacturing, and mining) and service (commercial) sectors. Policy makers in many regions and countries are looking to or are already using ISO 50001 as a basis for energy efficiency, carbon reduction, and other energy performance improvement schemes. The Impact Estimator Tool 50001 (IET 50001 Tool) is a computational model developed to help researchers and policy makers determine the potential impact of ISO 50001 implementation in the industrial and service (commercial) sectors for a given region or country. The IET 50001 Tool is based upon a methodology initially developed by the Lawrence Berkeley National Laboratory that has been improved upon and vetted by a group of international researchers. By using a commonly accepted and transparent methodology, users of the IET 50001 Tool can easily and clearly communicate the potential impact of ISO 50001 for a region or country.

  8. A technical review of urban land use - transportation models as tools for evaluating vehicle travel reduction strategies

    Energy Technology Data Exchange (ETDEWEB)

    Southworth, F.

    1995-07-01

    The continued growth of highway traffic in the United States has led to unwanted urban traffic congestion as well as to noticeable urban air quality problems. These problems include emissions covered by the 1990 Clean Air Act Amendments (CAAA) and 1991 Intermodal Surface Transportation Efficiency Act (ISTEA), as well as carbon dioxide and related "greenhouse gas" emissions. Urban travel also creates a major demand for imported oil. Therefore, for economic as well as environmental reasons, transportation planning agencies at both the state and metropolitan area level are focusing a good deal of attention on urban travel reduction policies. Much discussed policy instruments include those that encourage fewer trip starts, shorter trip distances, shifts to higher-occupancy vehicles or to nonvehicular modes, and shifts in the timing of trips from the more to the less congested periods of the day or week. Some analysts have concluded that in order to bring about sustainable reductions in urban traffic volumes, significant changes will be necessary in the way our households and businesses engage in daily travel. Such changes are likely to involve changes in the ways we organize and use traffic-generating and -attracting land within our urban areas. The purpose of this review is to evaluate the ability of current analytic methods and models to support both the evaluation and possibly the design of such vehicle travel reduction strategies, including those strategies involving the reorganization and use of urban land. The review is organized into three sections. Section 1 describes the nature of the problem we are trying to model, Section 2 reviews the state of the art in operational urban land use-transportation simulation models, and Section 3 provides a critical assessment of such models as useful urban transportation planning tools. A number of areas are identified where further model development or testing is required.

  9. A user's manual of Tools for Error Estimation of Complex Number Matrix Computation (Ver.1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi.

    1997-03-01

    'Tools for Error Estimation of Complex Number Matrix Computation' is a subroutine library which aids users in obtaining the error ranges of the solutions of complex-number linear systems or of the eigenvalues of Hermitian matrices. This library contains routines for both sequential computers and parallel computers. The subroutines for linear system error estimation calculate norms of residual vectors, matrix condition numbers, error bounds of solutions, and so on. The error estimation subroutines for Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. This user's manual contains a brief mathematical background of error analysis in linear algebra and the usage of the subroutines. (author)
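
    For a linear system, the quantities such a library reports can be sketched in a few lines: the residual norm, the condition number, and the classical relative-error bound ||dx||/||x|| <= cond(A) * ||r|| / (||A|| * ||x||). A NumPy illustration with an arbitrary complex system:

        import numpy as np

        A = np.array([[2 + 1j, 1], [1, 3 - 2j]])
        b = np.array([1 + 0j, 2])
        x = np.linalg.solve(A, b)

        r = b - A @ x                    # residual vector
        cond = np.linalg.cond(A)         # 2-norm condition number
        bound = cond * np.linalg.norm(r) / (np.linalg.norm(A, 2) * np.linalg.norm(x))
        print(np.linalg.norm(r), cond, bound)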

  10. Statistical analysis of electrical resistivity as a tool for estimating cement type of 12-year-old concrete specimens

    NARCIS (Netherlands)

    Polder, R.B.; Morales-Napoles, O.; Pacheco, J.

    2012-01-01

    Statistical tests on values of concrete resistivity can be used as a fast tool for estimating the cement type of old concrete. Electrical resistivity of concrete is a material property that describes the electrical resistance of concrete in a unit cell. Influences of binder type, water-to-binder

  11. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    Directory of Open Access Journals (Sweden)

    Demeter Lisa

    2010-05-01

    Full Text Available Abstract Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
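
    The regression idea behind the tool can be sketched simply: under exponential growth the log of the variant ratio is linear in time, and its slope estimates the net fitness difference. The data below are invented, and the tool itself additionally handles dilution factors and measurement-error models:

        import numpy as np

        t = np.array([0.0, 2.0, 4.0, 6.0, 8.0])       # days
        ratio = np.array([1.0, 1.8, 3.1, 5.6, 10.2])  # variant A : variant B

        # Slope of log(ratio) vs. time = growth-rate difference rA - rB,
        # using all time points rather than a two-point calculation.
        slope, intercept = np.polyfit(t, np.log(ratio), 1)
        print("fitness difference per day:", round(slope, 3))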

  12. Estimation of Power Production Potential from Natural Gas Pressure Reduction Stations in Pakistan Using ASPEN HYSYS

    Directory of Open Access Journals (Sweden)

    Imran Nazir Unar

    2015-07-01

    Full Text Available Pakistan is a gas rich but power poor country. It consumes approximately 1,559 billion cubic feet of natural gas annually. Gas is transported around the country in a system of pressurized transmission pipelines under a pressure range of 600-1000 psig, operated exclusively by two state-owned companies, i.e. SNGPL (Sui Northern Gas Pipelines Limited) and SSGCL (Sui Southern Gas Company Limited). The gas is distributed by reducing it from the transmission pressure to distribution pressures of at most 150 psig at the city gate stations, normally called SMS (Sales Metering Station). As a normal practice, gas pressure reduction at those SMSs is accomplished in pressure regulators (PCVs) or in throttle valves, where isenthalpic expansion takes place without producing any energy. The pressure potential of natural gas is an untapped energy resource which is currently wasted by throttling. This pressure reduction at the SMS (pressure drop through the SMS) may also be achieved by expansion of the natural gas in a turbo-expander (TE), which converts its pressure into mechanical energy that can be transmitted to any loading device, for example an electric generator. The aim of the present paper is to explore the expected power production potential of various Sales Metering Stations of the SSGCL company in Pakistan. A model of a sales metering station was developed in the standard flow-sheeting software Aspen HYSYS® 7.1 to calculate power and study other parameters when an expansion turbine is used instead of throttling valves. The simulation results showed that significant power (more than 140 kW) can be produced at pressure-reducing stations of the SSGC network with gas flows of more than 2.2 MMSCFD and pressure ratios greater than 1.3.
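
    For orientation only, an ideal-gas, constant-cp estimate of expander shaft power (not the HYSYS model; all numbers illustrative) can be written as:

        # W = eta * m_dot * cp * T1 * (1 - (P2/P1)^((gamma-1)/gamma))
        m_dot = 1.5      # kg/s, order of a few MMSCFD of natural gas
        cp = 2200.0      # J/(kg K), rough value for natural gas
        T1 = 288.0       # K, inlet temperature
        gamma = 1.3      # heat-capacity ratio
        pr = 4.0         # inlet/outlet pressure ratio
        eta = 0.75       # isentropic efficiency

        W = eta * m_dot * cp * T1 * (1.0 - pr ** (-(gamma - 1.0) / gamma))
        print(f"shaft power ~ {W / 1e3:.0f} kW")   # about 195 kW for these inputs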

  13. Estimation of power production potential from natural gas pressure reduction stations in Pakistan using ASPEN HYSYS

    International Nuclear Information System (INIS)

    Unar, I.N.; Aftab, A.

    2015-01-01

    Pakistan is a gas rich but power poor country. It consumes approximately 1,559 billion cubic feet of natural gas annually. Gas is transported around the country in a system of pressurized transmission pipelines under a pressure range of 600-1000 psig, operated exclusively by two state-owned companies, i.e. SNGPL (Sui Northern Gas Pipelines Limited) and SSGCL (Sui Southern Gas Company Limited). The gas is distributed by reducing it from the transmission pressure to distribution pressures of at most 150 psig at the city gate stations, normally called SMS (Sales Metering Station). As a normal practice, gas pressure reduction at those SMSs is accomplished in pressure regulators (PCVs) or in throttle valves, where isenthalpic expansion takes place without producing any energy. The pressure potential of natural gas is an untapped energy resource which is currently wasted by throttling. This pressure reduction at the SMS (pressure drop through the SMS) may also be achieved by expansion of the natural gas in a turbo-expander (TE), which converts its pressure into mechanical energy that can be transmitted to any loading device, for example an electric generator. The aim of the present paper is to explore the expected power production potential of various Sales Metering Stations of the SSGCL company in Pakistan. A model of a sales metering station was developed in the standard flow-sheeting software Aspen HYSYS® 7.1 to calculate power and study other parameters when an expansion turbine is used instead of throttling valves. The simulation results showed that significant power (more than 140 kW) can be produced at pressure-reducing stations of the SSGC network with gas flows of more than 2.2 MMSCFD and pressure ratios greater than 1.3. (author)

  14. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Directory of Open Access Journals (Sweden)

    S. A. Archfield

    2013-01-01

    Full Text Available Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  15. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Science.gov (United States)

    Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G.

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  16. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    Science.gov (United States)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation on a system level, particularly for early mission phases during which specifications and requirements are not yet crystallised, and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step to any program budget is a representative cost estimate which usually hinges on a particular estimation approach, or methodology. Therefore, appropriate selection of specific cost models, methods and tools is paramount, a difficult task given the highly variable nature, scope, and scientific and technical requirements applicable to each program. Numerous methods, models and tools exist. However, new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. For highly specific vehicles such as reusable launchers with a manned capability, the lack of historical data implies that both the classic heuristic approach, such as parametric cost estimation based on underlying CERs, and the analogy approach are, by definition, limited. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. Current approaches which strategically amalgamate various cost estimation strategies both for formulation and validation

  17. Quantitative genetic tools for insecticide resistance risk assessment: estimating the heritability of resistance

    Science.gov (United States)

    Michael J. Firko; Jane Leslie Hayes

    1990-01-01

    Quantitative genetic studies of resistance can provide estimates of genetic parameters not available with other types of genetic analyses. Three methods are discussed for estimating the amount of additive genetic variation in resistance to individual insecticides and subsequent estimation of heritability (h²) of resistance. Sibling analysis and...
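
    One classical estimator in this family is midparent-offspring regression, where the slope estimates narrow-sense heritability. A toy sketch with invented resistance phenotypes (e.g., log LD50 values):

        import numpy as np

        midparent = np.array([1.2, 1.5, 0.9, 1.8, 1.1, 1.6, 1.3, 1.0])
        offspring = np.array([1.1, 1.4, 1.0, 1.6, 1.0, 1.5, 1.2, 1.1])

        h2 = np.polyfit(midparent, offspring, 1)[0]  # slope = h2 estimate
        print("h2 estimate:", round(h2, 2))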

  18. U-AVLIS feed conversion using continuous metallothermic reduction of UF4: System description and cost estimate

    International Nuclear Information System (INIS)

    1994-04-01

    The purpose of this document is to present a system description and develop baseline capital and operating cost estimates for commercial facilities that produce U-Fe feedstock for AVLIS enrichment plants using the continuous fluoride reduction (CFR) process. These costs can then be used together with appropriate economic assumptions to calculate estimated unit costs to the AVLIS plant owner (or utility customer) for such conversion services. Six cases are examined. All cases assume that the conversion services are performed by a private company at a commercial site which has an existing NRC license to possess source material and which has existing uranium processing operations. The cases differ in terms of annual production capacity and whether the new process system is installed in a new building or in an existing building on the site. The six cases are summarized here

  19. U-AVLIS feed conversion using continuous metallothermic reduction of UF4: System description and cost estimate

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-01

    The purpose of this document is to present a system description and develop baseline capital and operating cost estimates for commercial facilities that produce U-Fe feedstock for AVLIS enrichment plants using the continuous fluoride reduction (CFR) process. These costs can then be used together with appropriate economic assumptions to calculate estimated unit costs to the AVLIS plant owner (or utility customer) for such conversion services. Six cases are examined. All cases assume that the conversion services are performed by a private company at a commercial site which has an existing NRC license to possess source material and which has existing uranium processing operations. The cases differ in terms of annual production capacity and whether the new process system is installed in a new building or in an existing building on the site. The six cases are summarized here.

  20. Estimating pediatric entrance skin dose from digital radiography examination using DICOM metadata: A quality assurance tool

    Energy Technology Data Exchange (ETDEWEB)

    Brady, S. L., E-mail: samuel.brady@stjude.org; Kaufman, R. A., E-mail: robert.kaufman@stjude.org [Department of Diagnostic Imaging, St. Jude Children’s Research Hospital, Memphis, Tennessee 38105 (United States)

    2015-05-15

    Purpose: To develop an automated methodology to estimate patient examination dose in digital radiography (DR) imaging using DICOM metadata as a quality assurance (QA) tool. Methods: Patient examination and demographical information were gathered from metadata analysis of DICOM header data. The x-ray system radiation output (i.e., air KERMA) was characterized for all filter combinations used for patient examinations. Average patient thicknesses were measured for head, chest, abdomen, knees, and hands using volumetric images from CT. Backscatter factors (BSFs) were calculated from examination kVp. Patient entrance skin air KERMA (ESAK) was calculated by (1) looking up examination technique factors taken from DICOM header metadata (i.e., kVp and mAs) to derive an air KERMA (k_air) value based on an x-ray characteristic radiation output curve; (2) scaling k_air with a BSF value; and (3) correcting k_air for patient thickness. Finally, patient entrance skin dose (ESD) was calculated by multiplying a mass–energy attenuation coefficient ratio by ESAK. Patient ESD calculations were computed for common DR examinations at our institution: dual view chest, anteroposterior (AP) abdomen, lateral (LAT) skull, dual view knee, and bone age (left hand only) examinations. Results: ESD was calculated for a total of 3794 patients; mean age was 11 ± 8 yr (range: 2 months to 55 yr). The mean ESD range was 0.19–0.42 mGy for dual view chest, 0.28–1.2 mGy for AP abdomen, 0.18–0.65 mGy for LAT view skull, 0.15–0.63 mGy for dual view knee, and 0.10–0.12 mGy for bone age (left hand) examinations. Conclusions: A methodology combining DICOM header metadata and basic x-ray tube characterization curves was demonstrated. In a regulatory era where patient dose reporting has become increasingly in demand, this methodology will allow a knowledgeable user the means to establish an automatable dose reporting program for DR and perform patient dose related QA testing for
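
    A minimal sketch of this pipeline with pydicom is shown below; output_curve, bsf, mu_en_ratio and thickness_corr stand for the site-specific characterization data described above and are assumptions, not pydicom features:

        import pydicom

        def estimate_esd(dcm_path, output_curve, bsf, mu_en_ratio, thickness_corr):
            """Entrance skin dose (mGy) from DICOM technique factors."""
            ds = pydicom.dcmread(dcm_path)
            kvp = float(ds.KVP)                  # tube potential
            mas = float(ds.Exposure)             # mAs, when the unit populates it
            k_air = output_curve(kvp) * mas      # air KERMA from the output curve
            esak = k_air * bsf(kvp) * thickness_corr
            return esak * mu_en_ratio            # ESD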

  1. Wind turbine noise reduction. An indicative cost estimation; Sanering windturbinegeluid. Een indicatieve raming van kosten

    Energy Technology Data Exchange (ETDEWEB)

    Verheijen, E.N.G.; Jabben, J.

    2011-11-15

    Since 1 January 2011, new rules apply to wind turbine noise. The rules include a different calculation method and different noise limits, intended for new wind turbines. In order to tackle noise annoyance from existing wind turbines, the government is considering setting up an abatement operation, for which a cost estimate is given in this study. At an abatement limit of 47 dB L_den (day-evening-night level), approximately 450 dwellings would be eligible for noise remediation. The costs of this operation are estimated at 4.9 million euro. However, in many of these cases the wind turbine is probably owned by the respective residents. It is possible that public funds for noise remediation will not be allocated to the owners of dwellings that directly profit from the turbines. If these cases are excluded, the abatement operation would cover 165 to 275 dwellings, with estimated remediation costs of 1.6 to 2.6 million euro. A tentative cost-benefit analysis suggests that noise remediation will be cost-effective in most situations. This means that the benefits of reduced annoyance or sleep disturbance are in balance with the cost of remediation. Only for the small group of wind turbines that have been in use for over fifteen years will remediation not be cost-effective. These wind turbines are nearing the end of their lifespan and are therefore ignored in the above estimates.

  2. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    Full Text Available The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
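
    The information measures named above can be computed directly from paired model realisations; a self-contained sketch, with synthetic realisations standing in for the geological simulations:

        import numpy as np
        from collections import Counter

        def entropy(labels):
            """Shannon entropy (bits) of a discrete sample."""
            n = len(labels)
            return -sum((c / n) * np.log2(c / n) for c in Counter(labels).values())

        def mutual_information(x, y):
            """I(X;Y) = H(X) + H(Y) - H(X,Y) over paired realisations."""
            return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

        rng = np.random.default_rng(0)
        layer = rng.integers(0, 2, 1000)            # uncertain unit: present/absent
        cell_a = layer                              # subregion inside the layer
        cell_b = layer ^ (rng.random(1000) < 0.2)   # correlated subregion, 20% noise
        print(mutual_information(cell_a, cell_b))   # shared information, in bits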

  3. Adaptation of the Tool to Estimate Patient Costs Questionnaire into Indonesian Context for Tuberculosis-affected Households.

    Science.gov (United States)

    Fuady, Ahmad; Houweling, Tanja A; Mansyur, Muchtaruddin; Richardus, Jan H

    2018-01-01

    Indonesia is the second-highest country for tuberculosis (TB) incidence worldwide. Hence, it urgently requires improvements and innovations beyond the strategies that are currently being implemented throughout the country. One fundamental step in monitoring its progress is by preparing a validated tool to measure total patient costs and catastrophic total costs. The World Health Organization (WHO) recommends using a version of the generic questionnaire that has been adapted to the local cultural context in order to interpret findings correctly. This study aimed to adapt the Tool to Estimate Patient Costs questionnaire into the Indonesian context, which measures total costs and catastrophic total costs for tuberculosis-affected households. The tool was adapted using best-practice guidelines. On the basis of a pre-test performed in a previous study (referred to as the Phase 1 Study), we refined the adaptation process by comparing it with the generic tool introduced by the WHO. We also held an expert committee review and performed pre-testing by interviewing 30 TB patients. After pre-testing, the tool was provided with complete explanation sheets for finalization. Seventy-two major changes were made during the adaptation process, including changing the answer choices to match the Indonesian context, refining the flow of questions, deleting questions, changing some words and restoring original questions that had been changed in the Phase 1 Study. Participants indicated that most questions were clear and easy to understand. To address recall difficulties by the participants, we made some adaptations to obtain data that might be missing, such as tracking data to medical records, developing a proxy of costs and guiding interviewers to ask for a specific value when participants were uncertain about the estimated market value of property they had sold. The adapted Tool to Estimate Patient Costs in Bahasa Indonesia is comprehensive and ready for use in future studies on TB

  4. Beyond Self-Report: Tools to Compare Estimated and Real-World Smartphone Use

    Science.gov (United States)

    Andrews, Sally; Ellis, David A.; Shaw, Heather; Piwek, Lukasz

    2015-01-01

    Psychologists typically rely on self-report data when quantifying mobile phone usage, despite little evidence of its validity. In this paper we explore the accuracy of using self-reported estimates when compared with actual smartphone use. We also include source code to process and visualise these data. We compared 23 participants’ actual smartphone use over a two-week period with self-reported estimates and the Mobile Phone Problem Use Scale. Our results indicate that estimated time spent using a smartphone may be an adequate measure of use, unless a greater resolution of data is required. Estimates concerning the number of times an individual used their phone across a typical day did not correlate with actual smartphone use. Neither estimated duration nor number of uses correlated with the Mobile Phone Problem Use Scale. We conclude that estimated smartphone use should be interpreted with caution in psychological research. PMID:26509895

  5. Model reduction and frequency residuals for a robust estimation of nonlinearities in subspace identification

    Science.gov (United States)

    De Filippis, G.; Noël, J. P.; Kerschen, G.; Soria, L.; Stephan, C.

    2017-09-01

    The introduction of the frequency-domain nonlinear subspace identification (FNSI) method in 2013 constitutes one in a series of recent attempts toward developing a realistic, first-generation framework applicable to complex structures. While this method showed promising capabilities when applied to academic structures, it is still confronted with a number of limitations that need to be addressed. In particular, the removal of nonphysical poles in the identified nonlinear models is a distinct challenge. In the present paper, it is proposed as a first contribution to operate directly on the identified state-space matrices to carry out spurious pole removal. A modal-space decomposition of the state and output matrices is examined to discriminate genuine from numerical poles, prior to estimating the extended input and feedthrough matrices. The final state-space model thus contains physical information only and naturally leads to nonlinear coefficients free of spurious variations. Besides spurious variations due to nonphysical poles, vibration modes lying outside the frequency band of interest may also produce drifts of the nonlinear coefficients. The second contribution of the paper is to include residual terms accounting for the existence of these modes. The proposed improved FNSI methodology is validated numerically and experimentally using a full-scale structure, the Morane-Saulnier Paris aircraft.

  6. A Decision Tool to Evaluate Budgeting Methodologies for Estimating Facility Recapitalization Requirements

    National Research Council Canada - National Science Library

    Hickman, Krista M

    2008-01-01

    .... Specifically, the thesis sought to answer an overarching research question addressing the importance of recapitalization and the best method to estimate the facility recapitalization budget using...

  7. Estimation of the Tool Condition by Applying the Wavelet Transform to Acoustic Emission Signals

    International Nuclear Information System (INIS)

    Gomez, M. P.; Piotrkowski, R.; Ruzzante, J. E.; D'Attellis, C. E.

    2007-01-01

    This work continues the search for parameters to evaluate the tool condition in machining processes. The selected sensing technique is acoustic emission, and it is applied to a turning process on steel samples. The obtained signals are studied using the wavelet transform. The tool wear level is quantified as a percentage of the final wear specified by the Standard ISO 3685. The amplitude and relevant scale obtained from the acoustic emission signals could be related to the wear level.

  8. PROBLEMS OF ICT-BASED TOOLS ESTIMATION IN THE CONTEXT OF INFORMATION SOCIETY FORMATION

    Directory of Open Access Journals (Sweden)

    M. Shyshkina

    2012-03-01

    Full Text Available The article describes problems in improving the quality of the implementation and use of e-learning tools, which arise as the quality and accessibility of education increase. It is determined that these issues are closely linked to specific scientific and methodological approaches to the evaluation of quality, selection and use of ICT-based tools, in view of the emergence of promising information technology platforms for the implementation and delivery of these resources.

  9. Leakage Detection and Estimation Algorithm for Loss Reduction in Water Piping Networks

    Directory of Open Access Journals (Sweden)

    Kazeem B. Adedeji

    2017-10-01

    Full Text Available Water loss through leaking pipes constitutes a major challenge to the operational service of water utilities. In recent years, increasing concern about the financial loss and environmental pollution caused by leaking pipes has been driving the development of efficient algorithms for detecting leakage in water piping networks. Water distribution networks (WDNs) are dispersed in nature, with numerous nodes and branches. Consequently, identifying the segment(s) of the network, and the exact leaking pipelines connected to the segment(s), where higher background leakage outflow occurs is a challenging task. Background leakage concerns the outflow from small cracks or deteriorated joints. In addition, because background leakages are diffuse flows, they are not characterised by a quick pressure drop and are not detectable by measuring instruments. Consequently, they go unreported for long periods of time, adding substantially to the volume of water lost. Most of the existing research focuses on the detection and localisation of burst-type leakages, which are characterised by a sudden pressure drop. In this work, an algorithm for detecting and estimating background leakage in water distribution networks is presented. The algorithm integrates a leakage model into a classical WDN hydraulic model for solving the network leakage flows. The applicability of the developed algorithm is demonstrated on two different water networks. The results of the tested networks are discussed and the solutions obtained show the benefits of the proposed algorithm. Notably, the algorithm permits the detection of critical segments or pipes of the network experiencing higher leakage outflow and indicates the probable pipes of the network where pressure control can be performed. However, the possible position of pressure control elements along such critical pipes will be addressed in future work.
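
    A common pressure-dependent formulation for background leakage is q = beta * L * p**alpha per pipe; the sketch below assumes this form (the paper's exact model and its hydraulic coupling may differ) to flag the segment with the highest leakage outflow:

        # pipe id: (length in m, mean pressure head in m, emitter coefficient beta)
        pipes = {
            "P1": (500.0, 35.0, 1e-7),
            "P2": (300.0, 55.0, 1e-7),
            "P3": (800.0, 60.0, 3e-7),
        }
        ALPHA = 1.18  # typical leakage exponent for mixed pipe materials

        def background_leakage(length_m, pressure_m, beta, alpha=ALPHA):
            return beta * length_m * pressure_m ** alpha  # outflow in m³/s

        leaks = {pid: background_leakage(*p) for pid, p in pipes.items()}
        critical = max(leaks, key=leaks.get)  # candidate pipe for pressure control
        print(leaks, critical)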

  10. SU-F-P-19: Fetal Dose Estimate for a High-Dose Fluoroscopy Guided Intervention Using Modern Data Tools

    Energy Technology Data Exchange (ETDEWEB)

    Moirano, J [University of Washington, Seattle, WA (United States)

    2016-06-15

    Purpose: An accurate dose estimate is necessary for effective patient management after a fetal exposure. In the case of a high-dose exposure, it is critical to use all resources available in order to make the most accurate assessment of the fetal dose. This work will demonstrate a methodology for accurate fetal dose estimation using tools that have recently become available in many clinics, and show examples of best practices for collecting data and performing the fetal dose calculation. Methods: A fetal dose estimate calculation was performed using modern data collection tools to determine parameters for the calculation. The reference point air kerma as displayed by the fluoroscopic system was checked for accuracy. A cumulative dose incidence map and DICOM header mining were used to determine the displayed reference point air kerma. Corrections for attenuation caused by the patient table and pad were measured and applied in order to determine the peak skin dose. The position and depth of the fetus were determined by ultrasound imaging and consultation with a radiologist. The data collected were used to determine a normalized uterus dose from Monte Carlo simulation data. Fetal dose values from this process were compared to other accepted calculation methods. Results: An accurate high-dose fetal dose estimate was made. Comparisons to accepted legacy methods were within 35% of estimated values. Conclusion: Modern data collection and reporting methods ease the process of estimating fetal dose from interventional fluoroscopy exposures. Many aspects of the calculation can now be quantified rather than estimated, which should allow for a more accurate estimation of fetal dose.
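
    The arithmetic chain described above reduces to a few multiplicative corrections; a sketch with wholly illustrative numbers (the transmission, geometry and conversion factors below are assumptions, not the paper's measurements):

        ref_air_kerma_mgy = 4200.0        # cumulative K_a,r read from the DICOM header
        table_pad_transmission = 0.75     # measured table + pad attenuation factor
        ref_to_skin_inverse_square = 1.2  # reference point to actual skin plane
        peak_skin_dose = ref_air_kerma_mgy * table_pad_transmission * ref_to_skin_inverse_square

        # Normalized uterus dose per unit entrance air kerma, looked up from
        # Monte Carlo tables for the fetal depth measured on ultrasound:
        uterus_dose_per_unit_kerma = 0.015
        print(f"fetal dose ≈ {peak_skin_dose * uterus_dose_per_unit_kerma:.0f} mGy")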

  11. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools

    DEFF Research Database (Denmark)

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei

    2017-01-01

    Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioriti...... to uncertainty and dramatically decreased model performance (R2 = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches....

  12. A model reduction approach for the variational estimation of vascular compliance by solving an inverse fluid–structure interaction problem

    International Nuclear Information System (INIS)

    Bertagna, Luca; Veneziani, Alessandro

    2014-01-01

    Scientific computing has progressively become an important tool for research in cardiovascular diseases. The role of quantitative analyses based on numerical simulations has moved from ‘proofs of concept’ to patient-specific investigations, thanks to a strong integration between imaging and computational tools. However, beyond individual geometries, numerical models require the knowledge of parameters that can hardly be retrieved from measurements, especially in vivo. For this reason, cardiovascular mathematics has recently considered data assimilation procedures for extracting the knowledge of patient-specific parameters from measures and images. In this paper, we consider specifically the quantification of vascular compliance, i.e. the parameter quantifying the tendency of arterial walls to deform under blood stress. Following up on a previous paper, where a variational data assimilation procedure was proposed, based on solving an inverse fluid–structure interaction problem, here we consider model reduction techniques based on a proper orthogonal decomposition approach to accomplish the solution of the inverse problem in a computationally efficient way. (paper)

  13. Children with developmental coordination disorder demonstrate a spatial mismatch when estimating coincident-timing ability with tools.

    Science.gov (United States)

    Caçola, Priscila; Ibana, Melvin; Ricard, Mark; Gabbard, Carl

    2016-01-01

    Coincident timing or interception ability can be defined as the capacity to precisely time sensory input and motor output. This study compared the accuracy of typically developing (TD) children and those with Developmental Coordination Disorder (DCD) on a task involving estimation of coincident timing with their arm and various tool lengths. Forty-eight (48) participants performed two experiments in which they imagined intercepting a target moving toward them (Experiment 1) and a target moving away from them (Experiment 2) in 5 conditions with their arm and different tool lengths: arm, 10, 20, 30, and 40 cm. In Experiment 1, the DCD group overestimated interception points approximately twice as much as the TD group, and both groups overestimated consistently regardless of the tool used. Results for Experiment 2 revealed that those with DCD underestimated about three times as much as the TD group, with the exception of when no tool was used. Overall, these results indicate that children with DCD are less accurate with estimation of coincident timing, which might in part explain their difficulties with common motor activities such as catching a ball or striking a baseball pitch. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Sediment traps as a new tool for estimation of longevity of planktonic foraminifera

    Digital Repository Service at National Institute of Oceanography (India)

    Nigam, R.

    Sediment trap technique provides time series data of sinking particles (faunal and sediment) from surface to bottom of the sea. Besides many other applications, data can also be used to estimate life span of planktonic foraminifera. Based on rearing...

  15. A tool for rapid post-hurricane urban tree debris estimates using high resolution aerial imagery

    Science.gov (United States)

    Zoltan Szantoi; Sparkle L Malone; Francisco Escobedo; Orlando Misas; Scot Smith; Bon Dewitt

    2012-01-01

    Coastal communities in the southeast United States have regularly experienced severe hurricane impacts. To better facilitate recovery efforts in these communities following natural disasters, state and federal agencies must respond quickly with information regarding the extent and severity of hurricane damage and the amount of tree debris volume. A tool was developed...

  16. PETrA : A Software-Based Tool for Estimating the Energy Profile of Android Applications

    NARCIS (Netherlands)

    Di Nucci, D.; Palomba, F.; Prota, Antonio; Panichella, A.; Zaidman, A.E.; De Lucia, Andrea

    2017-01-01

    Energy efficiency is a vital characteristic of any mobile application, and indeed is becoming an important factor for user satisfaction. For this reason, in recent years several approaches and tools for measuring the energy consumption of mobile devices have been proposed. Hardware-based solutions

  17. An Evaluation Tool for CONUS-Scale Estimates of Components of the Water Balance

    Science.gov (United States)

    Saxe, S.; Hay, L.; Farmer, W. H.; Markstrom, S. L.; Kiang, J. E.

    2016-12-01

    Numerous research groups are independently developing data products to represent various components of the water balance (e.g. runoff, evapotranspiration, recharge, snow water equivalent, soil moisture, and climate) at the scale of the conterminous United States. These data products are derived from a range of sources, including direct measurement, remotely-sensed measurement, and statistical and deterministic model simulations. An evaluation tool is needed to compare these data products and the components of the water balance they contain in order to identify the gaps in the understanding and representation of continental-scale hydrologic processes. An ideal tool will be an objective, universally agreed-upon framework to address questions related to closing the water balance. This type of generic, model-agnostic evaluation tool would facilitate collaboration amongst different hydrologic research groups and improve modeling capabilities with respect to continental-scale water resources. By adopting a comprehensive framework to consider hydrologic modeling in the context of a complete water balance, it is possible to identify weaknesses in process modeling, data product representation and regional hydrologic variation. As part of its National Water Census initiative, the U.S. Geological Survey is facilitating this dialogue by developing prototype evaluation tools.

  18. Structural correlation method for model reduction and practical estimation of patient specific parameters illustrated on heart rate regulation

    DEFF Research Database (Denmark)

    Ottesen, Johnny T.; Mehlsen, Jesper; Olufsen, Mette

    2014-01-01

    We consider the inverse and patient specific problem of short term (seconds to minutes) heart rate regulation specified by a system of nonlinear ODEs and corresponding data. We show how a recent method termed the structural correlation method (SCM) can be used for model reduction and for obtaining...... a set of practically identifiable parameters. The structural correlation method includes two steps: sensitivity and correlation analysis. When combined with an optimization step, it is possible to estimate model parameters, enabling the model to fit dynamics observed in data. This method is illustrated...... in detail on a model predicting baroreflex regulation of heart rate and applied to analysis of data from a rat and healthy humans. Numerous mathematical models have been proposed for prediction of baroreflex regulation of heart rate, yet most of these have been designed to provide qualitative predictions...

  19. Basis of Estimate Software Tool (BEST) - a practical solution to part of the cost and schedule integration puzzle

    International Nuclear Information System (INIS)

    Murphy, L.; Bain, P.

    1997-01-01

    The Basis of Estimate Software Tool (BEST) was developed at the Rocky Flats Environmental Technology Site (Rocky Flats) to bridge the gap that exists in conventional project control systems between scheduled activities, their allocated or assigned resources, and the set of assumptions (basis of estimate) that correlate resources and activities. Having a documented and auditable basis of estimate (BOE) is necessary for budget validation, work scope analysis, change control, and a number of related management control functions. The uniqueness of BEST is demonstrated by the manner in which it responds to the diverse needs of the heavily regulated environmental workplace - containing many features not found in conventional off-the-shelf software products. However, even companies dealing in relatively unregulated work places will find many attractive features in BEST. This product will be of particular interest to current Government contractors and contractors preparing proposals that may require subsequent validation. 2 figs

  20. Side-by-side ANFIS as a useful tool for estimating correlated thermophysical properties

    Science.gov (United States)

    Grieu, Stéphane; Faugeroux, Olivier; Traoré, Adama; Claudet, Bernard; Bodnar, Jean-Luc

    2015-12-01

    In the present paper, an artificial intelligence-based approach dealing with the estimation of correlated thermophysical properties is designed and evaluated. This new and "intelligent" approach makes use of photothermal responses obtained when homogeneous materials are subjected to a light flux. Commonly, gradient-based algorithms are used as parameter estimation techniques. Unfortunately, such algorithms show instabilities leading to non-convergence when correlated properties are to be estimated from a rebuilt impulse response. So, the main objective of the present work was to simultaneously estimate both the thermal diffusivity and conductivity of homogeneous materials, from front-face or rear-face photothermal responses to pseudo random binary signals. To this end, we used side-by-side neuro-fuzzy systems (adaptive network-based fuzzy inference systems) trained with a hybrid algorithm. We focused on the impact on generalization of both the examples used during training and the fuzzification process. In addition, computation time was a key point to consider. That is why the developed algorithm is computationally tractable and allows both the thermal diffusivity and conductivity of homogeneous materials to be simultaneously estimated with very good accuracy (the generalization error ranges between 4.6% and 6.2%).

  1. Estimation of tool wear length in finish milling using a fuzzy inference algorithm

    Science.gov (United States)

    Ko, Tae Jo; Cho, Dong Woo

    1993-10-01

    The geometric accuracy and surface roughness are mainly affected by the flank wear at the minor cutting edge in finish machining. A fuzzy estimator obtained by a fuzzy inference algorithm with a max-min composition rule to evaluate the minor flank wear length in finish milling is introduced. The features sensitive to minor flank wear are extracted from the dispersion analysis of a time series AR model of the feed directional acceleration of the spindle housing. Linguistic rules for fuzzy estimation are constructed using these features, and then fuzzy inferences are carried out with test data sets under various cutting conditions. The proposed system turns out to be effective for estimating minor flank wear length, and its mean error is less than 12%.
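
    A toy version of such a max-min fuzzy estimator, with one input feature and two rules (the membership functions and rule base are invented for illustration):

        import numpy as np

        wear = np.linspace(0, 100, 101)  # minor flank wear, % of the ISO 3685 limit

        def tri(x, a, b, c):
            """Triangular membership function on support [a, c] with peak at b."""
            return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                         (c - x) / (c - b + 1e-9)), 0.0)

        def estimate_wear(feature):  # dispersion feature scaled to [0, 1]
            fire_low, fire_high = 1.0 - feature, feature       # rule firing strengths
            small, large = tri(wear, 0, 20, 50), tri(wear, 50, 80, 100)
            # max-min composition: clip each consequent, aggregate with max
            agg = np.maximum(np.minimum(small, fire_low), np.minimum(large, fire_high))
            return (wear * agg).sum() / agg.sum()              # centroid defuzzification

        print(estimate_wear(0.7))  # estimated wear, % of final wear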

  2. Scientific and practical tools for dealing with water resource estimations for the future

    Directory of Open Access Journals (Sweden)

    D. A. Hughes

    2015-06-01

    Full Text Available Future flow regimes will be different from today's, and imperfect knowledge of present and future climate variations, rainfall–runoff processes and anthropogenic impacts makes them highly uncertain. Future water resources decisions will rely on practical and appropriate simulation tools that are sensitive to changes, can assimilate different types of change information and are flexible enough to accommodate improvements in the understanding of change. They need to include representations of uncertainty and generate information appropriate for uncertain decision-making. This paper presents some examples of the tools that have been developed to address these issues in the southern Africa region. The examples include uncertainty in present-day simulations due to lack of understanding and data, using climate change projection data from multiple climate models, and future catchment responses due to both climate and development effects. The conclusions are that the tools and models are largely available, and that what we need is more reliable forcing and model evaluation information, as well as methods of making decisions with such inevitably uncertain information.

  3. ForestCrowns: a transparency estimation tool for digital photographs of forest canopies

    Science.gov (United States)

    Matthew Winn; Jeff Palmer; S.-M. Lee; Philip Araman

    2016-01-01

    ForestCrowns is a Windows®-based computer program that calculates forest canopy transparency (light transmittance) using ground-based digital photographs taken with standard or hemispherical camera lenses. The software can be used by forest managers and researchers to monitor growth/decline of forest canopies; provide input for leaf area index estimation; measure light...

  4. A simple tool for estimating throughfall nitrogen deposition in forests of western North America using lichens

    Science.gov (United States)

    Heather T. Root; Linda H. Geiser; Mark E. Fenn; Sarah Jovan; Martin A. Hutten; Suraj Ahuja; Karen Dillman; David Schirokauer; Shanti Berryman; Jill A. McMurray

    2013-01-01

    Anthropogenic nitrogen (N) deposition has had substantial impacts on forests of North America. Managers seek to monitor deposition to identify areas of concern and establish critical loads, which define the amount of deposition that can be tolerated by ecosystems without causing substantial harm. We present a new monitoring approach that estimates throughfall inorganic...

  5. qpure: A tool to estimate tumor cellularity from genome-wide single-nucleotide polymorphism profiles.

    Directory of Open Access Journals (Sweden)

    Sarah Song

    Full Text Available Tumour cellularity, the relative proportion of tumour and normal cells in a sample, affects the sensitivity of mutation detection, copy number analysis, cancer gene expression and methylation profiling. Tumour cellularity is traditionally estimated by pathological review of sectioned specimens; however this method is both subjective and prone to error due to heterogeneity within lesions and cellularity differences between the sample viewed during pathological review and tissue used for research purposes. In this paper we describe a statistical model to estimate tumour cellularity from SNP array profiles of paired tumour and normal samples using shifts in SNP allele frequency at regions of loss of heterozygosity (LOH) in the tumour. We also provide qpure, a software implementation of the method. Our experiments showed that there is a medium correlation of 0.42 (p-value = 0.0001) between tumour cellularity estimated by qpure and pathology review. Interestingly, there is a high correlation of 0.87 (p-value < 2.2e-16) between cellularity estimates by qpure and deep Ion Torrent sequencing of known somatic KRAS mutations; and a weaker correlation of 0.32 (p-value = 0.004) between Ion Torrent sequencing and pathology review. This suggests that qpure may be a more accurate predictor of tumour cellularity than pathology review. qpure can be downloaded from https://sourceforge.net/projects/qpure/.
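
    Under a simple purity model for an LOH region (an assumption made here for illustration, not necessarily qpure's full statistical model), the observed B-allele frequency of the lost allele is b = (1 - p)/(2 - p) for tumour purity p, which inverts to p = (1 - 2b)/(1 - b):

        import numpy as np

        def purity_from_baf(baf_in_loh):
            """Invert b = (1 - p) / (2 - p), with b the mean BAF of the lost allele."""
            b = np.clip(np.mean(baf_in_loh), 1e-6, 0.5)
            return (1.0 - 2.0 * b) / (1.0 - b)

        # Heterozygous SNPs in an LOH region of a 70%-pure tumour: b = 0.3/1.3 ≈ 0.231
        rng = np.random.default_rng(1)
        baf = rng.normal(0.231, 0.02, 200)
        print(f"estimated cellularity: {purity_from_baf(baf):.2f}")  # ≈ 0.70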

  6. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    Directory of Open Access Journals (Sweden)

    Saeed Alexander I

    2008-12-01

    Full Text Available Abstract Background Mass spectrometry (MS) based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection, resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called the Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a
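
    The core APEX correction amounts to dividing each protein's observed spectral count by its expected count per molecule (Oi) and renormalising; a minimal sketch with invented counts:

        observed_counts = {"protA": 120, "protB": 45, "protC": 300}
        o_values = {"protA": 4.0, "protB": 1.5, "protC": 10.0}  # predicted detectable peptides

        corrected = {p: observed_counts[p] / o_values[p] for p in observed_counts}
        total = sum(corrected.values())
        apex = {p: c / total for p, c in corrected.items()}  # relative abundances
        print(apex)  # equal abundances here: the Oi correction removes detection bias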

  7. Adaptation of the Tool to Estimate Patient Costs Questionnaire into Indonesian Context for Tuberculosis-affected Households

    Directory of Open Access Journals (Sweden)

    Ahmad Fuady

    2018-04-01

    Full Text Available Background: Indonesia is the second-highest country for tuberculosis (TB) incidence worldwide. Hence, it urgently requires improvements and innovations beyond the strategies that are currently being implemented throughout the country. One fundamental step in monitoring its progress is by preparing a validated tool to measure total patient costs and catastrophic total costs. The World Health Organization (WHO) recommends using a version of the generic questionnaire that has been adapted to the local cultural context in order to interpret findings correctly. This study aimed to adapt the Tool to Estimate Patient Costs questionnaire into the Indonesian context, which measures total costs and catastrophic total costs for tuberculosis-affected households. Methods: The tool was adapted using best-practice guidelines. On the basis of a pre-test performed in a previous study (referred to as the Phase 1 Study), we refined the adaptation process by comparing it with the generic tool introduced by the WHO. We also held an expert committee review and performed pre-testing by interviewing 30 TB patients. After pre-testing, the tool was provided with complete explanation sheets for finalization. Results: Seventy-two major changes were made during the adaptation process, including changing the answer choices to match the Indonesian context, refining the flow of questions, deleting questions, changing some words and restoring original questions that had been changed in the Phase 1 Study. Participants indicated that most questions were clear and easy to understand. To address recall difficulties by the participants, we made some adaptations to obtain data that might be missing, such as tracking data to medical records, developing a proxy of costs and guiding interviewers to ask for a specific value when participants were uncertain about the estimated market value of property they had sold. Conclusion: The adapted Tool to Estimate Patient Costs in Bahasa Indonesia is

  8. Estimate of the benefits of a population-based reduction in dietary sodium additives on hypertension and its related health care costs in Canada.

    Science.gov (United States)

    Joffres, Michel R; Campbell, Norm R C; Manns, Braden; Tu, Karen

    2007-05-01

    Hypertension is the leading risk factor for mortality worldwide. One-quarter of the adult Canadian population has hypertension, and more than 90% of the population is estimated to develop hypertension if they live an average lifespan. Reductions in dietary sodium additives significantly lower systolic and diastolic blood pressure, and population reductions in dietary sodium are recommended by major scientific and public health organizations. To estimate the reduction in hypertension prevalence and specific hypertension management cost savings associated with a population-wide reduction in dietary sodium additives. Based on data from clinical trials, reducing dietary sodium additives by 1840 mg/day would result in a decrease of 5.06 mmHg (systolic) and 2.7 mmHg (diastolic) blood pressures. Using Canadian Heart Health Survey data, the resulting reduction in hypertension was estimated. Costs of laboratory testing and physician visits were based on 2001 to 2003 Ontario Health Insurance Plan data, and the number of physician visits and costs of medications for patients with hypertension were taken from 2003 IMS Canada. To estimate the reduction in total physician visits and laboratory costs, current estimates of aware hypertensive patients in Canada were used from the Canadian Community Health Survey. Reducing dietary sodium additives may decrease hypertension prevalence by 30%, resulting in one million fewer hypertensive patients in Canada, and almost double the treatment and control rate. Direct cost savings related to fewer physician visits, laboratory tests and lower medication use are estimated to be approximately $430 million per year. Physician visits and laboratory costs would decrease by 6.5%, and 23% fewer treated hypertensive patients would require medications for control of blood pressure. Based on these estimates, lowering dietary sodium additives would lead to a large reduction in hypertension prevalence and result in health care cost savings in Canada.

  9. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools.

    Science.gov (United States)

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei; Jolliet, Olivier

    2017-11-01

    Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioritization, which require rapid computation of accurate estimates for diverse scenarios. To fulfil this need, we develop an accurate and rapid (high-throughput) model that estimates the fraction of organic chemicals migrating from polymeric packaging materials into foods. Several hundred step-wise simulations optimised the model coefficients to cover a range of user-defined scenarios (e.g. temperature). The developed model, operationalised in a spreadsheet for future dissemination, nearly instantaneously estimates chemical migration, and has improved performance over commonly used model simplifications. When using measured diffusion coefficients, the model accurately predicted (R² = 0.9, standard error (Se) = 0.5) hundreds of empirical data points for various scenarios. Diffusion coefficient modelling, which determines the speed of chemical transfer from package to food, was a major contributor to uncertainty and dramatically decreased model performance (R² = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.
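
    For orientation, the speed of migration in such models is set by the polymer diffusion coefficient; a short-time, one-sided diffusion approximation (Crank's slab solution, a simplification rather than the authors' optimised model) gives the migrated fraction:

        import math

        def migrated_fraction(d_cm2_s, t_s, thickness_cm):
            """m/m_inf ≈ (2/L) * sqrt(D*t/pi) for one-sided migration, capped at 1."""
            return min((2.0 / thickness_cm) * math.sqrt(d_cm2_s * t_s / math.pi), 1.0)

        # e.g. 10-day storage, 50 µm film, assumed D ≈ 1e-12 cm²/s
        print(migrated_fraction(1e-12, 10 * 86400, 0.005))  # ≈ 0.21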

  10. A simple tool for estimating city-wide annual electrical energy savings from cooler surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Pomerantz, Melvin; Rosado, Pablo J.; Levinson, Ronnen

    2015-12-01

    We present a simple method to estimate the maximum possible electrical energy saving that might be achieved by increasing the albedo of surfaces in a large city. We restrict this to the “indirect effect”, the cooling of outside air that lessens the demand for air conditioning (AC). Given the power demand of the electric utilities and data about the city, we can use a single linear equation to estimate the maximum savings. For example, the result for an albedo change of 0.2 of pavements in a typical warm city in California, such as Sacramento, is that the saving is less than about 2 kWh per m2 per year. This may help decision makers choose which heat island mitigation techniques are economical from an energy-saving perspective.
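
    The abstract's Sacramento example implies a linear slope of roughly 10 kWh per m² per year per unit albedo change; a back-of-envelope city-wide ceiling under an assumed (hypothetical) modified pavement area:

        slope_kwh_per_m2_yr = 10.0   # from "< 2 kWh/m²/yr for an albedo change of 0.2"
        delta_albedo = 0.2
        pavement_area_m2 = 50e6      # hypothetical area of modified pavement

        max_saving_gwh = slope_kwh_per_m2_yr * delta_albedo * pavement_area_m2 / 1e9
        print(f"upper-bound saving: {max_saving_gwh:.0f} GWh/yr")  # ≈ 100 GWh/yr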

  11. Is The Ca + K + Mg/Al Ratio in the Soil Solution a Predictive Tool for Estimating Forest Damage?

    International Nuclear Information System (INIS)

    Goeransson, A.; Eldhuset, T. D.

    2001-01-01

    The ratio between (Ca + K + Mg) and Al in nutrient solution has been suggested as a predictive tool for estimating tree growth disturbance. However, the ratio is unspecific in the sense that it is based on several elements which are all essential for plant growth; each of these may be growth-limiting. Furthermore, aluminium retards growth at higher concentrations. It is therefore difficult to give causal and objective biological explanations for possible growth disturbances. The importance of the proportion of base cations to N, at a fixed base-cation/Al ratio, is evaluated with regard to the growth of Picea abies. The uptake of elements was found to be selective; nutrients were taken up while most Al remained in solution. Biomass partitioning to the roots increased after aluminium addition with low proportions of base cations to nitrogen. We conclude that the low growth rates depend on nutrient limitation in these treatments. Low growth rates in the high-proportion experiments may be explained by high internal Al concentrations. The results strongly suggest that growth rate is not correlated with the ratio in the rooting medium and question the validity of using ratios as predictive tools for estimating forest damage. We suggest that growth limitation of Picea abies in the field may depend on low proportions of base cations to nitrate. It is therefore important to know the nutritional status of the plant material in relation to the growth potential and environmental limitation to be able to predict and estimate forest damage
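
    For reference, the ratio under discussion is a molar one; computing it from solution concentrations is a one-liner (the concentrations below are illustrative):

        conc_mg_per_l = {"Ca": 2.0, "K": 1.5, "Mg": 0.6, "Al": 0.9}
        molar_mass = {"Ca": 40.08, "K": 39.10, "Mg": 24.31, "Al": 26.98}

        mol = {e: conc_mg_per_l[e] / molar_mass[e] for e in conc_mg_per_l}
        ratio = (mol["Ca"] + mol["K"] + mol["Mg"]) / mol["Al"]
        print(f"(Ca + K + Mg)/Al molar ratio: {ratio:.2f}")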

  12. A Remote-Sensing Driven Tool for Estimating Crop Stress and Yields

    Directory of Open Access Journals (Sweden)

    Martha C. Anderson

    2013-07-01

    Full Text Available Biophysical crop simulation models are normally forced with precipitation data recorded with either gauges or ground-based radar. However, ground-based recording networks are not available at spatial and temporal scales needed to drive the models at many critical places on earth. An alternative would be to employ satellite-based observations of either precipitation or soil moisture. Satellite observations of precipitation are currently not considered capable of forcing the models with sufficient accuracy for crop yield predictions. However, deduction of soil moisture from space-based platforms is in a more advanced state than are precipitation estimates so that these data may be capable of forcing the models with better accuracy. In this study, a mature two-source energy balance model, the Atmosphere Land Exchange Inverse (ALEXI model, was used to deduce root zone soil moisture for an area of North Alabama, USA. The soil moisture estimates were used in turn to force the state-of-the-art Decision Support System for Agrotechnology Transfer (DSSAT crop simulation model. The study area consisted of a mixture of rainfed and irrigated cornfields. The results indicate that the model forced with the ALEXI moisture estimates produced yield simulations that compared favorably with observed yields and with the rainfed model. The data appear to indicate that the ALEXI model did detect the soil moisture signal from the mixed rainfed/irrigation corn fields and this signal was of sufficient strength to produce adequate simulations of recorded yields over a 10 year period.

  13. Two NextGen Air Safety Tools: An ADS-B Equipped UAV and a Wake Turbulence Estimator

    Science.gov (United States)

    Handley, Ward A.

    Two air safety tools are developed in the context of the FAA's NextGen program. The first tool addresses the alarming increase in the frequency of near-collisions between manned and unmanned aircraft by equipping a common hobby class UAV with an ADS-B transponder that broadcasts its position, speed, heading and unique identification number to all local air traffic. The second tool estimates and outputs the location of dangerous wake vortex corridors in real time based on the ADS-B data collected and processed using a custom software package developed for this project. The TRansponder based Position Information System (TRAPIS) consists of data packet decoders, an aircraft database, Graphical User Interface (GUI) and the wake vortex extension application. Output from TRAPIS can be visualized in Google Earth and alleviates the problem of pilots being left to imagine where invisible wake vortex corridors are based solely on intuition or verbal warnings from ATC. The result of these two tools is the increased situational awareness, and hence safety, of human pilots in the National Airspace System (NAS).

  14. A software tool to estimate the dynamic behaviour of the IP2C samples as sensors for didactic purposes

    International Nuclear Information System (INIS)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E.

    2010-01-01

    Ionic Polymer Polymer Composites (IP2Cs) are emerging materials used to realize motion actuators and sensors. In the former case, a voltage input is able to cause the membrane to bend, while in the latter case, by bending an IP2C membrane, a voltage output is obtained. In this paper, the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP2Cs working in air. In the proposed tool, geometrical quantities that rule the sensing properties of IP2C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to give a useful tool that allows the user to understand the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP2C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers towards this new class of transducers; moreover, it can support the educational activity of students involved in advanced academic courses.

  15. A new tool for quality of multimedia estimation based on network behaviour

    Directory of Open Access Journals (Sweden)

    Jaroslav Frnda

    2016-03-01

    Full Text Available In this paper, we present a software tool capable of predicting the final quality of triple play services using the most common assessment metrics. Carrying multimedia traffic without the excessive delays and losses that degrade quality as perceived by end users is a growing concern for internet service providers. The prediction model is based on results obtained from many testing scenarios simulating real network behavior. Based on the proposed model, speech or video quality is calculated with regard to the policies applied for packet processing by routers and to the level of total network utilization. The application can not only predict QoS parameters but also generate the source code of a particular QoS policy setting according to user interaction and apply the policy to the routers in the network. The contribution of this work is a new software tool that enables network administrators and designers to improve and optimize network traffic efficiently.

  16. Method for Friction Force Estimation on the Flank of Cutting Tools

    Directory of Open Access Journals (Sweden)

    Luis Huerta

    2017-01-01

    Full Text Available Friction forces are present in any machining process. These forces could play an important role in the dynamics of the system. In the cutting process, friction is mainly present in the rake face and the flank of the tool. Although the one that acts on the rake face has a major influence, the other one can become also important and could take part in the stability of the system. In this work, experimental identification of the friction on the flank is presented. The experimental determination was carried out by machining aluminum samples in a CNC lathe. As a result, two friction functions were obtained as a function of the cutting speed and the relative motion of the contact elements. Experiments using a worn and a new insert were carried out. Force and acceleration were recorded simultaneously and, from these results, different friction levels were observed depending on the cutting parameters, such as cutting speed, feed rate, and tool condition. Finally, a friction model for the flank friction is presented.

  17. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    Science.gov (United States)

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
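
    The Latin Hypercube Sampling that REPTool implements can be sketched in a few lines; the toy "geospatial model" below (a linear cell operation) and all error magnitudes are stand-ins, not REPTool's interface:

        import numpy as np
        from scipy.stats import norm

        def lhs_normal(n, mean, sd, rng):
            """Latin hypercube sample of a normal error: one draw per equal-probability
            stratum of the CDF, returned in shuffled order."""
            strata = (np.arange(n) + rng.random(n)) / n
            rng.shuffle(strata)
            return norm.ppf(strata, loc=mean, scale=sd)

        rng = np.random.default_rng(42)
        n = 1000
        cell = 5.0 + lhs_normal(n, 0.0, 0.5, rng)  # raster cell value + invariant error
        coeff = lhs_normal(n, 2.0, 0.1, rng)       # uncertain model coefficient
        output = coeff * cell + 1.0                # toy raster operation
        print(output.mean(), output.std())         # distribution of the model output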

  18. An Accurate Computational Tool for Performance Estimation of FSO Communication Links over Weak to Strong Atmospheric Turbulent Channels

    Directory of Open Access Journals (Sweden)

    Theodore D. Katsilieris

    2017-03-01

    Full Text Available Terrestrial optical wireless communication links have attracted significant research and commercial interest worldwide over the last few years, due to the fact that they offer very high and secure data rate transmission with relatively low installation and operational costs, and without need of licensing. However, since the propagation path of the information signal, i.e., the laser beam, is the atmosphere, their effectiveness is strongly affected by the atmospheric conditions in the specific area. Thus, system performance depends significantly on rain, fog, hail, atmospheric turbulence, etc. Due to the influence of these effects, it is necessary to study such a communication system very carefully, theoretically and numerically, before its installation. In this work, we present exact and accurately approximated mathematical expressions for the estimation of the average capacity and the outage probability performance metrics, as functions of the link's parameters, the transmitted power, the attenuation due to fog, the ambient noise and the atmospheric turbulence phenomenon. The latter causes the scintillation effect, which results in random and fast fluctuations of the irradiance at the receiver's end. These fluctuations can be studied accurately with statistical methods. Thus, in this work, we use either the lognormal or the gamma–gamma distribution for weak or moderate to strong turbulence conditions, respectively. Moreover, using the derived mathematical expressions, we design, accomplish and present a computational tool for the estimation of these systems' performances, while also taking into account the parameters of the link and the atmospheric conditions. Furthermore, in order to increase the accuracy of the presented tool, for the cases where the obtained analytical mathematical expressions are complex, the performance results are verified with the numerical estimation of the appropriate integrals. Finally, using
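
    For the weak-turbulence (lognormal) case, the outage probability has a closed form; a sketch with the irradiance normalised to unit mean and an assumed log-amplitude standard deviation (the gamma-gamma case is omitted here):

        import math

        def outage_probability_lognormal(margin_db, sigma_x):
            """P(I < I_th) for lognormal irradiance with E[I] = 1; I = exp(2X),
            X ~ N(-sigma_x**2, sigma_x**2), and I_th a margin below the mean."""
            i_th = 10 ** (-margin_db / 10.0)
            z = (0.5 * math.log(i_th) + sigma_x ** 2) / sigma_x
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

        print(outage_probability_lognormal(margin_db=10, sigma_x=0.3))  # ≈ 2e-4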

  19. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    Science.gov (United States)

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents methods for the definition of important analytical tools, such as the development of sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluate the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanols content were investigated. The juices and wines produced using different protocols were examined. Moreover, wines aged in tanks for 1, 2 and 3 months were analysed. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need for exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, the evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of cysteine and glutathione conjugates was carried out and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. This supports the theory

  20. Dynamics of the oral microbiota as a tool to estimate time since death.

    Science.gov (United States)

    Adserias-Garriga, J; Quijada, N M; Hernandez, M; Rodríguez Lázaro, D; Steadman, D; Garcia-Gil, L J

    2017-06-27

    The oral cavity harbors one of the most diverse microbiomes in the human body. It has been shown to be the second most complex in the body after the gastrointestinal tract. Upon death, the indigenous microorganisms lead to the decomposition of the carcass. Therefore, the oral cavity and gastrointestinal tract microbiomes play a key role in human decomposition. The aim of the present study is to monitor the microbiome of decaying bodies on a daily basis and to identify signature bacterial taxa that can improve postmortem interval estimation. Three individuals (one male and two females) donated to the University of Tennessee Forensic Anthropology Center for the W.M. Bass Donated Skeletal Collection were studied. Oral swab samples were taken daily throughout the different stages of cadaveric putrefaction. DNA was extracted and analyzed by next-generation sequencing techniques. The three cadavers showed similar overall successional changes during the decomposition process. Firmicutes and Actinobacteria are the predominant phyla in the fresh stage. The presence of Tenericutes corresponds to the bloat stage. Firmicutes is the predominant phylum in advanced decay, but the Firmicutes community is a different one from the predominant Firmicutes of the fresh stage. This study depicts the thanatomicrobiome successional changes in the oral cavity, and highlights its potential use in forensic cases as a quantitative and objective approach to estimating postmortem interval, from an ecological rationale. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Effective Dysphonia Detection Using Feature Dimension Reduction and Kernel Density Estimation for Patients with Parkinson’s Disease

    Science.gov (United States)

    Yang, Shanshan; Zheng, Fang; Luo, Xin; Cai, Suxian; Wu, Yunfeng; Liu, Kaizhi; Wu, Meihong; Chen, Jian; Krishnan, Sridhar

    2014-01-01

    Detection of dysphonia is useful for monitoring the progression of phonatory impairment for patients with Parkinson’s disease (PD), and also helps assess the disease severity. This paper describes the statistical pattern analysis methods to study different vocal measurements of sustained phonations. The feature dimension reduction procedure was implemented by using the sequential forward selection (SFS) and kernel principal component analysis (KPCA) methods. Four selected vocal measures were projected by the KPCA onto the bivariate feature space, in which the class-conditional feature densities can be approximated with the nonparametric kernel density estimation technique. In the vocal pattern classification experiments, Fisher’s linear discriminant analysis (FLDA) was applied to perform the linear classification of voice records for healthy control subjects and PD patients, and the maximum a posteriori (MAP) decision rule and support vector machine (SVM) with radial basis function kernels were employed for the nonlinear classification tasks. Based on the KPCA-mapped feature densities, the MAP classifier successfully distinguished 91.8% voice records, with a sensitivity rate of 0.986, a specificity rate of 0.708, and an area value of 0.94 under the receiver operating characteristic (ROC) curve. The diagnostic performance provided by the MAP classifier was superior to those of the FLDA and SVM classifiers. In addition, the classification results indicated that gender is insensitive to dysphonia detection, and the sustained phonations of PD patients with minimal functional disability are more difficult to be correctly identified. PMID:24586406
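
    The KPCA-plus-KDE pipeline with a MAP rule is straightforward to reproduce on stand-in data (the paper's vocal measures are replaced here by a synthetic classification set):

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.decomposition import KernelPCA
        from sklearn.neighbors import KernelDensity

        X, y = make_classification(n_samples=200, n_features=4, n_informative=4,
                                   n_redundant=0, random_state=0)
        X2 = KernelPCA(n_components=2, kernel="rbf").fit_transform(X)  # feature mapping

        # Class-conditional densities by nonparametric KDE, combined with priors (MAP)
        kdes = {c: KernelDensity(bandwidth=0.5).fit(X2[y == c]) for c in (0, 1)}
        priors = {c: float(np.mean(y == c)) for c in (0, 1)}

        def map_predict(points):
            scores = np.stack([kdes[c].score_samples(points) + np.log(priors[c])
                               for c in (0, 1)])
            return scores.argmax(axis=0)

        print((map_predict(X2) == y).mean())  # resubstitution accuracy of the MAP rule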

  2. The use of activity-based cost estimation as a management tool for cultural change

    Science.gov (United States)

    Mandell, Humboldt; Bilby, Curt

    1991-01-01

    It will be shown that the greatest barrier to American exploration of the planet Mars is not the development of the technology needed to deliver humans and return them safely to earth. Neither is it the cost of such an undertaking, as has been previously suggested, although certainly, such a venture may not be inexpensive by some measures. The predicted costs of exploration have discouraged serious political dialog on the subject. And, in fact, even optimistic projections of the NASA budget do not contain the resources required, under the existing development and management paradigm, for human space exploration programs. It will be demonstrated that the perception of the costs of such a venture, and the cultural responses to the perceptions are factors inhibiting American exploration of the moon and the planet Mars. Cost models employed in the aerospace industry today correctly mirror the history of past space programs, and as such, are representative of the existing management and development paradigms. However, if, under this current paradigm no major exploration programs are feasible, then cost analysis methods based in the past may not have great utility in exploring the needed cultural changes. This paper explores the use of a new type of model, the activity based cost model, which will treat management style as an input variable, in a sense providing a tool whereby a complete, affordable program might be designed, including both the technological and management aspects.

  3. Estimating Longitudinal Risks and Benefits From Cardiovascular Preventive Therapies Among Medicare Patients: The Million Hearts Longitudinal ASCVD Risk Assessment Tool: A Special Report From the American Heart Association and American College of Cardiology.

    Science.gov (United States)

    Lloyd-Jones, Donald M; Huffman, Mark D; Karmali, Kunal N; Sanghavi, Darshak M; Wright, Janet S; Pelser, Colleen; Gulati, Martha; Masoudi, Frederick A; Goff, David C

    2017-03-28

The Million Hearts Initiative has a goal of preventing 1 million heart attacks and strokes, the leading causes of mortality, through several public health and healthcare strategies by 2017. The American Heart Association and American College of Cardiology support the program. The Cardiovascular Risk Reduction Model was developed by Million Hearts and the Centers for Medicare & Medicaid Services as a strategy to assess a value-based payment approach toward reduction in 10-year predicted risk of atherosclerotic cardiovascular disease (ASCVD) by implementing cardiovascular preventive strategies to manage the "ABCS" (aspirin therapy in appropriate patients, blood pressure control, cholesterol management, and smoking cessation). The purpose of this special report is to describe the development and intended use of the Million Hearts Longitudinal ASCVD Risk Assessment Tool. The Million Hearts Tool reinforces and builds on the "2013 ACC/AHA Guideline on the Assessment of Cardiovascular Risk" by allowing clinicians to estimate baseline and updated 10-year ASCVD risk estimates for primary prevention patients adhering to the appropriate ABCS over time, alone or in combination. The tool provides updated risk estimates based on evidence from high-quality systematic reviews and meta-analyses of the ABCS therapies. This novel approach to personalized estimation of benefits from risk-reducing therapies in primary prevention may help target therapies to those in whom they will provide the greatest benefit, and serves as the basis for a Centers for Medicare & Medicaid Services program designed to evaluate the Million Hearts Cardiovascular Risk Reduction Model. Copyright © 2017 American Heart Association, Inc., and the American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  4. Design and validation of new genotypic tools for easy and reliable estimation of HIV tropism before using CCR5 antagonists.

    Science.gov (United States)

    Poveda, Eva; Seclén, Eduardo; González, María del Mar; García, Federico; Chueca, Natalia; Aguilera, Antonio; Rodríguez, Jose Javier; González-Lahoz, Juan; Soriano, Vincent

    2009-05-01

    Genotypic tools may allow easier and less expensive estimation of HIV tropism before prescription of CCR5 antagonists compared with the Trofile assay (Monogram Biosciences, South San Francisco, CA, USA). Paired genotypic and Trofile results were compared in plasma samples derived from the maraviroc expanded access programme (EAP) in Europe. A new genotypic approach was built to improve the sensitivity to detect X4 variants based on an optimization of the webPSSM algorithm. Then, the new tool was validated in specimens from patients included in the ALLEGRO trial, a multicentre study conducted in Spain to assess the prevalence of R5 variants in treatment-experienced HIV patients. A total of 266 specimens from the maraviroc EAP were tested. Overall geno/pheno concordance was above 72%. A high specificity was generally seen for the detection of X4 variants using genotypic tools (ranging from 58% to 95%), while sensitivity was low (ranging from 31% to 76%). The PSSM score was then optimized to enhance the sensitivity to detect X4 variants changing the original threshold for R5 categorization. The new PSSM algorithms, PSSM(X4R5-8) and PSSM(SINSI-6.4), considered as X4 all V3 scoring values above -8 or -6.4, respectively, increasing the sensitivity to detect X4 variants up to 80%. The new algorithms were then validated in 148 specimens derived from patients included in the ALLEGRO trial. The sensitivity/specificity to detect X4 variants was 93%/69% for PSSM(X4R5-8) and 93%/70% for PSSM(SINSI-6.4). PSSM(X4R5-8) and PSSM(SINSI-6.4) may confidently assist therapeutic decisions for using CCR5 antagonists in HIV patients, providing an easier and rapid estimation of tropism in clinical samples.
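The optimized decision rule itself reduces to a threshold comparison. A minimal sketch, assuming precomputed webPSSM V3 scores (the scoring matrices are not reproduced here, and the function interface is ours):

```python
# Tropism call from a precomputed V3 PSSM score using the revised cutoffs
# reported above: scores above -8 (PSSM X4R5) or -6.4 (PSSM SINSI) -> X4.
def call_tropism(v3_score: float, algorithm: str = "X4R5") -> str:
    cutoffs = {"X4R5": -8.0, "SINSI": -6.4}
    return "X4" if v3_score > cutoffs[algorithm] else "R5"

print(call_tropism(-5.1), call_tropism(-9.7, "SINSI"))   # -> X4 R5
```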

  5. USE OF UAV PLATFORM AS AN AUTONOMOUS TOOL FOR ESTIMATING EXPANSION ON INVADED AGRICULTURAL LAND

    Directory of Open Access Journals (Sweden)

    Niarkios Luiz Santos de Salles Graça

For a long time, in many countries, disputes over land ownership have generated demand for geoinformation and documentation. In most cases, access for researchers is restricted or humanly impossible because of imminent conflicts, some of them armed. In these cases, researchers use Remote Sensing and Photogrammetry to enable their studies. However, the dynamics of the phenomenon under study often demand approaches that traditional techniques cannot fulfil. This work shows the results of an approach that used a photogrammetric UAV platform to take pictures of an invaded rural area in Brazil and estimate its expansion over two years. From the acquired images, mosaics were generated and then classified using a Decision Tree to identify tents. A Matlab algorithm was then developed to detect and quantify the tents on the classified images. It was possible to infer that there was an expansion of 7.3% between the two analyzed dates and that probably more than three thousand people occupied the invasion site.

  6. Effect of Using Different Vehicle Weight Groups on the Estimated Relationship Between Mass Reduction and U.S. Societal Fatality Risk per Vehicle Miles of Travel

    Energy Technology Data Exchange (ETDEWEB)

    Wenzel, Tom P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Technologies Area. Building Technology and Urban Systems Division

    2016-08-22

This report recalculates the estimated relationship between vehicle mass and societal fatality risk, using alternative groupings by vehicle weight, to test whether the trend of decreasing fatality risk from mass reduction as case vehicle mass increases holds over smaller increments of the range in case vehicle masses. The NHTSA baseline regression model estimates the relationship using two weight groups for cars and light trucks; we re-estimated the mass reduction coefficients using four, six, and eight bins of vehicle mass. The estimated effect of mass reduction on societal fatality risk was not consistent over the range in vehicle masses in these weight bins. These results suggest that the relationship indicated by the NHTSA baseline model is a result of other, unmeasured attributes of the mix of vehicles in the lighter vs. heavier weight bins, and not necessarily the result of a correlation between mass reduction and societal fatality risk. An analysis of the average vehicle, driver, and crash characteristics across the various weight groupings did not reveal any strong trends that might explain the lack of a consistent trend of decreasing fatality risk from mass reduction in heavier vehicles.

  7. Volcano-tectonic earthquakes: A new tool for estimating intrusive volumes and forecasting eruptions

    Science.gov (United States)

    White, Randall; McCausland, Wendy

    2016-01-01

    We present data on 136 high-frequency earthquakes and swarms, termed volcano-tectonic (VT) seismicity, which preceded 111 eruptions at 83 volcanoes, plus data on VT swarms that preceded intrusions at 21 other volcanoes. We find that VT seismicity is usually the earliest reported seismic precursor for eruptions at volcanoes that have been dormant for decades or more, and precedes eruptions of all magma types from basaltic to rhyolitic and all explosivities from VEI 0 to ultraplinian VEI 6 at such previously long-dormant volcanoes. Because large eruptions occur most commonly during resumption of activity at long-dormant volcanoes, VT seismicity is an important precursor for the Earth's most dangerous eruptions. VT seismicity precedes all explosive eruptions of VEI ≥ 5 and most if not all VEI 4 eruptions in our data set. Surprisingly we find that the VT seismicity originates at distal locations on tectonic fault structures at distances of one or two to tens of kilometers laterally from the site of the eventual eruption, and rarely if ever starts beneath the eruption site itself. The distal VT swarms generally occur at depths almost equal to the horizontal distance of the swarm from the summit out to about 15 km distance, beyond which hypocenter depths level out. We summarize several important characteristics of this distal VT seismicity including: swarm-like nature, onset days to years prior to the beginning of magmatic eruptions, peaking of activity at the time of the initial eruption whether phreatic or magmatic, and large non-double couple component to focal mechanisms. Most importantly we show that the intruded magma volume can be simply estimated from the cumulative seismic moment of the VT seismicity from: Log10 V = 0.77 Log ΣMoment - 5.32, with volume, V, in cubic meters and seismic moment in Newton meters. Because the cumulative seismic moment can be approximated from the size of just the few largest events, and is quite insensitive to precise locations
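The quoted regression is straightforward to apply. A worked example (the formula is transcribed from the abstract; the input moment is illustrative):

```python
# Intruded volume from the cumulative seismic moment of distal VT seismicity:
# Log10 V = 0.77 * Log10(sum Mo) - 5.32, with V in m^3 and Mo in N m.
import math

def intruded_volume_m3(cum_moment_Nm: float) -> float:
    return 10 ** (0.77 * math.log10(cum_moment_Nm) - 5.32)

# A cumulative moment of 1e15 N m (roughly one M 4 earthquake) maps to ~1.7e6 m^3.
print(f"{intruded_volume_m3(1e15):.2e}")
```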

  8. Hyper-Fractal Analysis: A visual tool for estimating the fractal dimension of 4D objects

    Science.gov (United States)

    Grossu, I. V.; Grossu, I.; Felea, D.; Besliu, C.; Jipa, Al.; Esanu, T.; Bordeianu, C. C.; Stan, E.

    2013-04-01

This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images and 3D objects (Grossu et al. (2010) [1]). The program was extended to work with four-dimensional objects stored in comma-separated-values files. This might be of interest in biomedicine, for analyzing the evolution in time of three-dimensional images.

New version program summary:
Program title: Hyper-Fractal Analysis (Fractal Analysis v03)
Catalogue identifier: AEEG_v3_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v3_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 745761
No. of bytes in distributed program, including test data, etc.: 12544491
Distribution format: tar.gz
Programming language: MS Visual Basic 6.0
Computer: PC
Operating system: MS Windows 98 or later
RAM: 100M
Classification: 14
Catalogue identifier of previous version: AEEG_v2_0
Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 831-832
Does the new version supersede the previous version? Yes
Nature of problem: Estimating the fractal dimension of 4D images.
Solution method: Optimized implementation of the 4D box-counting algorithm.
Reasons for new version: Inspired by existing applications of 3D fractals in biomedicine [3], we extended the optimized version of the box-counting algorithm [1,2] to the four-dimensional case. This might be of interest in analyzing the evolution in time of 3D images. The box-counting algorithm was extended in order to support 4D objects, stored in comma-separated-values files. A new form was added for generating 2D, 3D, and 4D test data. The application was tested on 4D objects with known dimension, e.g. the Sierpinski hypertetrahedron gasket, Df=ln(5)/ln(2) (Fig. 1). The algorithm could be extended, with minimum effort, to
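The box-counting method the program optimizes is dimension-independent, so its core fits in a short sketch. Below is a minimal NumPy rendering for a 4D point cloud, one point per row as in the program's CSV input; the scale choices are ours, and finite sampling makes the estimate approximate.

```python
# Box-counting estimate of fractal dimension for a d-dimensional point cloud.
import numpy as np

def box_counting_dimension(points: np.ndarray, n_scales: int = 4) -> float:
    span = np.ptp(points, axis=0) + 1e-12
    pts = (points - points.min(axis=0)) / span          # normalize to [0,1]^d
    sizes = 2.0 ** -np.arange(1, n_scales + 1)          # box edge lengths
    counts = [len(np.unique(np.floor(pts / s), axis=0)) for s in sizes]
    # Fractal dimension ~ slope of log N(s) against log(1/s).
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope

# Self-check on a filled 4D hypercube: the estimate should approach 4.
print(box_counting_dimension(np.random.rand(100_000, 4)))
```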

  9. Volcano-tectonic earthquakes: A new tool for estimating intrusive volumes and forecasting eruptions

    Science.gov (United States)

    White, Randall A.; McCausland, Wendy

    2016-01-01

We present data on 136 high-frequency earthquakes and swarms, termed volcano-tectonic (VT) seismicity, which preceded 111 eruptions at 83 volcanoes, plus data on VT swarms that preceded intrusions at 21 other volcanoes. We find that VT seismicity is usually the earliest reported seismic precursor for eruptions at volcanoes that have been dormant for decades or more, and precedes eruptions of all magma types from basaltic to rhyolitic and all explosivities from VEI 0 to ultraplinian VEI 6 at such previously long-dormant volcanoes. Because large eruptions occur most commonly during resumption of activity at long-dormant volcanoes, VT seismicity is an important precursor for the Earth's most dangerous eruptions. VT seismicity precedes all explosive eruptions of VEI ≥ 5 and most if not all VEI 4 eruptions in our data set. Surprisingly we find that the VT seismicity originates at distal locations on tectonic fault structures at distances of one or two to tens of kilometers laterally from the site of the eventual eruption, and rarely if ever starts beneath the eruption site itself. The distal VT swarms generally occur at depths almost equal to the horizontal distance of the swarm from the summit out to about 15 km distance, beyond which hypocenter depths level out. We summarize several important characteristics of this distal VT seismicity including: swarm-like nature, onset days to years prior to the beginning of magmatic eruptions, peaking of activity at the time of the initial eruption whether phreatic or magmatic, and large non-double couple component to focal mechanisms. Most importantly we show that the intruded magma volume can be simply estimated from the cumulative seismic moment of the VT seismicity from: Log10 V = 0.77 Log ΣMoment - 5.32, with volume, V, in cubic meters and seismic moment in Newton meters.

  10. The groundwater budget: A tool for preliminary estimation of the hydraulic connection between neighboring aquifers

    Science.gov (United States)

    Viaroli, Stefano; Mastrorillo, Lucia; Lotti, Francesca; Paolucci, Vittorio; Mazza, Roberto

    2018-01-01

Groundwater management authorities usually use groundwater budget calculations to evaluate the sustainability of withdrawals for different purposes. The groundwater budget calculation does not always provide reliable information, and it must often be supported by further aquifer monitoring where hydraulic connections between neighboring aquifers exist. The Riardo Plain aquifer is a strategic drinking resource for more than 100,000 people, a water store for 60 km2 of irrigated land, and the source of a mineral water bottling plant. Over a long period, the comparison between the direct recharge and the estimated natural outflow and withdrawals highlights a severe water deficit of approximately 40% of the total groundwater outflow. A groundwater budget deficit should be a clue to aquifer depletion, but the results of long-term water level monitoring show that this aquifer remains in good condition. In fact, in the Riardo Plain, the calculated deficit is not consistent with the aquifer monitoring data acquired over the same period (1992-2014). The small oscillations of the groundwater level and the almost stable streambed spring discharge point to an additional aquifer recharge source. The confined carbonate aquifer locally mixes with the overlying volcanic aquifer, providing an externally stable recharge that reduces the effects of local rainfall variability. The combined approach of groundwater budget results and long-term aquifer monitoring (spring discharge and/or hydraulic head oscillation) provides information about significant external groundwater exchanges, even when they are unidentified by field measurements, and supports stakeholders in groundwater resource management.
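The budget logic behind that conclusion can be stated compactly. In the notation below (ours, not the paper's), R is direct recharge, Q natural outflow, W withdrawals, S storage, and E the inferred inflow from the confined carbonate aquifer:

```latex
\[
D = (Q + W) - R \approx 0.4\,(Q + W) \quad\text{(apparent deficit from the budget)},
\]
\[
\frac{dS}{dt} \approx 0 \;\;\text{(stable heads and spring discharge)}
\;\Longrightarrow\; E \approx D .
\]
```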

  11. How many holes is too many? A prototype tool for estimating mosquito entry risk into damaged bed nets.

    Science.gov (United States)

    Sutcliffe, James; Ji, Xin; Yin, Shaoman

    2017-08-01

    Insecticide-treated bed nets (ITNs) have played an integral role in malaria reduction but how insecticide depletion and accumulating physical damage affect ITN performance is poorly understood. More accurate methods are needed to assess damage to bed nets so that they can be designed, deployed and replaced optimally. Video recordings of female Anopheles gambiae in near approach (1-½ cm) to occupied untreated rectangular bed nets in a laboratory study were used to quantify the amount of mosquito activity (appearances over time) around different parts of the net, the per-appearance probability of a mosquito coming close to holes of different sizes (hole encounter) and the per-encounter probability of mosquitoes passing through holes of different sizes (hole passage). Appearance frequency on different parts of the net reflected previously reported patterns: the area of the net under greatest mosquito pressure was the roof, followed by the bottom 30 cm of the sides, followed by the 30 cm area immediately above this, followed by the upper two-thirds of the sides. The ratio of activity in these areas was (respectively) 250:33:5:1. Per-appearance probability of hole encounter on all parts of the net was strongly predicted by a factor combining hole perimeter and area. Per-encounter probability of hole passage, in turn, was strongly predicted by hole width. For a given width, there was a 20% greater risk of passage through holes on the roof than holes on the sides. Appearance, encounter and passage predictors correspond to various mosquito behaviours that have previously been described and are combined into a prototype mosquito entry risk tool that predicts mosquito entry rates for nets with various amounts of damage. Scenarios that use the entry risk tool to test the recommendations of the WHOPES proportionate hole index (pHI) suggest that the pHI hole size categories and failure to account for hole location likely sometimes lead to incorrect conclusions about net
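The structure of the prototype suggests a multiplicative risk chain: appearances, times the probability of encountering a hole, times the probability of passing through it, summed over holes. A minimal sketch of that logic follows; the functional forms and coefficients are placeholders (the abstract reports only the predictors: perimeter plus area for encounter, width for passage, and a roof/side effect), not the fitted model.

```python
# Toy entry-risk chain: appearances x P(encounter) x P(passage), per hole.
APPEARANCE_WEIGHT = {"roof": 250, "side_low": 33, "side_mid": 5, "side_up": 1}

def entry_rate(holes, appearances_per_hour=100.0):
    """holes: dicts with keys area_cm2, perimeter_cm, width_cm, zone."""
    wsum = sum(APPEARANCE_WEIGHT.values())
    total = 0.0
    for h in holes:
        appear = appearances_per_hour * APPEARANCE_WEIGHT[h["zone"]] / wsum
        p_enc = min(1.0, 1e-3 * (h["perimeter_cm"] + h["area_cm2"]))  # placeholder
        p_pass = min(1.0, 0.1 * h["width_cm"])                        # placeholder
        if h["zone"] == "roof":                  # ~20% higher passage risk on roof
            p_pass = min(1.0, 1.2 * p_pass)
        total += appear * p_enc * p_pass
    return total                                 # expected entries per hour

print(entry_rate([{"area_cm2": 4, "perimeter_cm": 8, "width_cm": 2, "zone": "roof"}]))
```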

  12. Performance Analysis of a Fluidic Axial Oscillation Tool for Friction Reduction with the Absence of a Throttling Plate

    Directory of Open Access Journals (Sweden)

    Xinxin Zhang

    2017-04-01

An axial oscillation tool has proved effective in solving problems associated with high friction and torque in the sliding drilling of complex wells. The fluidic axial oscillation tool, based on an output-fed bistable fluidic oscillator, is a type of axial oscillation tool that has become increasingly popular in recent years. The aim of this paper is to analyze the dynamic flow behavior of a fluidic axial oscillation tool without a throttling plate in order to evaluate its overall performance. In particular, the differences between the original design with a throttling plate and the current default design without one are analyzed in depth, and an improvement is expected for the latter. A commercial computational fluid dynamics code, Fluent, was used to predict the pressure drop and oscillation frequency of the tool. The results of the numerical simulations agree well with corresponding experimental results. A sufficient pressure pulse amplitude at a low pressure drop is desired in this study; therefore, relative pulse amplitudes of pressure drop and displacement are introduced. A comparison between the two designs indicates that when the supply flow rate is relatively low, or higher than a certain value, the tool with a throttling plate performs better; otherwise, the tool without a throttling plate is the preferred alternative. Under most operating conditions, in terms of supply flow rate and pressure drop, the fluidic axial oscillation tool without a throttling plate performs better than the original design.

  13. Aerial Survey as a Tool to Estimate Abundance and Describe Distribution of a Carcharhinid Species, the Lemon Shark, Negaprion brevirostris

    Directory of Open Access Journals (Sweden)

    S. T. Kessel

    2013-01-01

Aerial survey provides an important tool to assess the abundance of both terrestrial and marine vertebrates. To date, limited work has tested the effectiveness of this technique for estimating the abundance of smaller shark species. In Bimini, Bahamas, the lemon shark (Negaprion brevirostris) shows high site fidelity to a shallow sandy lagoon, providing an ideal test species for determining the effectiveness of localised aerial survey techniques for a carcharhinid species in shallow subtropical waters. Between September 2007 and September 2008, visual surveys were conducted from light aircraft following defined transects ranging in length between 4.4 and 8.8 km. Count results were corrected for "availability", "perception", and "survey intensity" to provide unbiased abundance estimates. The abundance of lemon sharks was greatest in the central area of the lagoon at high tide, with abundance shifting to the eastern and western regions of the lagoon at low tide. Mean abundance was estimated at 49 (±8.6) individuals, and monthly abundance was significantly positively correlated with mean water temperature. The successful implementation of the aerial survey technique highlights its potential for shark abundance assessments in shallow coastal marine environments.
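The correction chain is the standard one for visual strip transects. A minimal sketch with hypothetical factor values (the paper's fitted corrections are not reproduced here):

```python
# Correct a raw aerial count for availability (sharks visible from the air),
# perception (sharks detected when visible) and the fraction of area surveyed.
def corrected_abundance(raw_count, availability, perception, area_frac):
    return raw_count / (availability * perception * area_frac)

# 12 sharks counted, 70% availability, 80% perception, 45% of lagoon surveyed:
print(round(corrected_abundance(12, 0.70, 0.80, 0.45)))   # -> 48 individuals
```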

  14. Estimates of the timing of reductions in genital warts and high grade cervical intraepithelial neoplasia after onset of human papillomavirus (HPV) vaccination in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Ekwueme, Donatus U; Saraiya, Mona; Dunne, Eileen F; Markowitz, Lauri E

    2013-08-20

The objective of this study was to estimate the number of years after onset of a quadrivalent HPV vaccination program before notable reductions in genital warts and cervical intraepithelial neoplasia (CIN) will occur in teenagers and young adults in the United States. We applied a previously published model of HPV vaccination in the United States and focused on the timing of reductions in genital warts among both sexes and reductions in CIN 2/3 among females. Using different coverage scenarios, the lowest being consistent with current 3-dose coverage in the United States, we estimated the number of years before reductions of 10%, 25%, and 50% would be observed after onset of an HPV vaccination program for ages 12-26 years. The model suggested female-only HPV vaccination in the intermediate coverage scenario will result in a 10% reduction in genital warts within 2-4 years for females aged 15-19 years and a 10% reduction in CIN 2/3 among females aged 20-29 years within 7-11 years. Coverage had a major impact on when reductions would be observed. For example, in the higher coverage scenario a 25% reduction in CIN 2/3 would be observed within 8 years, compared with 15 years in the lower coverage scenario. Our model provides estimates of the potential timing and magnitude of the impact of HPV vaccination on genital warts and CIN 2/3 at the population level in the United States. Notable, population-level impacts of HPV vaccination on genital warts and CIN 2/3 can occur within a few years after onset of vaccination, particularly among younger age groups. Our results are generally consistent with early reports of declines in genital warts among youth. Published by Elsevier Ltd.

  15. Use Of Statistical Tools To Evaluate The Reductive Dechlorination Of High Levels Of TCE In Microcosm Studies

    Science.gov (United States)

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study ...

  16. Distance Learning as a Tool for Poverty Reduction and Economic Development: A Focus on China and Mexico

    Science.gov (United States)

    Larson, Richard C.; Murray, M. Elizabeth

    2008-01-01

    This paper uses case studies to focus on distance learning in developing countries as an enabler for economic development and poverty reduction. To provide perspective, we first review the history of telecottages, local technology-equipped facilities to foster community-based learning, which have evolved into "telecenters" or…

  17. Reduction of potassium content of green bean pods and chard by culinary processing. Tools for chronic kidney disease

    Directory of Open Access Journals (Sweden)

    Montserrat Martínez-Pineda

    2016-07-01

Conclusion: The results of this study are very positive because they provide tools for the professionals who care for such patients, allowing them to adapt more easily to patients' needs and preferences and to increase dietary variety.

  18. Modeling of the effect of tool wear per discharge estimation error on the depth of machined cavities in micro-EDM milling

    DEFF Research Database (Denmark)

    Puthumana, Govindan; Bissacco, Giuliano; Hansen, Hans Nørgaard

    2017-01-01

In micro-EDM milling, real-time electrode wear compensation based on tool wear per discharge (TWD) estimation permits direct control of the position of the tool electrode's frontal surface. However, TWD estimation errors will cause errors in the tool electrode's axial depth. A simulation tool is developed to determine the effects of errors in the initial estimation of TWD, and their propagation, on the error in the depth of the generated cavity. Simulations were applied to micro-EDM milling of a slot of 5000 μm length and 50 μm depth and validated through slot milling experiments performed on a micro-EDM machine. Simulations and experimental results were found to be in good agreement, showing the effect of error amplification through the cavity depth.
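The propagation mechanism is linear, which a few lines make explicit. A sketch under the simplifying assumption of uniform discharges (the numbers are illustrative, not the paper's parameters):

```python
# Real-time wear compensation feeds the electrode down by the *estimated* TWD,
# while the electrode actually shortens by the *true* TWD, so the estimation
# error accumulates linearly in the machined depth.
def depth_error_um(n_discharges: int, twd_true_um: float, twd_est_um: float) -> float:
    return n_discharges * (twd_est_um - twd_true_um)

# A 5% overestimate of a 0.010 um TWD over 100,000 discharges -> 50 um error,
# i.e. the full nominal depth of the 50 um slot considered above.
print(depth_error_um(100_000, 0.0100, 0.0105))
```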

  19. FunGeneNet: a web tool to estimate enrichment of functional interactions in experimental gene sets.

    Science.gov (United States)

    Tiys, Evgeny S; Ivanisenko, Timofey V; Demenkov, Pavel S; Ivanisenko, Vladimir A

    2018-02-09

    Estimation of functional connectivity in gene sets derived from genome-wide or other biological experiments is one of the essential tasks of bioinformatics. A promising approach for solving this problem is to compare gene networks built using experimental gene sets with random networks. One of the resources that make such an analysis possible is CrossTalkZ, which uses the FunCoup database. However, existing methods, including CrossTalkZ, do not take into account individual types of interactions, such as protein/protein interactions, expression regulation, transport regulation, catalytic reactions, etc., but rather work with generalized types characterizing the existence of any connection between network members. We developed the online tool FunGeneNet, which utilizes the ANDSystem and STRING to reconstruct gene networks using experimental gene sets and to estimate their difference from random networks. To compare the reconstructed networks with random ones, the node permutation algorithm implemented in CrossTalkZ was taken as a basis. To study the FunGeneNet applicability, the functional connectivity analysis of networks constructed for gene sets involved in the Gene Ontology biological processes was conducted. We showed that the method sensitivity exceeds 0.8 at a specificity of 0.95. We found that the significance level of the difference between gene networks of biological processes and random networks is determined by the type of connections considered between objects. At the same time, the highest reliability is achieved for the generalized form of connections that takes into account all the individual types of connections. By taking examples of the thyroid cancer networks and the apoptosis network, it is demonstrated that key participants in these processes are involved in the interactions of those types by which these networks differ from random ones. FunGeneNet is a web tool aimed at proving the functionality of networks in a wide range of sizes of
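The node-permutation comparison at the heart of the approach fits in a few lines. A sketch with networkx, using a toy random graph in place of STRING/ANDSystem interactions; the plain z-score is our simplification of the CrossTalkZ-style statistic:

```python
# Compare the edge count inside a gene set with counts for random node sets.
import random
import networkx as nx

def enrichment_z(graph, gene_set, n_perm=1000, seed=0):
    rng = random.Random(seed)
    observed = graph.subgraph(gene_set).number_of_edges()
    nodes, k = list(graph.nodes), len(gene_set)
    null = [graph.subgraph(rng.sample(nodes, k)).number_of_edges()
            for _ in range(n_perm)]
    mu = sum(null) / n_perm
    sd = (sum((x - mu) ** 2 for x in null) / (n_perm - 1)) ** 0.5
    return (observed - mu) / (sd or 1.0)

g = nx.erdos_renyi_graph(200, 0.05, seed=1)
print(enrichment_z(g, set(range(20))))    # ~0 for a random "gene set"
```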

  20. Geospatial tools effectively estimate nonexceedance probabilities of daily streamflow at ungauged and intermittently gauged locations in Ohio

    Directory of Open Access Journals (Sweden)

    William H. Farmer

    2017-10-01

New hydrological insights for the region: Several methods for estimating nonexceedance probabilities of daily mean streamflows are explored, including single-index methodologies (nearest-neighbor indexing) and geospatial tools (kriging and topological kriging). These methods were evaluated by conducting leave-one-out cross-validations based on analyses of nearly 7 years of daily streamflow data from 79 unregulated streamgages in Ohio and neighboring states. The pooled, ordinary kriging model, with a median Nash–Sutcliffe performance of 0.87, was superior to the single-site index methods, though there was some bias in the tails of the probability distribution. Incorporating network structure through topological kriging did not improve performance. The pooled, ordinary kriging model was applied to 118 locations without systematic streamgaging across Ohio where instantaneous streamflow measurements had been made concurrently with water-quality sampling on at least 3 separate days. Spearman rank correlations between estimated nonexceedance probabilities and measured streamflows were high, with a median value of 0.76. With regard to application, the degree of regulation in a set of sample sites helped specify the streamgages required to implement kriging approaches successfully.
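Two quantities carry the evaluation: the empirical nonexceedance probability of a daily flow and the Nash–Sutcliffe efficiency used to score the cross-validation. A short sketch of both (the Weibull plotting position is our choice; the study's kriging step is not reproduced):

```python
import numpy as np

def nonexceedance(flows):
    ranks = np.argsort(np.argsort(flows)) + 1     # 1 = smallest daily flow
    return ranks / (len(flows) + 1)               # Weibull estimate of P(Q <= q)

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

q = np.random.lognormal(size=2500)                # ~7 years of daily flows
print(nonexceedance(q)[:3], nash_sutcliffe(q, 1.02 * q))
```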

  1. OligoHeatMap (OHM): an online tool to estimate and display hybridizations of oligonucleotides onto DNA sequences.

    Science.gov (United States)

    Croce, Olivier; Chevenet, François; Christen, Richard

    2008-07-01

The efficiency of molecular methods involving DNA/DNA hybridizations depends on accurate prediction of the melting temperature (T(m)) of the duplex. Many software packages are available for T(m) calculation, but difficulties arise when one wishes to check whether a given oligomer (PCR primer or probe) hybridizes well on more than a single sequence. Moreover, the presence of mismatches within the duplex is not sufficient to establish specificity, as mismatches do not always significantly decrease the T(m). OHM (OligoHeatMap) is an online tool able to provide estimates of T(m) for a set of oligomers against a set of aligned sequences, not only as text files of complete results but also graphically: T(m) values are translated into colors and displayed as a heat-map image, either standalone or for use by software such as TreeDyn for inclusion in a phylogenetic tree. OHM is freely available at http://bioinfo.unice.fr/ohm/, with links to the full source code and online help.
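The heat-map idea is easy to convey with a deliberately crude Tm model. The sketch below uses the Wallace rule purely as a stand-in for OHM's actual thermodynamic calculations, and assumes the targets are pre-aligned to the oligomer's position:

```python
# One heat-map row: a Tm estimate for one oligo against each aligned target.
def tm_wallace(paired_bases: str) -> int:
    at = sum(b in "AT" for b in paired_bases)
    gc = sum(b in "GC" for b in paired_bases)
    return 2 * at + 4 * gc            # Wallace rule, degrees C (crude stand-in)

def tm_row(oligo: str, targets: list[str]) -> list[int]:
    row = []
    for t in targets:                 # mismatched positions contribute nothing here
        matched = "".join(b for o, b in zip(oligo, t) if o == b)
        row.append(tm_wallace(matched))
    return row                        # map values to colors for the heat map

print(tm_row("ACGTACGT", ["ACGTACGT", "ACGTACGA", "TTTTACGT"]))   # [24, 22, 14]
```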

  2. How accurate are adolescents in portion-size estimation using the computer tool young adolescents' nutrition assessment on computer (YANA-C)?

    OpenAIRE

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-01-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amou...

  3. Establishing the value of occupational health nurses' contributions to worker health and safety: a pilot test of a user-friendly estimation tool.

    Science.gov (United States)

    Graeve, Catherine; McGovern, Patricia; Nachreiner, Nancy M; Ayers, Lynn

    2014-01-01

Occupational health nurses use their knowledge and skills to improve the health and safety of the working population; however, companies increasingly face budget constraints and may eliminate health and safety programs. Occupational health nurses must be prepared to document their services and outcomes, and to use quantitative tools to demonstrate their value to employers. The aim of this project was to create and pilot test a quantitative tool with which occupational health nurses can track their activities and the potential cost savings of on-site occupational health nursing services. Tool development included a pilot test in which semi-structured interviews with occupational health and safety leaders were conducted to identify current issues and the products used for estimating the value of occupational health nursing services. The outcome was a tool that estimates the economic value of occupational health nursing services. The feasibility and potential value of this tool are described.

  4. A novel tool for user-friendly estimation of natural, diagnostic and professional radiation risk: Radio-Risk software

    International Nuclear Information System (INIS)

    Carpeggiani, Clara; Paterni, Marco; Caramella, Davide; Vano, Eliseo; Semelka, Richard C.; Picano, Eugenio

    2012-01-01

Background: Awareness of radiological risk is low among doctors and patients. An educational/decision tool that considers each patient's cumulative lifetime radiation exposure would facilitate provider–patient communication. Aim: The purpose of this work was to develop user-friendly software for simple estimation and communication of radiological risk to patients and doctors, as part of the SUIT-Heart (Stop Useless Imaging Testing in Heart disease) Project of the Tuscany Region. Methods: We developed a novel software program (PC platform, Windows OS, fully downloadable at http://suit-heart.ifc.cnr.it) considering reference dose estimates from American Heart Association Radiological Imaging 2009 guidelines and UK Royal College of Radiology 2007 guidelines. Age- and gender-weighted cancer risks were derived from the Biological Effects of Ionising Radiation VII Committee, 2006. Results: With simple input functions (demographics, age, gender), the user selects from a predetermined menu variables relating to natural (e.g., airplane flights and geo-tracked background exposure), professional (e.g., cath lab workers) and medical (e.g., CT, cardiac scintigraphy, coronary stenting) sources. The program provides a simple numeric display (cumulative effective dose in millisievert, mSv, and equivalent number of chest X-rays) and a graphic display (cumulative temporal trends of exposure; cancer cases out of 100 exposed persons). Conclusions: A simple software program allows straightforward estimation of cumulative dose (in multiples of chest X-rays) and risk (as extra percent lifetime cancer risk), with simple numbers quantifying lifetime extra cancer risk. Pictorial display of radiation risk may be valuable for increasing radiological awareness among cardiologists.
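The ledger such a program maintains can be sketched in a few lines. All dose values and the risk coefficient below are typical published figures used only as placeholders for the guideline tables the software ships with (chest X-ray ≈ 0.02 mSv; lifetime cancer risk on the order of 5% per Sv):

```python
# Cumulative dose from a list of exposures, expressed also as chest-X-ray
# equivalents and extra lifetime cancer risk. All values are placeholders.
DOSE_MSV = {"chest_xray": 0.02, "coronary_ct": 12.0, "cardiac_spect": 9.0,
            "flight_hour": 0.005}
RISK_PER_MSV = 5e-5            # ~5%/Sv, order of ICRP/BEIR VII estimates

def summarize(events: dict) -> dict:
    total = sum(DOSE_MSV[name] * n for name, n in events.items())
    return {"total_mSv": total,
            "chest_xray_equivalents": total / DOSE_MSV["chest_xray"],
            "extra_lifetime_cancer_risk_pct": 100 * total * RISK_PER_MSV}

print(summarize({"coronary_ct": 1, "cardiac_spect": 2, "flight_hour": 40}))
```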

  5. Using the soil and water assessment tool to estimate dissolved inorganic nitrogen water pollution abatement cost functions in central portugal.

    Science.gov (United States)

    Roebeling, P C; Rocha, J; Nunes, J P; Fidélis, T; Alves, H; Fonseca, S

    2014-01-01

Coastal aquatic ecosystems are increasingly affected by diffuse-source nutrient water pollution from agricultural activities in coastal catchments, even though these ecosystems are important from a social, environmental and economic perspective. To warrant sustainable economic development of coastal regions, we need to balance the marginal costs of coastal catchment water pollution abatement against the associated marginal benefits of coastal resource appreciation. Diffuse-source water pollution abatement costs across agricultural sectors are not easily determined, given the spatial heterogeneity in biophysical and agro-ecological conditions as well as the available range of best agricultural practices (BAPs) for water quality improvement. We demonstrate how the Soil and Water Assessment Tool (SWAT) can be used to estimate diffuse-source water pollution abatement cost functions across agricultural land use categories, based on a stepwise adoption of identified BAPs for water quality improvement and corresponding SWAT-based estimates of agricultural production, agricultural incomes, and water pollution deliveries. Results for the case of dissolved inorganic nitrogen (DIN) surface water pollution by the key agricultural land use categories ("annual crops," "vineyards," and "mixed annual crops & vineyards") in the Vouga catchment in central Portugal show that no win-win agricultural practices are available within the assessed BAPs for DIN water quality improvement. Estimated abatement costs increase quadratically with the rate of water pollution abatement, with the largest abatement costs for the "mixed annual crops & vineyards" land use category (between 41,900 and 51,900 € per tonne of DIN per year) and fairly similar abatement costs across the "vineyards" and "annual crops" land use categories (between 7,300 and 15,200 € per tonne of DIN per year). Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
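The reported cost shape can be summarized in one expression (our notation, with coefficients implied by the € per tonne ranges above):

```latex
\[
C_{\ell}(a) = \alpha_{\ell}\, a^{2}, \qquad 0 \le a \le 1, \qquad
\frac{dC_{\ell}}{da} = 2\,\alpha_{\ell}\, a ,
\]
```

where \(\ell\) indexes the land use category, \(a\) is the abated fraction of DIN delivery, and \(\alpha_{\ell}\) is a category-specific cost coefficient; marginal abatement cost therefore rises linearly with the abatement rate.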

  6. Construction of estimated flow- and load-duration curves for Kentucky using the Water Availability Tool for Environmental Resources (WATER)

    Science.gov (United States)

    Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.

    2012-01-01

Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that calibration criteria established by previous WATER application reports were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as percentages, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow × Water-quality criterion) at each flow interval.
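Both constructions reduce to a few lines once daily flows are in hand. A sketch under simplified units (flows in m³/s, criterion in mg/L), following the exceedance-probability and Load = Flow × criterion definitions given above:

```python
import numpy as np

def flow_duration(daily_q):
    q = np.sort(np.asarray(daily_q))[::-1]                 # descending flows
    exceed_pct = 100.0 * np.arange(1, q.size + 1) / (q.size + 1)
    return exceed_pct, q           # 0% end corresponds to the highest discharge

def load_duration(exceed_pct, q, criterion_mg_per_L):
    # m^3/s * mg/L * 86400 s/d = g/d (the 1000 L/m^3 and 1000 mg/g cancel).
    return exceed_pct, q * criterion_mg_per_L * 86400.0

pct, q = flow_duration(np.random.lognormal(1.0, 0.8, 365 * 60))
print(load_duration(pct, q, criterion_mg_per_L=1.0)[1][:3])
```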

  7. A novel tool for user-friendly estimation of natural, diagnostic and professional radiation risk: Radio-Risk software

    Energy Technology Data Exchange (ETDEWEB)

    Carpeggiani, Clara; Paterni, Marco [CNR, Institute of Clinical Physiology (Italy); Caramella, Davide [Radiology Department, Pisa University, Pisa (Italy); Vano, Eliseo [San Carlos Hospital, Radiology Department, Complutense University, Madrid (Spain); Semelka, Richard C. [University of North Carolina, Chapel Hill, NC (United States); Picano, Eugenio, E-mail: picano@ifc.cnr.it [CNR, Institute of Clinical Physiology (Italy)

    2012-11-15

Background: Awareness of radiological risk is low among doctors and patients. An educational/decision tool that considers each patient's cumulative lifetime radiation exposure would facilitate provider-patient communication. Aim: The purpose of this work was to develop user-friendly software for simple estimation and communication of radiological risk to patients and doctors, as part of the SUIT-Heart (Stop Useless Imaging Testing in Heart disease) Project of the Tuscany Region. Methods: We developed a novel software program (PC platform, Windows OS, fully downloadable at http://suit-heart.ifc.cnr.it) considering reference dose estimates from American Heart Association Radiological Imaging 2009 guidelines and UK Royal College of Radiology 2007 guidelines. Age- and gender-weighted cancer risks were derived from the Biological Effects of Ionising Radiation VII Committee, 2006. Results: With simple input functions (demographics, age, gender), the user selects from a predetermined menu variables relating to natural (e.g., airplane flights and geo-tracked background exposure), professional (e.g., cath lab workers) and medical (e.g., CT, cardiac scintigraphy, coronary stenting) sources. The program provides a simple numeric display (cumulative effective dose in millisievert, mSv, and equivalent number of chest X-rays) and a graphic display (cumulative temporal trends of exposure; cancer cases out of 100 exposed persons). Conclusions: A simple software program allows straightforward estimation of cumulative dose (in multiples of chest X-rays) and risk (as extra percent lifetime cancer risk), with simple numbers quantifying lifetime extra cancer risk. Pictorial display of radiation risk may be valuable for increasing radiological awareness among cardiologists.

  8. Consultant management estimating tool.

    Science.gov (United States)

    2012-04-01

The New York State Department of Transportation (NYSDOT) Consultant Management Bureau's primary responsibilities are to negotiate staffing hours/resources with engineering design consultants, and to monitor the consultants' costs. Currently the C...

  9. Water, sanitation and hygiene interventions for acute childhood diarrhea: a systematic review to provide estimates for the Lives Saved Tool.

    Science.gov (United States)

    Darvesh, Nazia; Das, Jai K; Vaivada, Tyler; Gaffey, Michelle F; Rasanathan, Kumanan; Bhutta, Zulfiqar A

    2017-11-07

    In the Sustainable Development Goals (SDGs) era, there is growing recognition of the responsibilities of non-health sectors in improving the health of children. Interventions to improve access to clean water, sanitation facilities, and hygiene behaviours (WASH) represent key opportunities to improve child health and well-being by preventing the spread of infectious diseases and improving nutritional status. We conducted a systematic review of studies evaluating the effects of WASH interventions on childhood diarrhea in children 0-5 years old. Searches were run up to September 2016. We screened the titles and abstracts of retrieved articles, followed by screening of the full-text reports of relevant studies. We abstracted study characteristics and quantitative data, and assessed study quality. Meta-analyses were performed for similar intervention and outcome pairs. Pooled analyses showed diarrhea risk reductions from the following interventions: point-of-use water filtration (pooled risk ratio (RR): 0.47, 95% confidence interval (CI): 0.36-0.62), point-of-use water disinfection (pooled RR: 0.69, 95% CI: 0.60-0.79), and hygiene education with soap provision (pooled RR: 0.73, 95% CI: 0.57-0.94). Quality ratings were low or very low for most studies, and heterogeneity was high in pooled analyses. Improvements to the water supply and water disinfection at source did not show significant effects on diarrhea risk, nor did the one eligible study examining the effect of latrine construction. Various WASH interventions show diarrhea risk reductions between 27% and 53% in children 0-5 years old, depending on intervention type, providing ample evidence to support the scale-up of WASH in low and middle-income countries (LMICs). Due to the overall low quality of the evidence and high heterogeneity, further research is required to accurately estimate the magnitude of the effects of these interventions in different contexts.

  10. Cost effectiveness of robotics and remote tooling for occupational risk reduction at a nuclear fuel fabrication facility

    Energy Technology Data Exchange (ETDEWEB)

    Lochard, Jacques

    1989-08-01

This case study, related to the design stage of a fuel fabrication facility, presents the evaluation of alternative options for manipulating mixed-oxide fuel rods in a quality control shop. It is based on a study performed in the framework of the 'MELOX project' developed by COGEMA in France. The methodology for evaluating robotic options resulted from research work funded in part by the IAEA, under the co-ordinated research programme on 'Comparison of cost-effectiveness of risk reduction among different energy systems', and by the Commission of the European Communities, under the research and training programme on radiation protection.

  11. Cost effectiveness of robotics and remote tooling for occupational risk reduction at a nuclear fuel fabrication facility

    International Nuclear Information System (INIS)

    Lochard, Jacques

    1989-01-01

This case study, related to the design stage of a fuel fabrication facility, presents the evaluation of alternative options for manipulating mixed-oxide fuel rods in a quality control shop. It is based on a study performed in the framework of the 'MELOX project' developed by COGEMA in France. The methodology for evaluating robotic options resulted from research work funded in part by the IAEA, under the co-ordinated research programme on 'Comparison of cost-effectiveness of risk reduction among different energy systems', and by the Commission of the European Communities, under the research and training programme on radiation protection.

  12. Low-cost Tools for Aerial Video Geolocation and Air Traffic Analysis for Delay Reduction Using Google Earth

    Science.gov (United States)

    Zetterlind, V.; Pledgie, S.

    2009-12-01

    Low-cost, low-latency, robust geolocation and display of aerial video is a common need for a wide range of earth observing as well as emergency response and security applications. While hardware costs for aerial video collection systems, GPS, and inertial sensors have been decreasing, software costs for geolocation algorithms and reference imagery/DTED remain expensive and highly proprietary. As part of a Federal Small Business Innovative Research project, MosaicATM and EarthNC, Inc have developed a simple geolocation system based on the Google Earth API and Google's 'built-in' DTED and reference imagery libraries. This system geolocates aerial video based on platform and camera position, attitude, and field-of-view metadata using geometric photogrammetric principles of ray-intersection with DTED. Geolocated video can be directly rectified and viewed in the Google Earth API during processing. Work is underway to extend our geolocation code to NASA World Wind for additional flexibility and a fully open-source platform. In addition to our airborne remote sensing work, MosaicATM has developed the Surface Operations Data Analysis and Adaptation (SODAA) tool, funded by NASA Ames, which supports analysis of airport surface operations to optimize aircraft movements and reduce fuel burn and delays. As part of SODAA, MosaicATM and EarthNC, Inc have developed powerful tools to display national airspace data and time-animated 3D flight tracks in Google Earth for 4D analysis. The SODAA tool can convert raw format flight track data, FAA National Flight Data (NFD), and FAA 'Adaptation' airport surface data to a spatial database representation and then to Google Earth KML. The SODAA client provides users with a simple graphical interface through which to generate queries with a wide range of predefined and custom filters, plot results, and export for playback in Google Earth in conjunction with NFD and Adaptation overlays.
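The geolocation step itself is classical ray casting against terrain. A minimal sketch in a flat local ENU frame, with a toy surface standing in for the elevation data; a production system over real DTED would use proper geodetic transforms and finer stepping:

```python
# March a camera ray until it first reaches the terrain surface.
import numpy as np

def ground_point(cam_enu, look_enu, terrain_height, step_m=5.0, max_range_m=20_000.0):
    """cam_enu: camera (E, N, U) in meters; look_enu: look direction;
    terrain_height(e, n) -> surface elevation in meters."""
    ray = np.asarray(look_enu, float)
    ray /= np.linalg.norm(ray)
    for r in np.arange(step_m, max_range_m, step_m):
        p = np.asarray(cam_enu, float) + r * ray
        if p[2] <= terrain_height(p[0], p[1]):
            return p                 # first terrain hit along the ray
    return None                      # ray never reached the ground

flat = lambda e, n: 100.0            # toy 100 m plateau instead of real DTED
print(ground_point((0.0, 0.0, 1500.0), (0.6, 0.2, -0.5), flat))
```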

  13. Tourism as a Poverty Reduction Tool: The Case of Mukuni Village in the Southern Province of Zambia

    Directory of Open Access Journals (Sweden)

    Miroslav Horák

    2014-01-01

Globally, tourism is becoming one of the cornerstones of national economic growth and a means of poverty alleviation, especially around tourist attractions in rural areas. This article assesses the levels of utilization of tourism potential in Zambia in general, and in the Mukuni village in the Southern province in particular, with reference to poverty reduction. The world-famous Victoria Falls is situated in the Southern province, so this area is among the most visited places in Zambia and attracts tourists throughout the whole year. The main income of the local people, which includes the Tonga tribe, comes from tourism. Even though tourism has brought positive results, including the realization of some local development projects and prosperity for the people, it has also brought negative effects such as sociocultural change, pollution and waste in the tourist destination areas in Zambia. For the Mukuni people, and Zambia as a whole, to fully exploit tourism potential, stricter laws preventing the destruction of the environment and preserving the culture of the indigenous people should be enforced in the tourist destination areas. The government should use the levy from tourism to provide better infrastructure, create job opportunities and generate wealth within the tourist areas for sustainable tourism development and poverty reduction.

  14. Cost reduction in the production process using the ABC and Lean tools: Case Study in the refrigeration components industry

    Directory of Open Access Journals (Sweden)

    Levi da Silva Guimarães

    2015-03-01

This paper focuses on production management with respect to operating costs that relate directly to the value of the product. Three methods were used: ABC (Activity-Based Costing), which provides accurate knowledge of real costs; VSM (Value Stream Mapping); and Lean Manufacturing. The research combined a literature and descriptive review with a case study, conducted at a refrigeration components company in the Manaus Industrial Center. The analysis began by mapping the value stream and measuring the current state of activities (cycle time, setup, etc.); the cost driver for each activity was then established, which revealed the real costs of waste, and the cost of the product was calculated before and after the improvements resulting from the lean methodology. The operational improvements achieved a 20% reduction in product cost.

  15. Accounting for density reduction and structural loss in standing dead trees: Implications for forest biomass and carbon stock estimates in the United States

    Directory of Open Access Journals (Sweden)

    Domke Grant M

    2011-11-01

Background: Standing dead trees are one component of forest ecosystem dead wood carbon (C) pools, whose national stock is estimated by the U.S. as required by the United Nations Framework Convention on Climate Change. Historically, standing dead tree C has been estimated as a function of live tree growing stock volume in the U.S.'s National Greenhouse Gas Inventory. Initiated in 1998, the USDA Forest Service's Forest Inventory and Analysis program (responsible for compiling the Nation's forest C estimates) began consistent nationwide sampling of standing dead trees, which may now supplant previous purely model-based approaches to standing dead biomass and C stock estimation. A substantial hurdle to estimating standing dead tree biomass and C attributes is that traditional estimation procedures are based on merchantability paradigms that may not reflect density reductions or structural loss due to decomposition common in standing dead trees. The goal of this study was to incorporate standing dead tree adjustments into the current estimation procedures and assess how biomass and C stocks change at multiple spatial scales. Results: Accounting for decay and structural loss in standing dead trees significantly decreased tree- and plot-level C stock estimates (and subsequent C stocks) by decay class and tree component. At a regional scale, incorporating adjustment factors decreased standing dead quaking aspen biomass estimates by almost 50 percent in the Lake States and Douglas-fir estimates by more than 36 percent in the Pacific Northwest. Conclusions: Substantial overestimates of standing dead tree biomass and C stocks occur when one does not account for density reductions or structural loss. Forest inventory estimation procedures that are descended from merchantability standards may need to be revised toward a more holistic approach to determining standing dead tree biomass and C attributes (i.e., attributes of tree biomass outside of sawlog
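The adjustment itself is multiplicative, which a short sketch makes concrete. The decay-class factors below are illustrative placeholders, not the inventory program's published values:

```python
# Standing dead tree carbon with density-reduction and structural-loss
# adjustments applied to a gross (merchantability-based) biomass estimate.
DENSITY_REDUCTION = {1: 0.97, 2: 0.85, 3: 0.70, 4: 0.55, 5: 0.45}  # by decay class

def standing_dead_carbon_kg(gross_biomass_kg, decay_class,
                            structural_loss_frac, carbon_frac=0.5):
    adjusted = (gross_biomass_kg * DENSITY_REDUCTION[decay_class]
                * (1.0 - structural_loss_frac))
    return adjusted * carbon_frac

# A decay-class-3 snag that has lost ~25% of its structure (top and branches):
print(standing_dead_carbon_kg(800.0, 3, 0.25))   # 210 kg C vs 400 kg unadjusted
```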

  16. A Visualization Tool to Analyse Usage of Web-Based Interventions: The Example of Positive Online Weight Reduction (POWeR)

    Science.gov (United States)

    Smith, Emily; Bradbury, Katherine; Morrison, Leanne; Dennison, Laura; Michaelides, Danius; Yardley, Lucy

    2015-01-01

Background: Attrition is a significant problem in Web-based interventions. Consequently, this research aims to identify the relation between Web usage and benefit from such interventions. A visualization tool has been developed that enables researchers to more easily examine large datasets on intervention usage that can be difficult to make sense of using traditional descriptive or statistical techniques alone. Objective: This paper demonstrates how the visualization tool was used to explore patterns in participants' use of a Web-based weight management intervention, termed "positive online weight reduction (POWeR)." We also demonstrate how the visualization tool can be used to perform subsequent statistical analyses of the association between usage patterns, participant characteristics, and intervention outcome. Methods: The visualization tool was used to analyze data from 132 participants who had accessed at least one session of the POWeR intervention. Results: There was a drop in usage of optional sessions after participants had accessed the initial, core POWeR sessions, but many users nevertheless continued to complete goal and weight reviews. The POWeR tools relating to the food diary and steps diary were reused most often. Differences in participant characteristics and usage of other intervention components were identified between participants who did and did not choose to access optional POWeR sessions (in addition to the initial core sessions) or reuse the food and steps diaries. Reuse of the steps diary and the getting support tools was associated with greater weight loss. Conclusions: The visualization tool provided a quick and efficient method for exploring patterns of Web usage, which enabled further analyses of whether different usage patterns were associated with participant characteristics or differences in intervention outcome. Further usage of visualization techniques is recommended to (1) make sense of large datasets more quickly and efficiently; (2

  17. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

    … of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps are produced on a Google map for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, local earthquakes that occurred within the network are automatically discriminated from regional/teleseismic events that occurred outside it. Finally, for the largest events, if a sufficient number of P-wave polarity readings is available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with event location and stations, as well as a table listing all the events, with the associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level seismic data analysis, and it may represent a relevant tool not only for seismologists, but also for non-…
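
    As a brief illustration of the magnitude step described above, the standard Hanks & Kanamori relation converts a seismic moment into Mw; the duration-magnitude coefficients in the sketch are illustrative calibration constants of the kind a network would fit, not the system's actual values:

      import math

      def moment_magnitude(m0_newton_meters):
          """Standard Hanks & Kanamori moment magnitude, with M0 in N*m."""
          return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

      def duration_magnitude(coda_seconds, a=-0.87, b=2.0):
          """Md = a + b*log10(coda duration); a and b are placeholder
          network-specific calibration constants."""
          return a + b * math.log10(coda_seconds)

      print(round(moment_magnitude(1.1e16), 1))   # -> 4.6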

  18. A Modelling Approach to Estimate the Impact of Sodium Reduction in Soups on Cardiovascular Health in the Netherlands

    Directory of Open Access Journals (Sweden)

    Maaike J. Bruins

    2015-09-01

    Hypertension is a major modifiable risk factor for cardiovascular disease and mortality, which could be lowered by reducing dietary sodium. The potential health impact of a product reformulation in the Netherlands was modelled, selecting packaged soups containing on average 25% less sodium as an example of an achievable product reformulation when implemented gradually. First, the blood pressure lowering resulting from the sodium intake reduction was modelled. Second, the predicted blood pressure lowering was translated into the potentially preventable incidence and mortality cases from stroke, acute myocardial infarction (AMI), angina pectoris, and heart failure (HF) after one year of salt reduction. Finally, the potentially preventable subsequent lifetime Disability-Adjusted Life Years (DALYs) were calculated. The sodium reduction in soups might potentially reduce the incidence and mortality of stroke by approximately 0.5%, AMI and angina by 0.3%, and HF by 0.2%. The related burden of disease could be reduced by approximately 800 lifetime DALYs. This modelling approach can be used to provide insight into the potential public health impact of sodium reduction in specific food products. The data demonstrate that an achievable food product reformulation to reduce sodium can potentially benefit public health, albeit modestly. When implemented across multiple product categories and countries, a significant health impact could be achieved.
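
    To make the modelling chain concrete, a minimal Python sketch follows; every coefficient in it (the blood pressure response per gram of sodium and the relative-risk gradient per 10 mmHg) is a placeholder assumption, not a value from the study:

      def sbp_drop_mmhg(sodium_cut_g_per_day, slope=2.0):
          """Assumed systolic blood pressure drop per g/day of sodium removed."""
          return slope * sodium_cut_g_per_day

      def preventable_fraction(sbp_drop, rr_per_10mmhg=0.75):
          """Translate a BP drop into a preventable incidence fraction via an
          assumed log-linear relative-risk gradient."""
          return 1.0 - rr_per_10mmhg ** (sbp_drop / 10.0)

      drop = sbp_drop_mmhg(0.1)   # e.g. 25% less sodium in soups ~ 0.1 g/day less
      print(f"{100 * preventable_fraction(drop):.2f}% of events potentially averted")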

  19. Using SMS Text Messaging to Assess Moderators of Smoking Reduction: Validating a New Tool for Ecological Measurement of Health Behaviors

    Science.gov (United States)

    Berkman, Elliot T.; Dickenson, Janna; Falk, Emily B.; Lieberman, Matthew D.

    2011-01-01

    Objective Understanding the psychological processes that contribute to smoking reduction will yield population health benefits. Negative mood may moderate smoking lapse during cessation, but this relationship has been difficult to measure in ongoing daily experience. We used a novel form of ecological momentary assessment to test a self-control model of negative mood and craving leading to smoking lapse. Design We validated short message service (SMS) text as a user-friendly and low-cost option for ecologically measuring real-time health behaviors. We sent text messages to cigarette smokers attempting to quit eight times daily for the first 21 days of cessation (N-obs = 3,811). Main outcome measures Approximately every two hours, we assessed cigarette count, mood, and cravings, and examined between- and within-day patterns and time-lagged relationships among these variables. Exhaled carbon monoxide was assessed pre- and posttreatment. Results Negative mood and craving predicted smoking two hours later, but craving mediated the mood–smoking relationship. Also, this mediation relationship predicted smoking over the next two, but not four, hours. Conclusion Results clarify conflicting previous findings on the relation between affect and smoking, validate a new low-cost and user-friendly method for collecting fine-grained health behavior assessments, and emphasize the importance of rapid, real-time measurement of smoking moderators. PMID:21401252
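
    As an illustration of the time-lagged analysis described above, a minimal Python sketch using pandas and statsmodels follows; the data file and column names are hypothetical:

      import pandas as pd
      import statsmodels.formula.api as smf

      # One row per participant per SMS prompt (hypothetical file and columns).
      ema = pd.read_csv("sms_ema.csv").sort_values(["participant", "prompt_time"])

      # Lag predictors within participant so mood and craving precede smoking.
      ema["craving_lag"] = ema.groupby("participant")["craving"].shift(1)
      ema["neg_mood_lag"] = ema.groupby("participant")["neg_mood"].shift(1)

      model = smf.ols("cigarettes ~ craving_lag + neg_mood_lag",
                      data=ema.dropna()).fit()
      print(model.summary())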

  20. A molecular diagnostic tool to replace larval culture in conventional faecal egg count reduction testing in sheep.

    Directory of Open Access Journals (Sweden)

    Florian Roeber

    The accurate diagnosis of parasitic nematode infections in livestock (including sheep and goats) is central to their effective control and to the detection of anthelmintic resistance. Traditionally, the faecal egg count reduction test (FECRT), combined with the technique of larval culture (LC), has been used widely to assess drug susceptibility/resistance in strongylid nematodes. However, this approach suffers from a lack of specificity, sensitivity and reliability, and is time-consuming and costly to conduct. Here, we critically assessed a specific PCR assay to support the FECRT, in a well-controlled experiment on sheep with naturally acquired strongylid infections known to be resistant to benzimidazoles. We showed that the PCR results were in close agreement with those of total worm count (TWC), but not of LC. Importantly, albendazole resistance detected by PCR-coupled FECRT was unequivocally linked to Teladorsagia circumcincta and, to a lesser extent, Trichostrongylus colubriformis, a result that was not achievable by LC. The key findings from this study demonstrate that our PCR-coupled FECRT approach has major merit for assessing anthelmintic resistance in nematode populations. The findings also show clearly that our PCR assay can be used as an alternative to LC, and is more time-efficient and less laborious, which has important practical implications for the effective management and control of strongylid nematodes of sheep.

  1. Drive Cost Reduction, Increase Innovation and Mitigate Risk with Advanced Knowledge Discovery Tools Designed to Unlock and Leverage Prior Knowledge

    International Nuclear Information System (INIS)

    Mitchell, I.

    2016-01-01

    Full text: The nuclear industry is knowledge-intensive and includes a diverse number of stakeholders. Much of this knowledge is at risk as engineers, technicians and project professionals retire, leaving a widening skills and information gap. This knowledge is critical in an increasingly complex environment where information from past projects is often buried in decades-old, non-integrated enterprise systems. Engineers can spend 40% or more of their time searching for answers across the enterprise instead of solving problems. The inability to access trusted industry knowledge results in increased risk and expense. Advanced knowledge discovery technologies slash research times by as much as 75% and accelerate innovation and problem solving by giving technical professionals access to the information they need, in the context of the problems they are trying to solve. Unlike traditional knowledge management approaches, knowledge discovery tools powered by semantic search technologies are adept at uncovering answers in unstructured data and require no tagging, organization or moving of data, meaning a smaller IT footprint and faster time-to-knowledge. This session will highlight best-in-class knowledge discovery technologies, content, and strategies to give nuclear industry organizations the ability to leverage the corpus of enterprise knowledge into the future. (author)

  2. Estimation of the solubility parameters of model plant surfaces and agrochemicals: a valuable tool for understanding plant surface interactions.

    Science.gov (United States)

    Khayet, Mohamed; Fernández, Victoria

    2012-11-14

    Most aerial plant parts are covered with a hydrophobic lipid-rich cuticle, which is the interface between the plant organs and the surrounding environment. Plant surfaces may have a high degree of hydrophobicity because of the combined effects of surface chemistry and roughness. The physical and chemical complexity of the plant cuticle limits the development of models that explain its internal structure and interactions with surface-applied agrochemicals. In this article we introduce a thermodynamic method for estimating the solubilities of model plant surface constituents and relating them to the effects of agrochemicals. Following the van Krevelen and Hoftyzer method, we calculated the solubility parameters of three model plant species and eight compounds that differ in hydrophobicity and polarity. In addition, intact tissues were examined by scanning electron microscopy and the surface free energy, polarity, solubility parameter and work of adhesion of each were calculated from contact angle measurements of three liquids with different polarities. By comparing the affinities between plant surface constituents and agrochemicals derived from (a) theoretical calculations and (b) contact angle measurements we were able to distinguish the physical effect of surface roughness from the effect of the chemical nature of the epicuticular waxes. A solubility parameter model for plant surfaces is proposed on the basis of an increasing gradient from the cuticular surface towards the underlying cell wall. The procedure enabled us to predict the interactions among agrochemicals, plant surfaces, and cuticular and cell wall components, and promises to be a useful tool for improving our understanding of biological surface interactions.
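
    For readers unfamiliar with the method, a minimal sketch of the underlying arithmetic follows: the total solubility parameter combines dispersive, polar and hydrogen-bonding components, and a Hansen-style distance gauges the affinity between two materials. The wax and agrochemical values below are illustrative placeholders, not the paper's data:

      import math

      def total_solubility_parameter(dd, dp, dh):
          """delta_t = sqrt(dd^2 + dp^2 + dh^2), in MPa^0.5."""
          return math.sqrt(dd**2 + dp**2 + dh**2)

      def hansen_distance(a, b):
          """Ra = sqrt(4*(dd1-dd2)^2 + (dp1-dp2)^2 + (dh1-dh2)^2);
          smaller Ra implies higher mutual affinity."""
          return math.sqrt(4 * (a[0] - b[0])**2
                           + (a[1] - b[1])**2
                           + (a[2] - b[2])**2)

      wax = (16.5, 2.0, 4.0)           # illustrative (dd, dp, dh) for a wax
      agrochemical = (17.5, 8.0, 9.0)  # illustrative polar active ingredient
      print(round(hansen_distance(wax, agrochemical), 1))   # -> 8.1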

  3. Primary Sclerosing Cholangitis Risk Estimate Tool (PREsTo) Predicts Outcomes in PSC: A Derivation & Validation Study Using Machine Learning.

    Science.gov (United States)

    Eaton, John E; Vesterhus, Mette; McCauley, Bryan M; Atkinson, Elizabeth J; Schlicht, Erik M; Juran, Brian D; Gossard, Andrea A; LaRusso, Nicholas F; Gores, Gregory J; Karlsen, Tom H; Lazaridis, Konstantinos N

    2018-05-09

    Improved methods are needed to risk stratify and predict outcomes in patients with primary sclerosing cholangitis (PSC). Therefore, we sought to derive and validate a new prediction model and compare its performance to existing surrogate markers. The model was derived using 509 subjects from a multicenter North American cohort and validated in an international multicenter cohort (n=278). Gradient boosting, a machine-learning technique, was used to create the model. The endpoint was hepatic decompensation (ascites, variceal hemorrhage or encephalopathy). Subjects with advanced PSC or cholangiocarcinoma at baseline were excluded. The PSC risk estimate tool (PREsTo) consists of 9 variables: bilirubin, albumin, serum alkaline phosphatase (SAP) times the upper limit of normal (ULN), platelets, AST, hemoglobin, sodium, patient age and the number of years since PSC was diagnosed. Validation in an independent cohort confirms PREsTo accurately predicts decompensation (C statistic 0.90, 95% confidence interval (CI) 0.84-0.95) and performed well compared to the MELD score (C statistic 0.72, 95% CI 0.57-0.84), the Mayo PSC risk score (C statistic 0.85, 95% CI 0.77-0.92) and SAP (C statistic 0.65, 95% CI 0.55-0.73). PREsTo continued to be accurate among individuals with a low bilirubin (C statistic 0.90, 95% CI 0.82-0.96) and when the score was re-applied at a later course in the disease (C statistic 0.82, 95% CI 0.64-0.95). PREsTo accurately predicts hepatic decompensation in PSC and exceeds the performance of other widely available, noninvasive prognostic scoring systems. This article is protected by copyright. All rights reserved. © 2018 by the American Association for the Study of Liver Diseases.
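
    A highly simplified sketch of a PREsTo-like model in Python/scikit-learn follows. It treats decompensation as a plain binary label (the actual derivation handled censored follow-up), and the data file and column names are hypothetical:

      import pandas as pd
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.model_selection import train_test_split

      FEATURES = ["bilirubin", "albumin", "sap_x_uln", "platelets", "ast",
                  "hemoglobin", "sodium", "age", "years_since_dx"]

      df = pd.read_csv("psc_cohort.csv")   # hypothetical derivation cohort
      X_train, X_test, y_train, y_test = train_test_split(
          df[FEATURES], df["decompensated"], test_size=0.3, random_state=0)

      # Shallow trees with a small learning rate, a typical boosting setup.
      model = GradientBoostingClassifier(n_estimators=500, learning_rate=0.01,
                                         max_depth=2).fit(X_train, y_train)
      print("held-out risk estimates:", model.predict_proba(X_test)[:5, 1])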

  4. Numerical modelling as a cost-reduction tool for probability of detection of bolt hole eddy current testing

    Science.gov (United States)

    Mandache, C.; Khan, M.; Fahr, A.; Yanishevsky, M.

    2011-03-01

    Probability of detection (PoD) studies are broadly used to determine the reliability of specific nondestructive inspection procedures, as well as to provide data for damage tolerance life estimations and calculation of inspection intervals for critical components. They require inspections on a large set of samples, a fact that makes these statistical assessments time- and cost-consuming. Physics-based numerical simulations of nondestructive testing inspections could be used as a cost-effective alternative to empirical investigations. They realistically predict the inspection outputs as functions of the input characteristics related to the test piece, transducer and instrument settings, which are subsequently used to partially substitute and/or complement inspection data in PoD analysis. This work focuses on the numerical modelling aspects of eddy current testing for the bolt hole inspections of wing box structures typical of the Lockheed Martin C-130 Hercules and P-3 Orion aircraft, found in the air force inventory of many countries. Boundary element-based numerical modelling software was employed to predict the eddy current signal responses when varying inspection parameters related to probe characteristics, crack geometry and test piece properties. Two demonstrator exercises were used for eddy current signal prediction when lowering the driver probe frequency and changing the material's electrical conductivity, followed by subsequent discussions and examination of the implications on using simulated data in the PoD analysis. Despite some simplifying assumptions, the modelled eddy current signals were found to provide similar results to the actual inspections. It is concluded that physics-based numerical simulations have the potential to partially substitute or complement inspection data required for PoD studies, reducing the cost, time, effort and resources necessary for a full empirical PoD assessment.

  5. Estimation of the influence of tool wear on force signals: A finite element approach in AISI 1045 orthogonal cutting

    Science.gov (United States)

    Equeter, Lucas; Ducobu, François; Rivière-Lorphèvre, Edouard; Abouridouane, Mustapha; Klocke, Fritz; Dehombreux, Pierre

    2018-05-01

    Industrial concerns arise regarding the significant cost of cutting tools in the machining process. In particular, an improper replacement policy can lead either to scrap, or to early tool replacements that waste fine tools. ISO 3685 provides the flank wear end-of-life criterion. Flank wear is also the nominal type of wear for the longest tool lifetimes in optimal cutting conditions. Its consequences include poor surface roughness and dimensional discrepancies. In order to aid the replacement decision process, several tool condition monitoring techniques have been suggested. Force signals were shown in the literature to be strongly linked with tool flank wear. It can therefore be assumed that force signals are highly relevant for monitoring the condition of cutting tools and providing decision-aid information in the framework of their maintenance and replacement. The objective of this work is to correlate tool flank wear with numerically computed force signals. The present work uses a Finite Element Model with a Coupled Eulerian-Lagrangian approach. The geometry of the tool is changed for different runs of the model, in order to obtain results that are specific to a certain level of wear. The model is assessed by comparison with experimental data gathered earlier on fresh tools. Using the model at constant cutting parameters, force signals are computed for each studied tool geometry, providing a signal for each tool wear state. These signals are qualitatively compared with relevant data from the literature. At this point, no quantitative comparison could be performed on worn tools because the reviewed literature failed to provide similar studies in this material, either numerical or experimental. Therefore, further development of this work should include experimental campaigns aimed at collecting cutting force signals and assessing the numerical results achieved through this work.

  6. A GIS-based tool for estimating tree canopy cover on fixed-radius plots using high-resolution aerial imagery

    Science.gov (United States)

    Sara A. Goeking; Greg C. Liknes; Erik Lindblom; John Chase; Dennis M. Jacobs; Robert. Benton

    2012-01-01

    Recent changes to the Forest Inventory and Analysis (FIA) Program's definition of forest land precipitated the development of a geographic information system (GIS)-based tool for efficiently estimating tree canopy cover for all FIA plots. The FIA definition of forest land has shifted from a density-related criterion based on stocking to a 10 percent tree canopy...

  7. The estimated effect of mass or footprint reduction in recent light-duty vehicles on U.S. societal fatality risk per vehicle mile traveled.

    Science.gov (United States)

    Wenzel, Tom

    2013-10-01

    The National Highway Traffic Safety Administration (NHTSA) recently updated its 2003 and 2010 logistic regression analyses of the effect of a reduction in light-duty vehicle mass on US societal fatality risk per vehicle mile traveled (VMT; Kahane, 2012). Societal fatality risk includes the risk to both the occupants of the case vehicle and any crash partner or pedestrians. The current analysis is the most thorough investigation of this issue to date. This paper replicates the Kahane analysis and extends it by testing the sensitivity of his results to changes in the definition of risk, and in the data and control variables used in the regression models. An assessment by Lawrence Berkeley National Laboratory (LBNL) indicates that the estimated effect of mass reduction on risk is smaller than in Kahane's previous studies, and is statistically non-significant for all but the lightest cars (Wenzel, 2012a). The estimated effects of a reduction in mass or footprint (i.e. wheelbase times track width) are small relative to other vehicle, driver, and crash variables used in the regression models. The recent historical correlation between mass and footprint is not so large as to prohibit including both variables in the same regression model; excluding footprint from the model, i.e. allowing footprint to decrease with mass, increases the estimated detrimental effect of mass reduction on risk in cars and crossover utility vehicles (CUVs)/minivans, but has virtually no effect on light trucks. Analysis by footprint deciles indicates that risk does not consistently increase with reduced mass for vehicles of similar footprint. Finally, the estimated effects of mass and footprint reduction are sensitive to the measure of exposure used (fatalities per induced-exposure crash, rather than per VMT), as well as to other changes in the data or control variables used. It appears that the safety penalty from lower mass can be mitigated with careful vehicle design, and that manufacturers can…
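
    For illustration, a minimal sketch of this kind of logistic regression in Python/statsmodels follows; the data file, variable names, and units are hypothetical stand-ins for the casualty and induced-exposure data actually used:

      import pandas as pd
      import statsmodels.formula.api as smf

      crashes = pd.read_csv("crash_records.csv")   # one row per crash involvement

      # Fatality outcome vs mass and footprint, controlling for driver and
      # crash covariates; the coefficient on curb_weight_100lb approximates
      # the change in log-odds of a societal fatality per 100-lb mass
      # reduction, holding footprint constant.
      model = smf.logit(
          "fatal ~ curb_weight_100lb + footprint_sqft + driver_age"
          " + C(driver_sex) + C(crash_type)",
          data=crashes,
      ).fit()
      print(model.params["curb_weight_100lb"])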

  8. Twitter as a Potential Disaster Risk Reduction Tool. Part II: Descriptive Analysis of Identified Twitter Activity during the 2013 Hattiesburg F4 Tornado.

    Science.gov (United States)

    Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M; Subbarao, Italo

    2015-06-29

    This article describes a novel triangulation methodological approach for identifying the Twitter activity of regionally active Twitter users during the 2013 Hattiesburg EF-4 Tornado. A data extraction and geographically centered filtration approach was utilized to generate Twitter data for 48 hrs pre- and post-tornado. The data were further validated using a six-sigma approach utilizing GPS data. The regional analysis revealed a total of 81,441 tweets, 10,646 Twitter users, 27,309 retweets and 2637 tweets with GPS coordinates. Tweet activity increased 5-fold during the response to the Hattiesburg Tornado. Retweeting activity increased 2.2-fold. Tweets with a hashtag increased 1.4-fold. Twitter was an effective disaster risk reduction tool for the Hattiesburg EF-4 Tornado of 2013.

  9. Building the evidence base for stigma and discrimination-reduction programming in Thailand: development of tools to measure healthcare stigma and discrimination

    Directory of Open Access Journals (Sweden)

    Kriengkrai Srithanaviboonchai

    2017-03-01

    Background HIV-related stigma and discrimination (S&D) are recognized as key impediments to controlling the HIV epidemic. S&D are particularly detrimental within health care settings because people who are at risk of HIV and people living with HIV (PLHIV) must seek services from health care facilities. Standardized tools and monitoring systems are needed to inform S&D reduction efforts, measure progress, and monitor trends. This article describes the processes followed to adapt and refine a standardized global health facility staff S&D questionnaire for the context of Thailand and develop a similar questionnaire measuring health facility stigma experienced by PLHIV. Both questionnaires are currently being used for the routine monitoring of HIV-related S&D in the Thai healthcare system. Methods The questionnaires were adapted through a series of consultative meetings, pre-testing, and revision. The revised questionnaires then underwent field testing, and the data and field experiences were analyzed. Results Two brief questionnaires were finalized and are now being used by the Department of Disease Control to collect national routine data for monitoring health facility S&D: (1) a health facility staff questionnaire that collects data on key drivers of S&D in health facilities (i.e., fear of HIV infection, attitudes toward PLHIV and key populations, and health facility policy and environment) and observed enacted stigma, and (2) a brief PLHIV questionnaire that captures data on experienced discriminatory practices at health care facilities. Conclusions This effort provides an example of how a country can adapt global S&D measurement tools to a local context for use in national routine monitoring. Such data help to strengthen the national response to HIV through the provision of evidence to shape S&D-reduction programming.

  10. sTools - a data reduction pipeline for the GREGOR Fabry-Pérot Interferometer and the High-resolution Fast Imager at the GREGOR solar telescope

    Science.gov (United States)

    Kuckein, C.; Denker, C.; Verma, M.; Balthasar, H.; González Manrique, S. J.; Louis, R. E.; Diercke, A.

    2017-10-01

    A huge amount of data has been acquired with the GREGOR Fabry-Pérot Interferometer (GFPI), large-format facility cameras, and, since 2016, the High-resolution Fast Imager (HiFI). These data are processed in standardized procedures with the aim of providing science-ready data for the solar physics community. For this purpose, we have developed a user-friendly data reduction pipeline called "sTools", based on the Interactive Data Language (IDL) and licensed under a Creative Commons license. The pipeline delivers reduced and image-reconstructed data with a minimum of user interaction. Furthermore, quick-look data are generated, as well as a webpage with an overview of the observations and their statistics. All the processed data are stored online at the GREGOR GFPI and HiFI data archive of the Leibniz Institute for Astrophysics Potsdam (AIP). The principles of the pipeline are presented together with selected high-resolution spectral scans and images processed with sTools.

  11. Estimation of the effects of a lead vest on dose reduction for radiation workers using Monte Carlo calculations

    International Nuclear Information System (INIS)

    Young-khi, Lim; Byoung-il, Lee; Jeong-in, Kim

    2008-01-01

    Full text: In the field of medical diagnosis or treatment using radiation, lead vests or aprons are widely used to protect patients or workers from unwanted irradiation. In nuclear power plants, it is likewise recommended that workers wear a lead vest to reduce the dose received while working in high radiation areas. Generally, personal dosimeters are used to estimate worker doses, but these cannot give absolute values, so measured values must be adjusted with conversion factors derived from reference conditions. Many attempts have been made to estimate the doses of workers wearing lead shielding using two or more dosimeters at different locations, but these had limitations. In this study, the personal dose with and without a lead vest, and the vest's effectiveness, were evaluated by Monte Carlo methods. A lead vest that had been used at several nuclear sites was modelled with the MIRD-V and a typical Korean voxel phantom using the MCNP-5 transport code. Organ doses were calculated in AP, PA, RLAT and LLAT irradiation geometries for several parallel photon beams. Irradiation experiments were also carried out using a real typical Korean phantom with the lead vest, and the results were compared with those calculated by simulation. In most cases, the lead vest decreases the organ doses by about 30%. At low energies the lead vest is very effective at reducing dose, but it is less effective for high-energy photon shielding. For the thyroid, on the contrary, doses from high-energy photons increased by 5%. This study may be applied to the better design of personal shielding and dose estimation procedures for practical use. (author)
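
    The energy dependence reported above is the familiar behaviour of exponential photon attenuation. A minimal sketch follows; the narrow-beam linear attenuation coefficients for lead are rough textbook-order values for illustration, and broad-beam buildup is ignored:

      import math

      # Approximate linear attenuation coefficients for lead (1/mm), keyed by
      # photon energy in MeV; illustrative magnitudes only.
      MU_LEAD_PER_MM = {0.1: 6.0, 0.5: 0.17, 1.0: 0.08}

      def transmitted_fraction(energy_mev, thickness_mm):
          """Narrow-beam attenuation: I/I0 = exp(-mu * t)."""
          return math.exp(-MU_LEAD_PER_MM[energy_mev] * thickness_mm)

      for e in (0.1, 0.5, 1.0):
          print(f"{e} MeV photons through 0.5 mm Pb: "
                f"{100 * transmitted_fraction(e, 0.5):.0f}% transmitted")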

  12. EucaTool®, a cloud computing application for estimating the growth and production of Eucalyptus globulus Labill. plantations in Galicia (NW Spain)

    Directory of Open Access Journals (Sweden)

    Alberto Rojo-Alboreca

    2015-12-01

    Aim of study: To present the software utilities and explain how to use EucaTool®, a free cloud computing application developed to estimate the growth and production of seedling and clonal blue gum (Eucalyptus globulus Labill.) plantations in Galicia (NW Spain). Area of study: Galicia (NW Spain). Material and methods: EucaTool® implements a dynamic growth and production model that is valid for clonal and non-clonal blue gum plantations in the region. The model integrates transition functions for dominant height (site index curves), number of stems per hectare (mortality function) and basal area, as well as output functions for tree and stand volume, biomass and carbon content. Main results: EucaTool® can be freely accessed from any device with an Internet connection, from http://app.eucatool.com. In addition, useful information about the application is published on a related website: http://www.eucatool.com. Research highlights: The application has been designed to enable forest stakeholders to estimate volume, biomass and carbon content of forest plantations from individual trees, diameter classes or stand data, as well as to estimate growth and future production (indicating the optimal rotation age for maximum income) by measurement of only four stand variables: age, number of trees per hectare, dominant height and basal area. Keywords: forest management; biomass; seedling; clones; blue gum; forest tool.
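
    A toy Python sketch of such a transition-function projection follows; the functional forms and coefficients are placeholders for illustration only, not the fitted EucaTool® equations:

      import math

      def project_dominant_height(h0, t0, t1, k=0.1):
          """Algebraic-difference, site-index-style projection (placeholder form)."""
          return h0 * (1 - math.exp(-k * t1)) / (1 - math.exp(-k * t0))

      def project_stems_per_ha(n0, t0, t1, mortality_rate=0.02):
          """Simple exponential survival (mortality) function, placeholder rate."""
          return n0 * math.exp(-mortality_rate * (t1 - t0))

      # Project a stand measured at age 8 forward to age 12.
      print(project_dominant_height(18.0, 8.0, 12.0),
            project_stems_per_ha(1100.0, 8.0, 12.0))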

  13. Estimating Potential Reductions in Premature Mortality in New York City From Raising the Minimum Wage to $15.

    Science.gov (United States)

    Tsao, Tsu-Yu; Konty, Kevin J; Van Wye, Gretchen; Barbot, Oxiris; Hadler, James L; Linos, Natalia; Bassett, Mary T

    2016-06-01

    To assess potential reductions in premature mortality that could have been achieved in 2008 to 2012 if the minimum wage had been $15 per hour in New York City. Using the 2008 to 2012 American Community Survey, we performed simulations to assess how the proportion of low-income residents in each neighborhood might change with a hypothetical $15 minimum wage under alternative assumptions of labor market dynamics. We developed an ecological model of premature death to determine the differences between the levels of premature mortality as predicted by the actual proportions of low-income residents in 2008 to 2012 and the levels predicted by the proportions of low-income residents under a hypothetical $15 minimum wage. A $15 minimum wage could have averted 2800 to 5500 premature deaths between 2008 and 2012 in New York City, representing 4% to 8% of total premature deaths in that period. Most of these avertable deaths would be realized in lower-income communities, in which residents are predominantly people of color. A higher minimum wage may have substantial positive effects on health and should be considered as an instrument to address health disparities.

  14. Estimation of CO2 reduction by parallel hard-type power hybridization for gasoline and diesel vehicles.

    Science.gov (United States)

    Oh, Yunjung; Park, Junhong; Lee, Jong Tae; Seo, Jigu; Park, Sungwook

    2017-10-01

    The purpose of this study is to investigate possible improvements in ICEVs by implementing fuzzy logic-based parallel hard-type power hybrid systems. Two types of conventional ICEVs (gasoline and diesel) and two types of HEVs (gasoline-electric, diesel-electric) were generated using vehicle and powertrain simulation tools and a Matlab-Simulink application programming interface. For the gasoline and gasoline-electric HEV vehicles, the prediction accuracy of four types of LDV models was validated by comparative analysis with chassis dynamometer and OBD test data. The predicted results show strong correlation with the test data. The operating points of the internal combustion engines and electric motors are well controlled in the high-efficiency region, and battery SOC was well controlled within ±1.6%. For the diesel case, however, a virtual diesel-electric HEV was generated, because no vehicle with engine and vehicle specifications similar to the ICE vehicle was available. Using a fuzzy logic-based parallel hybrid system in conventional ICEVs demonstrated that HEVs showed superior performance in terms of fuel consumption and CO2 emission in most driving modes. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Community-based field implementation scenarios of a short message service reporting tool for lymphatic filariasis case estimates in Africa and Asia.

    Science.gov (United States)

    Mableson, Hayley E; Martindale, Sarah; Stanton, Michelle C; Mackenzie, Charles; Kelly-Hope, Louise A

    2017-01-01

    Lymphatic filariasis (LF) is a neglected tropical disease (NTD) targeted for global elimination by 2020. Currently there is considerable international effort to scale-up morbidity management activities in endemic countries; however, there remains a need for rapid, cost-effective methods and adaptable tools for obtaining estimates of people presenting with clinical manifestations of LF, namely lymphoedema and hydrocele. The mHealth tool 'MeasureSMS-Morbidity' allows health workers in endemic areas to use their own mobile phones to send clinical information in a simple format using short message service (SMS). The experience gained through programmatic use of the tool in five endemic countries across a diversity of settings in Africa and Asia is used here to present implementation scenarios that are suitable for adapting the tool for use in a range of different programmatic, endemic, demographic and health system settings. A checklist of five key factors and sub-questions was used to determine and define specific community-based field implementation scenarios for using the MeasureSMS-Morbidity tool in a range of settings. These factors included: (I) tool feasibility (acceptability; community access and ownership); (II) LF endemicity (high; low prevalence); (III) population demography (urban; rural); (IV) health system structure (human resources; community access); and (V) integration with other diseases (co-endemicity). Based on experiences in Bangladesh, Ethiopia, Malawi, Nepal and Tanzania, four implementation scenarios were identified as suitable for using the MeasureSMS-Morbidity tool for searching and reporting LF clinical case data across a range of programmatic, endemic, demographic and health system settings. These include: (I) urban, high-endemic setting with two-tier reporting; (II) rural, high-endemic setting with one-tier reporting; (III) rural, high-endemic setting with two-tier reporting; and (IV) low-endemic, urban and rural setting with one-tier reporting.

  16. Estimating energy intensity and CO2 emission reduction potentials in the manufacturing sectors in Thailand

    Energy Technology Data Exchange (ETDEWEB)

    Wangskarn, P.; Khummongkol, P.; Schrattenholzer, L. [and others]

    1996-12-31

    The final energy consumption in Thailand increased at about ten percent annually within the last 10 years. To slow the energy demand growth rate while maintaining the country's economic advance and environmental sustainability, the Energy Conservation Promotion Act (ECPA) was adopted in 1992. With this Act, a comprehensive Energy Conservation Program (ENCON) was initiated. ENCON commits the government to promoting energy conservation, to developing appropriate regulations, and to providing financial and organizational resources for program implementation. Through the ENCON program, substantial benefits are expected not only in reduced energy consumption but also in reduced GHG emissions. This study is part of the ENCON research program, which was supported by the German Federal Government under the program called Prompt-Start Measures to Implement the U.N. Framework Convention on Climate Change (FCCC). The basic activities carried out during the project included (1) an assessment of Thailand's total and specific energy consumption in the industrial sectors and commercial buildings; (2) identification of existing and candidate technologies for GHG emission reduction and energy efficiency improvements in specific factories and commercial buildings; and (3) identification of individual factories and commercial buildings as candidates for detailed further study. Although the energy assessment was also carried out for commercial buildings, this paper covers only the work on the manufacturing sector. On the basis of these steps, 14 factories were visited by the project team and preliminary energy audits were performed. As a result, concrete measures and investments were proposed and classified into two groups according to their economic characteristics. Investments with a payback time of less than four years were considered together in a Moderate scenario, and those with longer payback times in an Intensive scenario.
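
    The scenario split described above reduces to a simple payback computation; the measures and figures in this sketch are invented for illustration:

      # Hypothetical efficiency measures; capex and savings are invented.
      measures = [
          {"name": "boiler heat recovery", "capex": 120_000, "annual_saving": 48_000},
          {"name": "VSD retrofit", "capex": 60_000, "annual_saving": 22_000},
          {"name": "process integration", "capex": 400_000, "annual_saving": 65_000},
      ]

      for m in measures:
          payback_years = m["capex"] / m["annual_saving"]
          scenario = "Moderate" if payback_years < 4 else "Intensive"
          print(f"{m['name']}: payback {payback_years:.1f} y -> {scenario}")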

  17. The power tool

    International Nuclear Information System (INIS)

    HAYFIELD, J.P.

    1999-01-01

    POWER Tool--Planning, Optimization, Waste Estimating and Resourcing tool, a hand-held field estimating unit and relational database software tool for optimizing disassembly and final waste form of contaminated systems and equipment.

  18. The estimated reduction in the odds of loss-of-control type crashes for sport utility vehicles equipped with electronic stability control.

    Science.gov (United States)

    Green, Paul E; Woodrooffe, John

    2006-01-01

    Using data from the NASS General Estimates System (GES), the method of induced exposure was used to assess the effects of electronic stability control (ESC) on loss-of-control type crashes for sport utility vehicles. Sport utility vehicles were classified into crash types generally associated with loss of control and crash types most likely not associated with loss of control. Vehicles were then compared as to whether ESC technology was present or absent in the vehicles. A generalized additive model was fit to assess the effects of ESC, driver age, and driver gender on the odds of loss of control. In addition, the effects of ESC on roads that were not dry were compared to effects on roads that were dry. Overall, the estimated percentage reduction in the odds of a loss-of-control crash for sport utility vehicles equipped with ESC was 70.3%. Both genders and all age groups showed reduced odds of loss-of-control crashes, but there was no significant difference between males and females. With respect to driver age, the maximum percentage reduction of 73.6% occurred at age 27. The positive effects of ESC on roads that were not dry were significantly greater than on roads that were dry.
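
    As a compact illustration of the induced-exposure comparison, the sketch below computes the odds ratio from a 2x2 table of loss-of-control (LOC) vs non-LOC crash counts for ESC and non-ESC vehicles; the counts are invented, chosen only to land near the reported reduction of about 70%:

      # Hypothetical GES-style crash counts.
      loc_esc, nonloc_esc = 120, 2000
      loc_no_esc, nonloc_no_esc = 900, 4400

      # Odds of a LOC-type crash with ESC relative to without ESC.
      odds_ratio = (loc_esc / nonloc_esc) / (loc_no_esc / nonloc_no_esc)
      print(f"OR = {odds_ratio:.3f}; estimated reduction in LOC odds = "
            f"{100 * (1 - odds_ratio):.1f}%")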

  19. Assessing the Effect of Potential Reductions in Non-Hepatic Mortality on the Estimated Cost-Effectiveness of Hepatitis C Treatment in Early Stages of Liver Disease

    Science.gov (United States)

    Chesson, Harrell W.; Spradling, Philip R.; Holmberg, Scott D.

    2018-01-01

    Background Most cost-effectiveness analyses of hepatitis C (HCV) therapy focus on the benefits of reducing liver-related morbidity and mortality. Objectives Our objective was to assess how cost-effectiveness estimates of HCV therapy can vary depending on assumptions regarding the potential impact of HCV therapy on non-hepatic mortality. Methods We adapted a state-transition model to include potential effects of HCV therapy on non-hepatic mortality. We assumed successful treatment could reduce non-hepatic mortality by as little as 0% to as much as 100%. Incremental cost-effectiveness ratios were computed comparing immediate treatment versus delayed treatment and comparing immediate treatment versus non-treatment. Results Comparing immediate treatment versus delayed treatment, when we included a 44% reduction in non-hepatic mortality following successful HCV treatment, the incremental cost per quality-adjusted life year (QALY) gained by HCV treatment fell by 76% (from US$314,100 to US$76,900) for patients with no fibrosis and by 43% (from US$62,500 to US$35,800) for patients with moderate fibrosis. Comparing immediate treatment versus non-treatment, assuming a 44% reduction in non-hepatic mortality following successful HCV treatment, the incremental cost per QALY gained by HCV treatment fell by 64% (from US$186,700 to US$67,300) for patients with no fibrosis and by 27% (from US$35,000 to US$25,500) for patients with moderate fibrosis. Conclusion Including reductions in non-hepatic mortality from HCV treatment can have substantial effects on the estimated cost-effectiveness of treatment. PMID:27480538
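
    The headline numbers are incremental cost-effectiveness ratios; the sketch below shows the arithmetic, with hypothetical cost and QALY inputs chosen only to reproduce the reported scale for no-fibrosis patients:

      def icer(cost_treat, cost_comp, qaly_treat, qaly_comp):
          """Incremental cost-effectiveness ratio: delta cost / delta QALY."""
          return (cost_treat - cost_comp) / (qaly_treat - qaly_comp)

      # Hypothetical inputs: treatment adds US$50,000 and 0.65 QALYs.
      print(f"US${icer(60_000, 10_000, 11.65, 11.00):,.0f} per QALY gained")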

  20. Calliphora vicina (Diptera: Calliphoridae) pupae: a timeline of external morphological development and a new age and PMI estimation tool.

    Science.gov (United States)

    Brown, Katherine; Thorne, Alan; Harvey, Michelle

    2015-07-01

    The minimum postmortem interval (PMI(min)) is commonly estimated using calliphorid larvae, for which there are established age estimation methods based on morphological and developmental data. Despite the increased duration and sedentary nature of the pupal stage of the blowfly, morphological age estimation methods are poorly documented and infrequently used for PMI determination. The aim of this study was to develop a timeline of metamorphosis, focusing on the development of external morphology (within the puparium), to provide a means of age and PMI estimation for Calliphora vicina (Robineau-Desvoidy) pupae. Under controlled conditions, 1,494 pupae were reared and sampled at regular time intervals. After puparium removal, observations of 23 external metamorphic developments were correlated to age in accumulated degree hours (ADH). Two age estimation methods were developed based on (1) the combination of possible age ranges observed for each characteristic and (2) regression analyses to generate age estimation equations employing all 23 characteristics observed and a subset of ten characteristics most significantly correlated with age. Blind sample analysis indicated that, using the combination of both methods, pupal age could be estimated to within ±500 ADH with 95% reliability.
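
    A minimal sketch of the accumulated-degree-hours (ADH) bookkeeping behind such estimates follows; the base temperature and the regression coefficients are placeholders, not the paper's fitted values:

      def accumulated_degree_hours(temps_c, interval_h, base_c=0.0):
          """Sum temperature above a developmental threshold over each interval."""
          return sum(max(t - base_c, 0.0) * interval_h for t in temps_c)

      def adh_from_morphology(score, slope=150.0, intercept=900.0):
          """Invert a morphology-score-vs-ADH regression (placeholder values)."""
          return intercept + slope * score

      # ADH accrued over three 8-hour rearing intervals, and the age implied
      # by a hypothetical morphology score of 14.
      print(accumulated_degree_hours([21.0, 22.5, 20.0], 8.0),
            adh_from_morphology(14.0))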

  1. Reduction of the estimated radiation dose and associated patient risk with prospective ECG-gated 256-slice CT coronary angiography

    International Nuclear Information System (INIS)

    Efstathopoulos, E P; Kelekis, N L; Pantos, I; Brountzos, E; Argentos, S; Grebac, J; Ziaka, D; Seimenis, I; Katritsis, D G

    2009-01-01

    Computed tomography (CT) coronary angiography has been widely used since the introduction of 64-slice scanners and dual-source CT technology, but high radiation doses have been reported. Prospective ECG-gating using a 'step-and-shoot' axial scanning protocol has been shown to reduce radiation exposure effectively while maintaining diagnostic accuracy. 256-slice scanners with 80 mm detector coverage have recently been introduced into practice, but their impact on radiation exposure has not been adequately studied. The aim of this study was to assess radiation doses associated with CT coronary angiography using a 256-slice CT scanner. Radiation doses were estimated for 25 patients scanned with either prospective or retrospective ECG-gating. Image quality was assessed objectively in terms of mean CT attenuation at selected regions of interest on axial coronary images, and subjectively by coronary segment quality scoring. It was found that radiation doses associated with prospective ECG-gating were significantly lower than with retrospective ECG-gating (3.2 ± 0.6 mSv versus 13.4 ± 2.7 mSv). Consequently, the radiogenic fatal cancer risk for the patient is much lower with prospective gating (0.0176% versus 0.0737%). No statistically significant differences in image quality were observed between the two scanning protocols for either objective or subjective quality assessments. Therefore, prospective ECG-gating using a 'step-and-shoot' protocol that covers the cardiac anatomy in two axial acquisitions effectively reduces radiation doses in 256-slice CT coronary angiography without compromising image quality.
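
    The quoted risks are consistent with applying a nominal fatal-cancer risk coefficient of roughly 5.5% per sievert (the order of the ICRP 103 nominal value) to the effective dose; this coefficient is our inference, not stated by the authors. A short sketch:

      # ~5.5%/Sv expressed per mSv; an assumed nominal risk coefficient.
      RISK_PER_MSV = 0.055 / 1000.0

      for protocol, dose_msv in (("prospective", 3.2), ("retrospective", 13.4)):
          risk_pct = 100 * dose_msv * RISK_PER_MSV
          print(f"{protocol}: {dose_msv} mSv -> {risk_pct:.4f}% lifetime risk")
      # prints 0.0176% and 0.0737%, matching the figures reported above.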

  2. Estimating the effectiveness of pulmonary rehabilitation for COPD exacerbations: reduction of hospital inpatient days during the following year

    Directory of Open Access Journals (Sweden)

    Katajisto M

    2017-09-01

    Aims: To study the short- and long-term results of pulmonary rehabilitation (PR) given in the Helsinki University Heart and Lung Center and to understand the hospital resources used to treat severe COPD exacerbations in the city of Helsinki. Materials and methods: Seventy-eight inactive patients with severe COPD were recruited for a PR course; three of them did not finish the course. The course took 6–8 weeks and included 11–16 supervised exercise sessions. Using electronic medical records, we studied all COPD patients with hospital admission in the city of Helsinki in 2014, including COPD diagnosis, criteria for exacerbation, and potential exclusion/inclusion criteria for PR. Results: Seventy-five of the patients finished the PR course, and 92% of those patients showed clinically significant improvement. Their hospital days were reduced by 54% when compared to the year before. One year after the course, 53% of the patients reported that they had continued with regular exercise training. In the city of Helsinki, 437 COPD patients were treated in a hospital due to exacerbation during 2014. On the basis of their electronic medical records, 57% of them would be suitable for PR. According to a rough estimate, 10%–20% of hospital days could be saved annually if PR were available to all, assuming that the PR results would be as good as those shown here. Conclusions: The study showed that in a real-world setting, PR is efficient when measured by saved hospital days in severe COPD. Half of the patients could be motivated to continue exercising on their own. Keywords: COPD, severe exacerbation, pulmonary rehabilitation, physical inactivity

  3. Estimated medical cost reductions for paliperidone palmitate vs placebo in a randomized, double-blind relapse-prevention trial of patients with schizoaffective disorder.

    Science.gov (United States)

    Joshi, K; Lin, J; Lingohr-Smith, M; Fu, D J

    2015-01-01

    The objective of this economic model was to estimate the difference in medical costs among patients treated with paliperidone palmitate once-monthly injectable antipsychotic (PP1M) vs placebo, based on clinical event rates reported in the 15-month randomized, double-blind, placebo-controlled, parallel-group study of paliperidone palmitate evaluating time to relapse in subjects with schizoaffective disorder. Rates of psychotic, depressive, and/or manic relapses and of serious and non-serious treatment-emergent adverse events (TEAEs) were obtained from the long-term paliperidone palmitate vs placebo relapse-prevention study. The total annual medical cost for a relapse, from a US payer perspective, was obtained from published literature, and the costs for serious and non-serious TEAEs were based on Common Procedure Terminology codes. Total annual medical cost differences for patients treated with PP1M vs placebo were then estimated. Additionally, one-way and Monte Carlo sensitivity analyses were conducted. Lower rates of relapse (-18.3%) and serious TEAEs (-3.9%) were associated with use of PP1M vs placebo, as reported in the long-term relapse-prevention study. As a result of the reduction in these clinical event rates, the total annual medical cost was reduced by $7140 per patient treated with PP1M vs placebo. One-way sensitivity analysis showed that variations in relapse rates had the greatest impact on the estimated medical cost differences (range: -$9786, -$4670). All of the 10,000 random cycles of Monte Carlo simulations showed a medical cost difference favoring PP1M. Treatment of schizoaffective disorder with PP1M was thus associated with a significantly lower rate of relapse and a reduction in medical costs compared to placebo. Further evaluation in a real-world setting is warranted.
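
    The reported saving is essentially a sum of event-rate differences times unit costs. In the sketch below the rate differences come from the abstract, while the unit costs are placeholders picked only to land near the reported $7140:

      # Event-rate differences (PP1M minus placebo) reported above.
      RATE_DIFF = {"relapse": -0.183, "serious_teae": -0.039}
      # Hypothetical per-event annual medical costs (US$), for illustration.
      UNIT_COST = {"relapse": 35_000.0, "serious_teae": 19_000.0}

      saving = -sum(RATE_DIFF[e] * UNIT_COST[e] for e in RATE_DIFF)
      print(f"estimated annual medical cost reduction: ${saving:,.0f} per patient")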

  4. Tidal Mixing Box Submodel for Tampa Bay: Calibration of Tidal Exchange Flows with the Parameter Estimation Tool (PEST)

    Science.gov (United States)

    In the mid-1990s the Tampa Bay Estuary Program proposed a nutrient reduction strategy focused on improving water clarity to promote seagrass expansion within Tampa Bay. A System Dynamics Model is being developed to evaluate spatially and temporally explicit impacts of nutrient reduction…

  5. The GAAS Metagenomic Tool and Its Estimations of Viral and Microbial Average Genome Size in Four Major Biomes

    OpenAIRE

    Angly, Florent E.; Willner, Dana; Prieto-Davó, Alejandra; Edwards, Robert A.; Schmieder, Robert; Vega-Thurber, Rebecca; Antonopoulos, Dionysios A.; Barott, Katie; Cottrell, Matthew T.; Desnues, Christelle; Dinsdale, Elizabeth A.; Furlan, Mike; Haynes, Matthew; Henn, Matthew R.; Hu, Yongfei

    2009-01-01

    Metagenomic studies characterize both the composition and diversity of uncultured viral and microbial communities. BLAST-based comparisons have typically been used for such analyses; however, sampling biases, high percentages of unknown sequences, and the use of arbitrary thresholds to find significant similarities can decrease the accuracy and validity of estimates. Here, we present Genome relative Abundance and Average Size (GAAS), a complete software package that provides improved estimates…

  6. The Massachusetts Sustainable-Yield Estimator: A decision-support tool to assess water availability at ungaged stream locations in Massachusetts

    Science.gov (United States)

    Archfield, Stacey A.; Vogel, Richard M.; Steeves, Peter A.; Brandt, Sara L.; Weiskel, Peter K.; Garabedian, Stephen P.

    2010-01-01

    Federal, State and local water-resource managers require a variety of data and modeling tools to better understand water resources. The U.S. Geological Survey, in cooperation with the Massachusetts Department of Environmental Protection, has developed a statewide, interactive decision-support tool to meet this need. The decision-support tool, referred to as the Massachusetts Sustainable-Yield Estimator (MA SYE), provides screening-level estimates of the sustainable yield of a basin, defined as the difference between the unregulated streamflow and some user-specified quantity of water that must remain in the stream to support such functions as recreational activities or aquatic habitat. The MA SYE tool was designed, in part, because the quantity of surface water available in a basin is a time-varying quantity subject to competing demands for water. To compute sustainable yield, the MA SYE tool estimates a daily time series of unregulated, daily mean streamflow for a 44-year period of record spanning October 1, 1960, through September 30, 2004. Selected streamflow quantiles from an unregulated, daily flow-duration curve are estimated by solving six regression equations that are a function of physical and climate basin characteristics at an ungaged site on a stream of interest. Streamflow is then interpolated between the estimated quantiles to obtain a continuous daily flow-duration curve. A time series of unregulated daily streamflow subsequently is created by transferring the timing of the daily streamflow at a reference streamgage to the ungaged site by equating exceedence probabilities of contemporaneous flow at the two locations. One of 66 reference streamgages is selected by kriging, a geostatistical method, which is used to map the spatial relation among correlations between the time series of the logarithm of daily streamflows at each reference streamgage and the ungaged site. Estimated unregulated, daily mean streamflows show good agreement with observed streamflows.
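
    A bare-bones Python sketch of that flow-transfer step (often called QPPQ) follows; the flow-duration quantiles and reference flows here are invented, whereas in the real tool the quantiles come from the six regression equations:

      import numpy as np
      from scipy.stats import rankdata

      def transfer_daily_flows(ref_daily, fdc_probs, fdc_flows):
          """Map each day's exceedance probability at the reference gage onto
          the ungaged site's flow-duration curve (fdc_probs must ascend)."""
          n = len(ref_daily)
          exceedance = 1.0 - (rankdata(ref_daily) - 0.5) / n
          return np.interp(exceedance, fdc_probs, fdc_flows)

      ref = np.array([12.0, 30.0, 7.5, 18.0])    # daily flows at the gage
      probs = np.array([0.05, 0.50, 0.95])       # exceedance probabilities
      flows = np.array([40.0, 9.0, 1.5])         # regression-based quantiles
      print(transfer_daily_flows(ref, probs, flows))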

  7. Phylogenetic Reconstruction as a Broadly Applicable Teaching Tool in the Biology Classroom: The Value of Data in Estimating Likely Answers

    Science.gov (United States)

    Julius, Matthew L.; Schoenfuss, Heiko L.

    2006-01-01

    This laboratory exercise introduces students to a fundamental tool in evolutionary biology--phylogenetic inference. Students are required to create a data set via observation and through mining preexisting data sets. These student data sets are then used to develop and compare competing hypotheses of vertebrate phylogeny. The exercise uses readily…

  8. Numerical tools to estimate the flux of a gas across the air–water interface and assess the heterogeneity of its forcing functions

    Directory of Open Access Journals (Sweden)

    V. M. N. C. S. Vieira

    2013-03-01

    A numerical tool was developed for the estimation of gas fluxes across the air–water interface. The primary objective is to use it to estimate CO2 fluxes; nevertheless, application to other gases is easily accomplished by changing the values of the parameters related to the physical properties of the gases. User-friendly software was developed that allows a custom-made gas flux model, with the preferred parameterizations, to be built upon a standard kernel. These include single- or double-layer models; several numerical schemes for the effects of wind on the air-side and water-side transfer velocities; the effects of atmospheric stability, surface roughness and turbulence from current drag against the bottom; and the effects on solubility of water temperature, salinity, air temperature and pressure. An analysis was also developed which decomposes the difference between the fluxes in a reference situation and in alternative situations into its several forcing functions. This analysis relies on the Taylor expansion of the gas flux model, requiring the numerical estimation of partial derivatives by a multivariate version of the collocation polynomial. Both the flux model and the difference decomposition analysis were tested with data taken from surveys done in the lagoon system of Ria Formosa, south Portugal, in which the CO2 fluxes were estimated using the infrared gas analyzer (IRGA) and floating chamber method, whereas the CO2 concentrations were estimated using the IRGA and degasification chamber. Observations and estimations show a remarkable fit.
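
    For orientation, a single-layer bulk flux can be sketched as F = k · K0 · ΔpCO2. The snippet below uses a Wanninkhof-style quadratic wind parameterization for k; the solubility value is a rough placeholder, and the published tool layers many more effects (stability, roughness, bottom-drag turbulence) on top of this:

      def k_cm_per_h(u10, schmidt):
          """Quadratic wind-speed parameterization, k660 = 0.251*U10^2,
          rescaled by the Schmidt number."""
          return 0.251 * u10**2 * (schmidt / 660.0) ** -0.5

      def co2_flux_mmol_m2_d(u10, schmidt, k0_mol_per_L_atm, pco2_w, pco2_a):
          """Bulk flux (positive = outgassing); pCO2 in microatmospheres."""
          k_m_d = k_cm_per_h(u10, schmidt) * 24.0 / 100.0   # cm/h -> m/d
          k0_mol_m3_atm = k0_mol_per_L_atm * 1000.0         # mol/(m3*atm)
          return k_m_d * k0_mol_m3_atm * (pco2_w - pco2_a) * 1e-6 * 1000.0

      # 6 m/s wind, Sc = 660, K0 ~ 0.035 mol/(L*atm), 40 uatm supersaturation.
      print(round(co2_flux_mmol_m2_d(6.0, 660.0, 0.035, 450.0, 410.0), 2))  # ~3.04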

  9. Racemization of aspartic acid in root dentin as a tool for age estimation in a Kuwaiti population.

    Science.gov (United States)

    Elfawal, Mohamed Amin; Alqattan, Sahib Issa; Ghallab, Noha Ayman

    2015-01-01

    Estimation of age is one of the most significant tasks in forensic practice. Amino acid racemization is considered one of the most reliable and accurate methods of age estimation, and aspartic acid shows a high racemization reaction rate. The present study investigated the application of aspartic acid racemization to age estimation in a Kuwaiti population using root dentin from a total of 89 upper first premolar teeth. The D/L ratio of aspartic acid was obtained by an HPLC technique in a test group of 50 subjects, and a linear regression line was established between aspartic acid racemization and age. The correlation coefficient (r) was 0.97, and the standard error of estimation was ±1.26 years. The racemization age "t" of each subject was calculated by applying the following formula: ln[(1 + D/L)/(1 − D/L)] = 0.003181·t − 0.01591. When the resulting age-estimation formula, t = (ln[(1 + D/L)/(1 − D/L)] + 0.01591)/0.003181, was applied to a validation group of 39 subjects, the range of error was less than one year in 82.1% of the cases and the standard error of estimation was ±1.12. The current work has established a reasonably significant correlation of the D-/L-aspartic acid ratio with age, and proposed an apparently reliable formula for calculating age in Kuwaiti populations through aspartic acid racemization. Further research is required to find out whether similar findings are applicable to other ethnic populations. © The Author(s) 2014.
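
    The published regression can be inverted directly; a minimal sketch follows (the D/L value in the example is illustrative):

      import math

      def racemization_age_years(d_over_l):
          """Solve ln[(1 + D/L)/(1 - D/L)] = 0.003181*t - 0.01591 for t,
          using the regression reported above."""
          return (math.log((1 + d_over_l) / (1 - d_over_l)) + 0.01591) / 0.003181

      print(round(racemization_age_years(0.055), 1))   # ~39.6 years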

  10. SU-G-IeP3-05: Effects of Image Receptor Technology and Dose Reduction Software On Radiation Dose Estimates for Fluoroscopically-Guided Interventional (FGI) Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Merritt, Z; Dave, J; Eschelman, D; Gonsalves, C [Thomas Jefferson University, Philadelphia, PA (United States)]

    2016-06-15

    Purpose: To investigate the effects of image receptor technology and dose reduction software on radiation dose estimates for the most frequently performed fluoroscopically-guided interventional (FGI) procedures at a tertiary health care center. Methods: IRB approval was obtained for retrospective analysis of FGI procedures performed in the interventional radiology suites between January 2011 and December 2015. This included procedures performed using image-intensifier (II) based systems which were subsequently replaced, flat-panel-detector (FPD) based systems which were later upgraded with ClarityIQ dose reduction software (Philips Healthcare), and a relatively new FPD system already equipped with ClarityIQ. Post procedure, technologists entered the system-reported cumulative air kerma (CAK) and kerma-area product (KAP; only KAP for II-based systems) in the RIS; these values were analyzed. Data pre-processing included correcting typographical errors and cross-verifying CAK and KAP. The most frequent high- and low-dose FGI procedures were identified and the corresponding CAK and KAP values were compared. Results: Out of 27,251 procedures within this time period, the most frequent high- and low-dose procedures were chemo/immuno-embolization (n=1967) and abscess drainage (n=1821). Mean KAP values for embolization and abscess drainage procedures were 260,657, 310,304 and 94,908 mGy·cm², and 14,497, 15,040 and 6307 mGy·cm², using II-, FPD- and FPD with ClarityIQ-based systems, respectively. Statistically significant differences were observed in KAP values for embolization procedures with respect to the different systems, but for abscess drainage procedures significant differences were only noted between systems with FPD and FPD with ClarityIQ (p<0.05). Mean CAK was reduced significantly from 823 to 308 mGy and from 43 to 21 mGy for embolization and abscess drainage procedures, respectively, in transitioning to FPD systems with ClarityIQ (p<0.05). Conclusion: While transitioning from II- to FPD-based…

  11. Dengue prediction by the web: Tweets are a useful tool for estimating and forecasting Dengue at country and city level.

    Directory of Open Access Journals (Sweden)

    Cecilia de Almeida Marques-Toledo

    2017-07-01

    Full Text Available Infectious diseases are a leading threat to public health. Accurate and timely monitoring of disease risk and progress can reduce their impact. Mentioning a disease in social networks is correlated with physician visits by patients, and can be used to estimate disease activity. Dengue is the fastest growing mosquito-borne viral disease, with an estimated annual incidence of 390 million infections, of which 96 million manifest clinically. Dengue burden is likely to increase in the future owing to trends toward increased urbanization, scarce water supplies and, possibly, environmental change. The epidemiological dynamic of Dengue is complex and difficult to predict, partly due to costly and slow surveillance systems. In this study, we aimed to quantitatively assess the usefulness of data acquired by Twitter for the early detection and monitoring of Dengue epidemics, at both country and city level on a weekly basis. Here, we evaluated and demonstrated the potential of tweet modeling for Dengue estimation and forecast, in comparison with other available web-based data, Google Trends and Wikipedia access logs. Also, we studied the factors that might influence the goodness-of-fit of the model. We built a simple model based on tweets that was able to 'nowcast', i.e. estimate disease numbers in the same week, but also 'forecast' disease in future weeks. At the country level, tweets are strongly associated with Dengue cases, and can estimate present and future Dengue cases up to 8 weeks in advance. At city level, tweets are also useful for estimating Dengue activity. Our model can be applied successfully to small and less developed cities, suggesting a robust construction, even though it may be influenced by the incidence of the disease, the activity of Twitter locally, and social factors, including human development index and internet access. The association of tweets with Dengue cases is valuable to assist traditional Dengue surveillance at real-time and low-cost.

  12. Dengue prediction by the web: Tweets are a useful tool for estimating and forecasting Dengue at country and city level.

    Science.gov (United States)

    Marques-Toledo, Cecilia de Almeida; Degener, Carolin Marlen; Vinhal, Livia; Coelho, Giovanini; Meira, Wagner; Codeço, Claudia Torres; Teixeira, Mauro Martins

    2017-07-01

    Infectious diseases are a leading threat to public health. Accurate and timely monitoring of disease risk and progress can reduce their impact. Mentioning a disease in social networks is correlated with physician visits by patients, and can be used to estimate disease activity. Dengue is the fastest growing mosquito-borne viral disease, with an estimated annual incidence of 390 million infections, of which 96 million manifest clinically. Dengue burden is likely to increase in the future owing to trends toward increased urbanization, scarce water supplies and, possibly, environmental change. The epidemiological dynamic of Dengue is complex and difficult to predict, partly due to costly and slow surveillance systems. In this study, we aimed to quantitatively assess the usefulness of data acquired by Twitter for the early detection and monitoring of Dengue epidemics, at both country and city level on a weekly basis. Here, we evaluated and demonstrated the potential of tweet modeling for Dengue estimation and forecast, in comparison with other available web-based data, Google Trends and Wikipedia access logs. Also, we studied the factors that might influence the goodness-of-fit of the model. We built a simple model based on tweets that was able to 'nowcast', i.e. estimate disease numbers in the same week, but also 'forecast' disease in future weeks. At the country level, tweets are strongly associated with Dengue cases, and can estimate present and future Dengue cases up to 8 weeks in advance. At city level, tweets are also useful for estimating Dengue activity. Our model can be applied successfully to small and less developed cities, suggesting a robust construction, even though it may be influenced by the incidence of the disease, the activity of Twitter locally, and social factors, including human development index and internet access. The association of tweets with Dengue cases is valuable to assist traditional Dengue surveillance at real-time and low-cost. Tweets are
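
    As a rough illustration of the kind of model described (a simple regression of weekly case counts on weekly tweet counts, shifted by a forecast horizon), here is a sketch with entirely hypothetical data; the published model is more elaborate:

```python
import numpy as np

# Hypothetical weekly series: Dengue-related tweet counts and reported cases
tweets = np.array([120, 180, 260, 400, 630, 810, 700, 520, 330, 210], dtype=float)
cases  = np.array([ 35,  52,  80, 130, 190, 240, 205, 150,  95,  60], dtype=float)

def fit_forecast(tweets, cases, horizon):
    """Least-squares fit of cases[t + horizon] on tweets[t] (horizon=0 is a nowcast)."""
    x = tweets[: len(tweets) - horizon] if horizon else tweets
    y = cases[horizon:]
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

slope, intercept = fit_forecast(tweets, cases, horizon=2)  # forecast 2 weeks ahead
print(f"cases(t+2) = {slope:.3f} * tweets(t) + {intercept:.1f}")
```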

  13. Quantitative Estimation of Yeast on Maxillary Denture in Patients with Denture Stomatitis and the Effect of Chlorhexidine Gluconate in Reduction of Yeast

    Directory of Open Access Journals (Sweden)

    Jaykumar R Gade

    2011-01-01

    Full Text Available Denture stomatitis is a condition associated with wearing of a denture. Predisposing factors leading to denture stomatitis include poor oral hygiene, an ill-fitting denture and relief areas. Around 30 patients with denture stomatitis were advised to rinse with chlorhexidine gluconate mouthwash for 14 days and to immerse the upper denture in the chlorhexidine solution for 8 hours. Samples were collected by scraping the maxillary denture in saline at three intervals: prior to treatment, at the end of 24 hours, and after 14 days of treatment. The samples were then inoculated, and quantitative estimation of the yeast growth on Sabouraud's dextrose agar plates was performed. It was observed that after a period of 14 days there was a reduction in the growth of yeast as well as an improvement in the clinical picture of the oral mucosa.

  14. Prediction of toxicity and comparison of alternatives using WebTEST (Web-services Toxicity Estimation Software Tool)

    Science.gov (United States)

    A Java-based web service is being developed within the US EPA’s Chemistry Dashboard to provide real time estimates of toxicity values and physical properties. WebTEST can generate toxicity predictions directly from a simple URL which includes the endpoint, QSAR method, and ...

  15. Breast dose reduction for chest CT by modifying the scanning parameters based on the pre-scan size-specific dose estimate (SSDE)

    Energy Technology Data Exchange (ETDEWEB)

    Kidoh, Masafumi; Utsunomiya, Daisuke; Oda, Seitaro; Nakaura, Takeshi; Yuki, Hideaki; Hirata, Kenichiro; Namimoto, Tomohiro; Sakabe, Daisuke; Hatemura, Masahiro; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Faculty of Life Sciences, Honjo, Kumamoto (Japan); Funama, Yoshinori [Kumamoto University, Department of Medical Physics, Faculty of Life Sciences, Honjo, Kumamoto (Japan)

    2017-06-15

    To investigate the usefulness of modifying scanning parameters based on the size-specific dose estimate (SSDE) for breast dose reduction in chest CT. We scanned 26 women with a fixed volume CT dose index (CTDI{sub vol}) protocol (15 mGy) and another 26 with a fixed SSDE protocol (15 mGy) (protocols 1 and 2, respectively). In protocol 2, the tube current was calculated based on the patient habitus obtained from scout images. We compared the mean breast dose and the inter-patient breast dose variability, and performed linear regression analysis of the breast dose against the body mass index (BMI) for the two protocols. The mean breast dose was about 35 % lower under protocol 2 than under protocol 1 (10.9 mGy vs. 16.8 mGy, p < 0.01). The inter-patient breast dose variability was significantly lower under protocol 2 than under protocol 1 (1.2 mGy vs. 2.5 mGy, p < 0.01). We observed a moderate negative correlation between breast dose and BMI under protocol 1 (r = 0.43, p < 0.01); there was no significant correlation (r = 0.06, p = 0.35) under protocol 2. The SSDE-based protocol achieved a reduction in breast dose and in inter-patient breast dose variability. (orig.)
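
    The parameter modification rests on the relation SSDE = f(D) × CTDIvol, where f(D) is a size-dependent conversion factor. A hedged sketch of how tube output could be chosen to hit a fixed SSDE, assuming the commonly cited AAPM Report 204 exponential fit for the 32 cm body phantom (the record does not state which conversion factors the study used):

```python
import math

def ssde_conversion_factor(effective_diameter_cm: float) -> float:
    # AAPM Report 204 fit for the 32 cm body phantom (assumed here):
    # f = 3.704369 * exp(-0.03671937 * D)
    return 3.704369 * math.exp(-0.03671937 * effective_diameter_cm)

def ctdivol_for_target_ssde(target_ssde_mgy: float, effective_diameter_cm: float) -> float:
    """CTDIvol (and hence tube current, to first order) needed so that
    SSDE = f(D) * CTDIvol equals the target."""
    return target_ssde_mgy / ssde_conversion_factor(effective_diameter_cm)

# A smaller patient needs a lower CTDIvol to reach the same 15 mGy SSDE
for d in (22.0, 28.0, 34.0):
    print(f"effective diameter {d} cm -> CTDIvol {ctdivol_for_target_ssde(15.0, d):.1f} mGy")
```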

  16. Commissioning the neutron production of a Linac: Development of a simple tool for second cancer risk estimation

    International Nuclear Information System (INIS)

    Romero-Expósito, M.; Sánchez-Nieto, B.; Terrón, J. A.; Lopes, M. C.; Ferreira, B. C.; Grishchuk, D.; Sandín, C.; Moral-Sánchez, S.; Melchor, M.; Domingo, C.

    2015-01-01

    Purpose: Knowing the contribution of neutrons to collateral effects in treatments is both a complex and a mandatory task. This work aims to present an operative procedure for neutron estimates in any facility using a neutron digital detector. Methods: The authors’ previous work established a linear relationship between the total second cancer risk due to neutrons (TRn) and the number of MU of the treatment. Given that the digital detector also presents linearity with MU, its response can be used to determine the TRn per unit MU, denoted m, normally associated with a generic Linac model and radiotherapy facility. Thus, from the number of MU of each patient treatment, the associated risk can be estimated. The feasibility of the procedure was tested by applying it in eight facilities; patients were evaluated as well. Results: From the reading of the detector under selected irradiation conditions, m values were obtained for different machines, ranging from 0.25 × 10⁻⁴ % per MU for an Elekta Axesse at 10 MV to 6.5 × 10⁻⁴ % per MU for a Varian Clinac at 18 MV. Using these values, the TRn of patients was estimated in each facility and compared to that from the individual evaluation. Differences were within the range of uncertainty of the authors’ methodology of equivalent dose and risk estimations. Conclusions: The procedure presented here allows an easy estimation of the second cancer risk due to neutrons for any patient, given the number of MU of the treatment. It will enable the consideration of this information when selecting the optimal treatment for a patient by its implementation in the treatment planning system.
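
    The risk estimate itself is a one-line calculation, TRn = m × MU. A small sketch using the two facility factors quoted above (the 500 MU example is illustrative only):

```python
# Facility-specific risk-per-MU factors m (% per MU), obtained from the detector
# reading under the selected irradiation conditions; values echo the reported range.
m_per_mu = {
    "Elekta Axesse 10 MV": 0.25e-4,   # % per MU
    "Varian Clinac 18 MV": 6.5e-4,    # % per MU
}

def second_cancer_risk(machine: str, monitor_units: float) -> float:
    """Total second cancer risk due to neutrons, TRn = m * MU (in %)."""
    return m_per_mu[machine] * monitor_units

# e.g. a 500 MU treatment on the 18 MV machine
print(f"{second_cancer_risk('Varian Clinac 18 MV', 500):.3f} %")
```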

  17. Commissioning the neutron production of a Linac: Development of a simple tool for second cancer risk estimation

    Energy Technology Data Exchange (ETDEWEB)

    Romero-Expósito, M., E-mail: mariateresa.romero@uab.cat [Departamento de Fisiología Médica y Biofísica, Universidad de Sevilla, Sevilla 41009, Spain and Departament de Física, Universitat Autònoma de Barcelona, Bellaterra 08193 (Spain); Sánchez-Nieto, B. [Instituto de Física, Pontificia Universidad Católica de Chile, Santiago 4880 (Chile); Terrón, J. A. [Servicio de Radiofísica, Hospital Universitario Virgen Macarena, Sevilla 41009 (Spain); Lopes, M. C. [Serviço de Física Médica, Instituto Português de Oncologia, Coimbra 3000-075 (Portugal); Ferreira, B. C. [i3N, Department of Physics, University of Aveiro, Aveiro 3810-193 (Portugal); Grishchuk, D. [Radiotherapy Service, Russian Research Center for Radiology and Surgical Technology, Saint Petersburg 197758 (Russian Federation); Sandín, C. [Elekta, Ltd., Crawley RH10 9RR (United Kingdom); Moral-Sánchez, S. [Servicio de Radiofísica, Instituto Onkologikoa, San Sebastián 20014 (Spain); Melchor, M. [Servicio de Radiofísica, Hospital Universitario de la Ribera, Alzira 46600, Valencia (Spain); Domingo, C. [Departament de Física, Universitat Autònoma de Barcelona, Bellaterra 08193 (Spain); and others

    2015-01-15

    Purpose: Knowing the contribution of neutrons to collateral effects in treatments is both a complex and a mandatory task. This work aims to present an operative procedure for neutron estimates in any facility using a neutron digital detector. Methods: The authors’ previous work established a linear relationship between the total second cancer risk due to neutrons (TR{sup n}) and the number of MU of the treatment. Given that the digital detector also presents linearity with MU, its response can be used to determine the TR{sup n} per unit MU, denoted m, normally associated with a generic Linac model and radiotherapy facility. Thus, from the number of MU of each patient treatment, the associated risk can be estimated. The feasibility of the procedure was tested by applying it in eight facilities; patients were evaluated as well. Results: From the reading of the detector under selected irradiation conditions, m values were obtained for different machines, ranging from 0.25 × 10{sup −4}% per MU for an Elekta Axesse at 10 MV to 6.5 × 10{sup −4}% per MU for a Varian Clinac at 18 MV. Using these values, the TR{sup n} of patients was estimated in each facility and compared to that from the individual evaluation. Differences were within the range of uncertainty of the authors’ methodology of equivalent dose and risk estimations. Conclusions: The procedure presented here allows an easy estimation of the second cancer risk due to neutrons for any patient, given the number of MU of the treatment. It will enable the consideration of this information when selecting the optimal treatment for a patient by its implementation in the treatment planning system.

  18. PAF: A software tool to estimate free-geometry extended bodies of anomalous pressure from surface deformation data

    Science.gov (United States)

    Camacho, A. G.; Fernández, J.; Cannavò, F.

    2018-02-01

    We present a software package to carry out inversions of surface deformation data (any combination of InSAR, GPS, and terrestrial data, e.g., EDM, levelling) as produced by 3D free-geometry extended bodies with anomalous pressure changes. The anomalous structures are described as an aggregation of elementary cells (whose effects are estimated as coming from point sources) in an elastic half space. The linear inverse problem (considering some simple regularization conditions) is solved by means of an exploratory approach. This software represents the open implementation of a previously published methodology (Camacho et al., 2011). It can be freely used with large data sets (e.g. InSAR data sets) or with data coming from small control networks (e.g. GPS monitoring data), mainly in volcanic areas, to estimate the expected pressure bodies representing magmatic intrusions. Here, the software is applied to some real test cases.

  19. A software tool to estimate the dynamic behaviour of the IP{sup 2}C samples as sensors for didactic purposes

    Energy Technology Data Exchange (ETDEWEB)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E., E-mail: nicola.pitrone@diees.unict.i [Dipartimento di Ingegneria Elettrica Elettronica e dei Sistemi -University of Catania V.le A. Doria 6, 95125, Catania (Italy)

    2010-07-01

    Ionic Polymer Polymer Composites (IP{sup 2}Cs) are emerging materials used to realize motion actuators and sensors. In the former case a voltage input causes the membrane to bend, while in the latter case bending an IP{sup 2}C membrane produces a voltage output. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP{sup 2}Cs working in air. In the proposed tool, geometrical quantities that rule the sensing properties of IP{sup 2}C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to give a useful tool that allows the user to understand the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP{sup 2}C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers towards this new class of transducers; moreover, it can support the educational activity of students involved in advanced academic courses.

  20. Estimating the Impacts of Direct Load Control Programs Using GridPIQ, a Web-Based Screening Tool

    Energy Technology Data Exchange (ETDEWEB)

    Pal, Seemita; Thayer, Brandon L.; Barrett, Emily L.; Studarus, Karen E.

    2017-11-13

    In direct load control (DLC) programs, utilities can curtail the demand of participating loads to contractually agreed-upon levels during periods of critical peak load, thereby reducing stress on the system, generation cost, and required transmission and generation capacity. Participating customers receive financial incentives. The impacts of implementing DLC programs extend well beyond peak shaving: load proportional to the interrupted load may shift to the times before or after a DLC event, and different load shifts have different consequences. Tools that can quantify the impacts of such programs on load curves, peak demand, emissions, and fossil fuel costs have been lacking. The Grid Project Impact Quantification (GridPIQ) screening tool includes a Direct Load Control module, which takes into account project-specific inputs as well as the larger system context in order to quantify the impacts of a given DLC program. This allows users (utilities, researchers, etc.) to test and compare different program specifications and their impacts.
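
    A toy sketch of the behaviour the module quantifies: participating load is curtailed to a contracted level during event hours, and a share of the curtailed energy reappears in adjacent hours. All figures, the shift fraction, and the uniform rebound window are assumptions for illustration:

```python
import numpy as np

load = np.array([80, 85, 95, 110, 130, 125, 105, 90], dtype=float)  # MW, hourly
event_hours = [3, 4, 5]          # DLC event window
contract_level = 100.0           # MW cap for participating load during the event
shift_fraction = 0.6             # share of curtailed energy that reappears later (assumed)

curtailed = np.zeros_like(load)
for h in event_hours:
    curtailed[h] = max(0.0, load[h] - contract_level)
    load[h] -= curtailed[h]

# Spread the shifted energy uniformly over the two hours after the event (assumed)
rebound_hours = [6, 7]
rebound = shift_fraction * curtailed.sum() / len(rebound_hours)
for h in rebound_hours:
    load[h] += rebound

print("new peak:", load.max(), "MW; energy shifted:", shift_fraction * curtailed.sum(), "MWh")
```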

  1. The construction of a decision tool to analyse local demand and local supply for GP care using a synthetic estimation model.

    Science.gov (United States)

    de Graaf-Ruizendaal, Willemijn A; de Bakker, Dinny H

    2013-10-27

    This study addresses the growing academic and policy interest in the appropriate provision of local healthcare services to meet the healthcare needs of local populations, to increase health status and decrease healthcare costs. However, for most local areas information on the demand for primary care and on supply is missing. The research goal is to examine the construction of a decision tool which enables healthcare planners to analyse local supply and demand in order to arrive at a better match. National sample-based medical record data of general practitioners (GPs) were used to predict the local demand for GP care based on local populations using a synthetic estimation technique. Next, the surplus or deficit in local GP supply was calculated using the national GP registry. Subsequently, a dynamic internet tool was built to present demand, supply and the confrontation between supply and demand regarding GP care for local areas and their surroundings in the Netherlands. Regression analysis showed a significant relationship between sociodemographic predictors of postcode areas and GP consultation time (F [14, 269,467] = 2,852.24; p < 0.001). Demand estimates were generated for all postcode areas with >1,000 inhabitants in the Netherlands, covering 97% of the total population. Confronting these estimated demand figures with the actual GP supply yielded the average GP workload and the surplus or shortfall in full-time equivalent (FTE) GPs needed to cover the demand for GP care in local areas. An estimated shortage of one FTE GP or more was prevalent in about 19% of the postcode areas with >1,000 inhabitants if the surrounding postcode areas were taken into consideration. Underserved areas were mainly found in rural regions. The constructed decision tool is freely accessible on the Internet and can be used as a starting point in the discussion on primary care service provision in local communities, and it can make a considerable contribution to a primary care system which provides care when and where people need it.
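
    The confrontation step reduces to converting estimated demand into FTE GPs and comparing it with registered supply. A sketch with hypothetical figures (the per-FTE capacity value is an assumption, not taken from the study):

```python
# Hypothetical postcode-area figures
estimated_demand_minutes_per_year = 2_400_000   # from a synthetic estimation model
fte_capacity_minutes_per_year = 160_000         # workable consultation minutes per FTE GP (assumed)
registered_fte = 13.0                           # from a national GP registry

fte_needed = estimated_demand_minutes_per_year / fte_capacity_minutes_per_year
surplus = registered_fte - fte_needed
print(f"FTE needed: {fte_needed:.1f}; surplus/deficit: {surplus:+.1f} FTE")
```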

  2. Culvert Analysis Program Graphical User Interface 1.0--A preprocessing and postprocessing tool for estimating flow through culvert

    Science.gov (United States)

    Bradley, D. Nathan

    2013-01-01

    The peak discharge of a flood can be estimated from the elevation of high-water marks near the inlet and outlet of a culvert after the flood has occurred. This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks on trees or buildings. When combined with the cross-sectional geometry of the channel upstream from the culvert and the culvert size, shape, roughness, and orientation, the high-water marks define a water-surface profile that can be used to estimate the peak discharge by using the methods described by Bodhaine (1968). This type of measurement is in contrast to a “direct” measurement of discharge made during the flood, where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a streamgage during high flows for logistical or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the streamgage, resulting in more accurate computation of high flows. The Culvert Analysis Program (CAP) (Fulford, 1998) is a command-line program written in Fortran for computing peak discharges and culvert rating surfaces or curves. CAP reads input data from a formatted text file and prints results to another formatted text file. Preparing and correctly formatting the input file may be time-consuming and prone to errors. This document describes the CAP graphical user interface (GUI)—a modern, cross-platform, menu-driven application that prepares the CAP input file, executes the program, and helps the user interpret the output.

  3. Lattice energy calculation - A quick tool for screening of cocrystals and estimation of relative solubility. Case of flavonoids

    Science.gov (United States)

    Kuleshova, L. N.; Hofmann, D. W. M.; Boese, R.

    2013-03-01

    Cocrystals (or multicomponent crystals) have physico-chemical properties that are different from crystals of pure components. This is significant in drug development, since the desired properties, e.g. solubility, stability and bioavailability, can be tailored by binding two substances into a single crystal without chemical modification of an active component. Here, the FLEXCRYST program suite, implemented with a data-mining force field, was used to estimate the relative stability and, consequently, the relative solubility of cocrystals of flavonoids versus their pure crystals, as stored in the Cambridge Structural Database. The considerable potential of this approach for in silico screening of cocrystals, as well as of their relative solubility, was demonstrated.

  4. FPGA-based fused smart-sensor for tool-wear area quantitative estimation in CNC machine inserts.

    Science.gov (United States)

    Trejo-Hernandez, Miguel; Osornio-Rios, Roque Alfredo; de Jesus Romero-Troncoso, Rene; Rodriguez-Donate, Carlos; Dominguez-Gonzalez, Aurelio; Herrera-Ruiz, Gilberto

    2010-01-01

    Manufacturing processes are of great relevance nowadays, when there is a constant demand for better productivity with high quality at low cost. The contribution of this work is the development of an FPGA-based fused smart-sensor to improve the online quantitative estimation of flank-wear area in CNC machine inserts from the information provided by two primary sensors: the monitoring current output of a servoamplifier and a 3-axis accelerometer. Results from experimentation show that the fusion of both parameters makes it possible to obtain three times better accuracy than is obtained from the current and vibration signals used individually.

  5. Estimating nutrient releases from agriculture in China: An extended substance flow analysis framework and a modeling tool

    International Nuclear Information System (INIS)

    Chen, M.; Chen, J.; Sun, F.

    2010-01-01

    Agriculture-related pollution has attracted the attention of policy makers as well as scientists in China as its contribution to water impairment has increased, and quantitative information at the national and regional levels is being sought to support decision making. However, traditional approaches are either time-consuming and expensive (e.g. national surveys) or oversimplified and crude (e.g. coefficient methods). Therefore, this study proposed an extended substance flow analysis (SFA) framework to estimate nutrient releases from agricultural and rural activities in China by depicting the nutrient flows in Chinese agro-ecosystems. The six-step process proposed herein includes: (a) system definition; (b) model development; (c) database development; (d) model validation; (e) results interpretation; and (f) uncertainty analysis. The developed Eubolism (Elementary Unit based nutrient Balance mOdeLIng in agro-ecoSysteM) model combined a nutrient balance module with an emission inventory module to quantify the nutrient flows in the agro-ecosystem. The model was validated and then applied to estimate the total agricultural nutrient loads, identify the contribution of different agricultural and rural activities and different land use types to the total loads, and analyze the spatial pattern of agricultural nutrient emissions in China. These results could provide a comprehensive picture of agricultural pollution at the national level and be used to support policy making. Furthermore, uncertainties associated with the structure of the elementary units, spatial resolution, and inputs/parameters were also analyzed to evaluate the robustness of the model results.

  6. The detector response simulation for the CBM silicon tracking system as a tool for hit error estimation

    Energy Technology Data Exchange (ETDEWEB)

    Malygina, Hanna [Goethe Universitaet Frankfurt (Germany); KINR, Kyiv (Ukraine); GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany); Friese, Volker; Zyzak, Maksym [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany); Collaboration: CBM-Collaboration

    2016-07-01

    The Compressed Baryonic Matter experiment (CBM) at FAIR is designed to explore the QCD phase diagram in the region of high net-baryon densities. As the central detector component, the Silicon Tracking System (STS) is based on double-sided micro-strip sensors. To achieve realistic modelling, the response of the silicon strip sensors should be precisely included in the digitizer, which simulates the complete chain of physical processes caused by charged particles traversing the detector, from charge creation in silicon to a digital output signal. The current implementation of the STS digitizer comprises non-uniform energy loss distributions (according to the Urban theory), thermal diffusion and charge redistribution over the read-out channels due to interstrip capacitances. Using the digitizer, one can test the influence of each physical process on the hit error separately. We have developed a new cluster position finding algorithm and a hit error estimation method for it. The estimated errors were verified by the width of the pull distribution (expected to be about unity) and by its shape.
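
    The paper's cluster position algorithm is not spelled out here, so the sketch below uses a plain two-strip centre-of-gravity position as a stand-in, with the hit error obtained by propagating the strip noise; the self-test checks exactly the criterion mentioned above, a pull width near unity. Pitch and noise values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
pitch = 58e-4   # strip pitch in cm (assumed)
sigma_q = 20.0  # electronic noise per strip, in charge units (assumed)

def cog_hit(q1, q2, first_strip):
    """Two-strip centre-of-gravity position (a stand-in for the paper's
    algorithm) with an error from propagating the strip noise sigma_q."""
    s = q1 + q2
    eta = q2 / s
    pos = pitch * (first_strip + eta)
    err = pitch * sigma_q * np.hypot(q1, q2) / s**2
    return pos, err

# Toy self-test: simulate linear charge sharing, reconstruct, and check pulls
pulls = []
for _ in range(20000):
    x = 99.0 + rng.uniform(0.0, 1.0)          # true position in strip units
    q1 = (100.0 - x) * 1000 + rng.normal(0, sigma_q)
    q2 = (x - 99.0) * 1000 + rng.normal(0, sigma_q)
    pos, err = cog_hit(q1, q2, 99)
    pulls.append((pos - pitch * x) / err)

print("pull width:", np.std(pulls))  # close to 1 if the error estimate is right
```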

  7. Evidence-based optimal number of radiotherapy fractions for cancer: A useful tool to estimate radiotherapy demand.

    Science.gov (United States)

    Wong, Karen; Delaney, Geoff P; Barton, Michael B

    2016-04-01

    The recently updated optimal radiotherapy utilisation model estimated that 48.3% of all cancer patients should receive external beam radiotherapy at least once during their disease course. Adapting this model, we constructed an evidence-based model to estimate the optimal number of fractions for notifiable cancers in Australia to determine equipment and workload implications. The optimal number of fractions was calculated based on the frequency of specific clinical conditions where radiotherapy is indicated and the evidence-based recommended number of fractions for each condition. Sensitivity analysis was performed to assess the impact of variables on the model. Of the 27 cancer sites, the optimal number of fractions for the first course of radiotherapy ranged from 0 to 23.3 per cancer patient, and 1.5 to 29.1 per treatment course. Brain, prostate and head and neck cancers had the highest average number of fractions per course. Overall, the optimal number of fractions was 9.4 per cancer patient (range 8.7-10.0) and 19.4 per course (range 18.0-20.7). These results provide valuable data for radiotherapy services planning and comparison with actual practice. The model can be easily adapted by inserting population-specific epidemiological data thus making it applicable to other jurisdictions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
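
    The model's core arithmetic is a weighted sum: the proportion of patients with each clinical indication times the evidence-based number of fractions for that indication. A sketch with hypothetical indications (the actual indication tree in the model is far larger):

```python
# Hypothetical indications: (proportion of all cancer patients, recommended fractions)
indications = [
    (0.10, 20),   # e.g. curative indication A
    (0.15, 15),   # e.g. curative indication B
    (0.12,  5),   # e.g. palliative indication C
    (0.08,  1),   # e.g. single-fraction palliation
]

# Optimal fractions per cancer patient: sum over indications of p_i * f_i
per_patient = sum(p * f for p, f in indications)

# Optimal fractions per treatment course: average over treated patients only
treated = sum(p for p, _ in indications)
per_course = per_patient / treated

print(f"{per_patient:.1f} fractions per cancer patient, {per_course:.1f} per course")
```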

  8. Review of Bothnian Sea shore-level displacement data and use of a GIS tool to estimate isostatic uplift

    International Nuclear Information System (INIS)

    Vuorela, A.; Penttinen, T.; Lahdenperae, A.-M.

    2009-02-01

    The aim and approach of the study were to produce source data estimates necessary for modelling the future biosphere. The study updated the list of 14C datings of sea-level index points, which show when lakes and mires were isolated from the Baltic Sea due to isostatic uplift. The study concentrated on the Bothnian Sea, especially the Olkiluoto area. The older Finnish datings (a list of 260 sea-level index points determined in 1995) were checked and revised as needed. New data were available for 56 Finnish and 41 Swedish index points. State-of-the-art 14C calibration methods were applied. Various available data were used to estimate the parameters of the slow component of the glacio-isostatic uplift model. The component describes the uplift in relation to time using the parameters B_s, which is related to the uplift's total duration, and A_s, which is half of the total uplift possible in the period lasting from the Last Glacial Maximum to the distant future. The B_s values were estimated by means of 1) crustal thickness and 2) shoreline displacement curves. In applying method 1, this study revised the function describing the relationship between crustal thickness and B_s, and created a new derivative-based method that also estimates the parameter A_s without radiocarbon datings, using only crustal thickness and current uplift maps. In method 2, sea-level index point subsets along the Finnish and Swedish coasts of the Bothnian Sea were selected from the revised database, and their datings and elevations were used to determine the corresponding land uplift parameters. The parameter value distributions were used to produce maps. The values of the inertia factor B_s are on average 6% higher than those calculated in 2001, but they are 10% lower in the Olkiluoto region. According to the interpolations of the new and old data, the estimated uplift at Olkiluoto for AD 12000 is 2.8 m (7%) less than calculated previously. The derivative-based method predicts an uplift for AD

  9. SVPWM Technique with Varying DC-Link Voltage for Common Mode Voltage Reduction in a Matrix Converter and Analytical Estimation of its Output Voltage Distortion

    Science.gov (United States)

    Padhee, Varsha

    Common Mode Voltage (CMV) in any power converter has been a major contributor to premature motor failures, bearing deterioration, shaft voltage build-up and electromagnetic interference. Intelligent control methods like Space Vector Pulse Width Modulation (SVPWM) techniques provide immense potential and flexibility to reduce CMV, thereby targeting all the aforementioned problems. Other solutions like passive filters, shielded cables and EMI filters add to the volume and cost of the entire system; smart SVPWM techniques therefore come with the important advantage of being an economical solution. This thesis discusses a modified space vector technique applied to an Indirect Matrix Converter (IMC) which results in the reduction of common mode voltages and other advanced features. The conventional indirect space vector pulse-width modulation (SVPWM) method of controlling matrix converters involves the usage of two adjacent active vectors and one zero vector for both the rectifying and inverting stages of the converter. By suitable selection of space vectors, the rectifying stage of the matrix converter can generate different levels of virtual DC-link voltage. This capability can be exploited to operate the converter in different ranges of modulation indices for varying machine speeds, which results in lower common mode voltage and improves the harmonic spectrum of the output voltage without increasing the number of switching transitions compared to conventional modulation. In summary, the responsibility for synthesizing output voltages of a particular magnitude and frequency is transferred solely to the rectifying stage of the IMC. Estimation of the degree of distortion in the three-phase output voltage is another facet discussed in this thesis. An understanding of the SVPWM technique and the switching sequence of the space vectors in detail gives the potential to estimate the RMS value of the switched output voltage of any
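
    The quantity being reduced is the common mode voltage, the average of the three output pole voltages, so its attainable levels scale directly with the (virtual) DC-link magnitude. A small sketch for a generic two-level output stage (not the thesis's converter model) makes the scaling visible:

```python
from itertools import product

def cmv_levels(vdc: float):
    """Common mode voltage (va + vb + vc)/3 over every two-level switching
    state, with pole voltages at +/- vdc/2."""
    levels = {round(sum(s) * vdc / 6.0, 3) for s in product((-1, 1), repeat=3)}
    return sorted(levels)

# A lower virtual DC-link compresses all CMV levels toward zero
print(cmv_levels(600.0))  # [-300.0, -100.0, 100.0, 300.0]
print(cmv_levels(400.0))  # [-200.0, -66.667, 66.667, 200.0]
```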

  10. Stochastic differential equations as a tool to regularize the parameter estimation problem for continuous time dynamical systems given discrete time measurements.

    Science.gov (United States)

    Leander, Jacob; Lundh, Torbjörn; Jirstrand, Mats

    2014-05-01

    In this paper we consider the problem of estimating parameters in ordinary differential equations given discrete time experimental data. The impact of going from an ordinary to a stochastic differential equation setting is investigated as a tool to overcome the problem of local minima in the objective function. Using two different models, it is demonstrated that by allowing noise in the underlying model itself, the objective functions to be minimized in the parameter estimation procedures are regularized in the sense that the number of local minima is reduced and better convergence is achieved. The advantage of using stochastic differential equations is that the actual states in the model are predicted from data, which allows the predictions to stay close to the data even when the model parameters are incorrect. The extended Kalman filter is used as a state estimator, and sensitivity equations are provided to give an accurate calculation of the gradient of the objective function. The method is illustrated using in silico data from the FitzHugh-Nagumo model for excitable media and the Lotka-Volterra predator-prey system. The proposed method performs well on the models considered and is able to regularize the objective function in both models. This leads to parameter estimation problems with fewer local minima which can be solved by efficient gradient-based methods. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
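
    The paper applies the extended Kalman filter to nonlinear models; as a compact, hedged analogue, the sketch below fits the drift parameter of a linear Ornstein-Uhlenbeck process by minimizing the exact Kalman-filter negative log-likelihood, where the process-noise term plays the regularizing role described above (all model values are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulate discrete observations of an Ornstein-Uhlenbeck process
# dx = -theta * x dt + sigma dW, observed with Gaussian noise
theta_true, sigma, dt, obs_sd, n = 1.5, 0.4, 0.1, 0.05, 400
x = np.zeros(n)
for k in range(1, n):
    x[k] = x[k-1] * np.exp(-theta_true * dt) + rng.normal(
        0, np.sqrt(sigma**2 / (2 * theta_true) * (1 - np.exp(-2 * theta_true * dt))))
y = x + rng.normal(0, obs_sd, n)

def neg_log_lik(params):
    """Kalman-filter likelihood for the OU model (sigma treated as known);
    the process-noise variance q smooths the objective surface."""
    theta = params[0]
    a = np.exp(-theta * dt)                              # state transition
    q = sigma**2 / (2 * theta) * (1 - a**2)              # process noise variance
    m, p, nll = 0.0, 1.0, 0.0
    for yk in y:
        m, p = a * m, a * a * p + q                      # predict
        s = p + obs_sd**2                                # innovation variance
        nll += 0.5 * (np.log(2 * np.pi * s) + (yk - m)**2 / s)
        gain = p / s                                     # update
        m, p = m + gain * (yk - m), (1 - gain) * p
    return nll

fit = minimize(neg_log_lik, x0=[0.5], bounds=[(1e-3, 10.0)])
print("estimated theta:", fit.x[0])
```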

  11. How accurate are adolescents in portion-size estimation using the computer tool Young Adolescents' Nutrition Assessment on Computer (YANA-C)?

    Science.gov (United States)

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-06-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amounts of ten commonly consumed foods (breakfast cereals, French fries, pasta, rice, apple sauce, carrots and peas, crisps, creamy velouté, red cabbage, and peas). Two procedures were followed: (1) short-term recall: adolescents (n = 73) self-served their usual portions of the ten foods and estimated the amounts later the same day; (2) real-time perception: adolescents (n = 128) estimated two sets (different portions) of pre-weighed portions displayed near the computer. Self-served portions were, on average, 8 % underestimated; significant underestimates were found for breakfast cereals, French fries, peas, and carrots and peas. Spearman's correlations between the self-served and estimated weights varied between 0.51 and 0.84, with an average of 0.72. The kappa statistics were moderate (>0.4) for all but one item. Pre-weighed portions were, on average, 15 % underestimated, with significant underestimates for fourteen of the twenty portions. Photographs of food items can serve as a good aid in ranking subjects; however, to assess the actual intake at a group level, underestimation must be considered.

  12. Open-source LCA tool for estimating greenhouse gas emissions from crude oil production using field characteristics.

    Science.gov (United States)

    El-Houjeiri, Hassan M; Brandt, Adam R; Duffy, James E

    2013-06-04

    Existing transportation fuel cycle emissions models are either general and calculate nonspecific values of greenhouse gas (GHG) emissions from crude oil production, or are not available for public review and auditing. We have developed the Oil Production Greenhouse Gas Emissions Estimator (OPGEE) to provide open-source, transparent, rigorous GHG assessments for use in scientific assessment, regulatory processes, and analysis of GHG mitigation options by producers. OPGEE uses petroleum engineering fundamentals to model emissions from oil and gas production operations. We introduce OPGEE and explain the methods and assumptions used in its construction. We run OPGEE on a small set of fictional oil fields and explore model sensitivity to selected input parameters. Results show that upstream emissions from petroleum production operations can vary from 3 gCO2/MJ to over 30 gCO2/MJ using realistic ranges of input parameters. Significant drivers of emissions variation are steam injection rates, water handling requirements, and rates of flaring of associated gas.

  13. Estimations of bone maturation and calculations of prediction of adult height as tools for the evaluation of growth disorders

    Energy Technology Data Exchange (ETDEWEB)

    Zachmann, M

    1982-03-01

    The methods of estimation of bone maturation (Greulich and Pyle, Tanner et al.) and the possibilities for the calculation of future adult height (Bayley and Pinneau, Roche et al., Tanner et al.) are briefly described, and their advantages and disadvantages in normal children and in children with growth disorders are discussed. In normal children, all methods provide valuable results, but there are small differences of precision depending on whether the pubertal development is early, average, or late. In pathological conditions, however, as e.g. in precocious puberty or in girls with Turner syndrome, the methods of Roche et al. and of Tanner et al. may overestimate adult height considerably, while that of Bayley and Pinneau remains reasonably accurate. A computerized system, which facilitates the complicated and time-consuming calculations, is briefly presented.

  14. SOAP 2.0: a tool to estimate the photometric and radial velocity variations induced by stellar spots and plages

    International Nuclear Information System (INIS)

    Dumusque, X.; Boisse, I.; Santos, N. C.

    2014-01-01

    This paper presents SOAP 2.0, a new version of the Spot Oscillation And Planet (SOAP) code that estimates in a simple way the photometric and radial velocity (RV) variations induced by active regions. The inhibition of the convective blueshift (CB) inside active regions is considered, as well as the limb brightening effect of plages, a quadratic limb darkening law, and a realistic spot and plage contrast ratio. SOAP 2.0 shows that the activity-induced variation of plages is dominated by the inhibition of the CB effect. For spots, this effect becomes significant only for slow rotators. In addition, in the case of a major active region dominating the activity-induced signal, the ratio between the FWHM and the RV peak-to-peak amplitudes of the cross correlation function can be used to infer the type of active region responsible for the signal for stars with v sin i ≤ 8 km s⁻¹. A ratio smaller than three implies a spot, while a larger ratio implies a plage. Using the observation of HD 189733, we show that SOAP 2.0 manages to reproduce the activity variation as well as previous simulations when a spot is dominating the activity-induced variation. In addition, SOAP 2.0 also reproduces the activity variation induced by a plage on the slowly rotating star α Cen B, which is not possible using previous simulations. Following these results, SOAP 2.0 can be used to estimate the signal induced by spots and plages, but also to correct for it when a major active region is dominating the RV variation.
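
    The FWHM-to-RV amplitude ratio diagnostic translates into a few lines of code. A sketch, with the threshold and validity range taken from the abstract:

```python
def dominant_active_region(fwhm_amplitude: float, rv_amplitude: float,
                           vsini_kms: float) -> str:
    """Apply the SOAP 2.0 diagnostic: for v sin i <= 8 km/s, a
    FWHM-to-RV peak-to-peak amplitude ratio below three points to a spot,
    a larger ratio to a plage."""
    if vsini_kms > 8.0:
        raise ValueError("diagnostic only calibrated for v sin i <= 8 km/s")
    ratio = fwhm_amplitude / rv_amplitude
    return "spot" if ratio < 3.0 else "plage"

print(dominant_active_region(fwhm_amplitude=12.0, rv_amplitude=5.0, vsini_kms=2.0))  # spot
```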

  15. SOAP 2.0: A Tool to Estimate the Photometric and Radial Velocity Variations Induced by Stellar Spots and Plages

    Science.gov (United States)

    Dumusque, X.; Boisse, I.; Santos, N. C.

    2014-12-01

    This paper presents SOAP 2.0, a new version of the Spot Oscillation And Planet (SOAP) code that estimates in a simple way the photometric and radial velocity (RV) variations induced by active regions. The inhibition of the convective blueshift (CB) inside active regions is considered, as well as the limb brightening effect of plages, a quadratic limb darkening law, and a realistic spot and plage contrast ratio. SOAP 2.0 shows that the activity-induced variation of plages is dominated by the inhibition of the CB effect. For spots, this effect becomes significant only for slow rotators. In addition, in the case of a major active region dominating the activity-induced signal, the ratio between the FWHM and the RV peak-to-peak amplitudes of the cross correlation function can be used to infer the type of active region responsible for the signal for stars with v sin i ≤ 8 km s⁻¹. A ratio smaller than three implies a spot, while a larger ratio implies a plage. Using the observation of HD 189733, we show that SOAP 2.0 manages to reproduce the activity variation as well as previous simulations when a spot is dominating the activity-induced variation. In addition, SOAP 2.0 also reproduces the activity variation induced by a plage on the slowly rotating star α Cen B, which is not possible using previous simulations. Following these results, SOAP 2.0 can be used to estimate the signal induced by spots and plages, but also to correct for it when a major active region is dominating the RV variation. The work in this paper is based on observations made with the MOST satellite, the HARPS instrument on the ESO 3.6 m telescope at La Silla Observatory (Chile), and the SOPHIE instrument at the Observatoire de Haute Provence (France).

  16. The GAAS metagenomic tool and its estimations of viral and microbial average genome size in four major biomes.

    Science.gov (United States)

    Angly, Florent E; Willner, Dana; Prieto-Davó, Alejandra; Edwards, Robert A; Schmieder, Robert; Vega-Thurber, Rebecca; Antonopoulos, Dionysios A; Barott, Katie; Cottrell, Matthew T; Desnues, Christelle; Dinsdale, Elizabeth A; Furlan, Mike; Haynes, Matthew; Henn, Matthew R; Hu, Yongfei; Kirchman, David L; McDole, Tracey; McPherson, John D; Meyer, Folker; Miller, R Michael; Mundt, Egbert; Naviaux, Robert K; Rodriguez-Mueller, Beltran; Stevens, Rick; Wegley, Linda; Zhang, Lixin; Zhu, Baoli; Rohwer, Forest

    2009-12-01

    Metagenomic studies characterize both the composition and diversity of uncultured viral and microbial communities. BLAST-based comparisons have typically been used for such analyses; however, sampling biases, high percentages of unknown sequences, and the use of arbitrary thresholds to find significant similarities can decrease the accuracy and validity of estimates. Here, we present Genome relative Abundance and Average Size (GAAS), a complete software package that provides improved estimates of community composition and average genome length for metagenomes in both textual and graphical formats. GAAS implements a novel methodology to control for sampling bias via length normalization, to adjust for multiple BLAST similarities by similarity weighting, and to select significant similarities using relative alignment lengths. In benchmark tests, the GAAS method was robust to both high percentages of unknown sequences and to variations in metagenomic sequence read lengths. Re-analysis of the Sargasso Sea virome using GAAS indicated that standard methodologies for metagenomic analysis may dramatically underestimate the abundance and importance of organisms with small genomes in environmental systems. Using GAAS, we conducted a meta-analysis of microbial and viral average genome lengths in over 150 metagenomes from four biomes to determine whether genome lengths vary consistently between and within biomes, and between microbial and viral communities from the same environment. Significant differences between biomes and within aquatic sub-biomes (oceans, hypersaline systems, freshwater, and microbialites) suggested that average genome length is a fundamental property of environments driven by factors at the sub-biome level. The behavior of paired viral and microbial metagenomes from the same environment indicated that microbial and viral average genome sizes are independent of each other, but indicative of community responses to stressors and environmental conditions.
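
    The length-normalization idea is simple to state in code: hit counts are divided by genome length before computing relative abundances, from which an average genome size follows. A sketch with hypothetical counts (GAAS itself adds similarity weighting and alignment-length filtering):

```python
# Hypothetical BLAST hit counts against three reference genomes
hits = {"genome_A": 900, "genome_B": 300, "genome_C": 300}
genome_length = {"genome_A": 3.0e6, "genome_B": 1.0e6, "genome_C": 0.5e6}

# Length normalization: longer genomes attract more reads per organism,
# so relative abundance is proportional to hits / length
raw = {g: hits[g] / genome_length[g] for g in hits}
total = sum(raw.values())
abundance = {g: raw[g] / total for g in raw}

# Community average genome size: abundance-weighted mean of genome lengths
avg_genome_size = sum(abundance[g] * genome_length[g] for g in abundance)
print(abundance, f"average genome size: {avg_genome_size / 1e6:.2f} Mb")
```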

  17. The GAAS metagenomic tool and its estimations of viral and microbial average genome size in four major biomes.

    Directory of Open Access Journals (Sweden)

    Florent E Angly

    2009-12-01

    Full Text Available Metagenomic studies characterize both the composition and diversity of uncultured viral and microbial communities. BLAST-based comparisons have typically been used for such analyses; however, sampling biases, high percentages of unknown sequences, and the use of arbitrary thresholds to find significant similarities can decrease the accuracy and validity of estimates. Here, we present Genome relative Abundance and Average Size (GAAS), a complete software package that provides improved estimates of community composition and average genome length for metagenomes in both textual and graphical formats. GAAS implements a novel methodology to control for sampling bias via length normalization, to adjust for multiple BLAST similarities by similarity weighting, and to select significant similarities using relative alignment lengths. In benchmark tests, the GAAS method was robust to both high percentages of unknown sequences and to variations in metagenomic sequence read lengths. Re-analysis of the Sargasso Sea virome using GAAS indicated that standard methodologies for metagenomic analysis may dramatically underestimate the abundance and importance of organisms with small genomes in environmental systems. Using GAAS, we conducted a meta-analysis of microbial and viral average genome lengths in over 150 metagenomes from four biomes to determine whether genome lengths vary consistently between and within biomes, and between microbial and viral communities from the same environment. Significant differences between biomes and within aquatic sub-biomes (oceans, hypersaline systems, freshwater, and microbialites) suggested that average genome length is a fundamental property of environments driven by factors at the sub-biome level. The behavior of paired viral and microbial metagenomes from the same environment indicated that microbial and viral average genome sizes are independent of each other, but indicative of community responses to stressors and environmental conditions.

  18. SOAP 2.0: a tool to estimate the photometric and radial velocity variations induced by stellar spots and plages

    Energy Technology Data Exchange (ETDEWEB)

    Dumusque, X. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Boisse, I. [Laboratoire d' Astrophysique de Marseille (UMR 6110), Technopole de Château-Gombert, 38 rue Frédéric Joliot-Curie, F-13388 Marseille Cedex 13 (France); Santos, N. C., E-mail: xdumusque@cfa.harvard.edu [Centro de Astrofìsica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal)

    2014-12-01

    This paper presents SOAP 2.0, a new version of the Spot Oscillation And Planet (SOAP) code that estimates in a simple way the photometric and radial velocity (RV) variations induced by active regions. The inhibition of the convective blueshift (CB) inside active regions is considered, as well as the limb brightening effect of plages, a quadratic limb darkening law, and a realistic spot and plage contrast ratio. SOAP 2.0 shows that the activity-induced variation of plages is dominated by the inhibition of the CB effect. For spots, this effect becomes significant only for slow rotators. In addition, in the case of a major active region dominating the activity-induced signal, the ratio between the FWHM and the RV peak-to-peak amplitudes of the cross correlation function can be used to infer the type of active region responsible for the signal for stars with v sin i ≤8 km s{sup –1}. A ratio smaller than three implies a spot, while a larger ratio implies a plage. Using the observation of HD 189733, we show that SOAP 2.0 manages to reproduce the activity variation as well as previous simulations when a spot is dominating the activity-induced variation. In addition, SOAP 2.0 also reproduces the activity variation induced by a plage on the slowly rotating star α Cen B, which is not possible using previous simulations. Following these results, SOAP 2.0 can be used to estimate the signal induced by spots and plages, but also to correct for it when a major active region is dominating the RV variation.

  19. Use of geoprocessing tools in uranium mining: volume estimation of sterile piles from the Osamu Utsumi Mine of INB / Caldas

    International Nuclear Information System (INIS)

    Ferreira, A.M.; Menezes, P.H.B.J.; Alberti, H.L.C.; Silva, N.C. da; Goda, R.T.

    2017-01-01

    The determination of the volumes of the sterile (waste rock) piles generated in uranium mining, and their characterization, is of utmost importance for the management of mining wastes and for future decommissioning actions at a nuclear facility. With the development of information technology, it has become possible to simulate different scenarios in a computational environment and to store, represent and process data from existing information. In the industrial mining context, the sterile consists of rocky materials of different granulometries with ore content below the cut-off grade determined by the industrial process. In this sense, the present work aims to calculate the volume of the sterile piles of the Osamu Utsumi uranium mine of INB - Nuclear Industries of Brazil / Caldas. The Osamu Utsumi mine (MOU) was officially inaugurated in 1977 and operated until 1995; 1,200 tons of U2O3 were produced, generating about 94.5 × 10⁶ tons of sterile material containing low levels of radioactive material and pyrite. The methodology initially involves integration between the Geographic Information System (GIS) and terrain modelling for the sterile piles called BF4 and BF8. The results obtained were compared with the existing literature, underscoring the importance of GIS as a tool in the management of mining wastes.
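
    The core GIS operation behind such volume estimates is subtracting a base-terrain surface from the pile-top surface and summing over the cell area. A sketch with a hypothetical 3 × 3 grid (the real piles were modelled from much denser DEMs):

```python
import numpy as np

cell_area_m2 = 25.0  # e.g. a 5 m x 5 m DEM grid (assumed)

# Hypothetical elevation grids (metres): pile surface and pre-mining base terrain
surface = np.array([[812.0, 815.5, 813.2],
                    [814.1, 818.9, 816.0],
                    [811.7, 813.8, 812.4]])
base    = np.array([[808.0, 808.5, 808.2],
                    [808.1, 808.9, 808.6],
                    [807.9, 808.3, 808.1]])

thickness = np.clip(surface - base, 0.0, None)  # ignore cells where surface dips below base
volume_m3 = thickness.sum() * cell_area_m2
print(f"pile volume: {volume_m3:.0f} m3")
```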

  20. Submerged macrophyte communities in the Forsmark area. Building of a GIS application as a tool for biomass estimations

    International Nuclear Information System (INIS)

    Fredriksson, Ronny

    2005-12-01

    The aim of this study was to compile the information from previous studies to produce a GIS application that both illustrates the distribution of different vegetation communities and makes it possible to estimate the total biomass of the different vegetation communities and their associated fauna. The GIS application was created by means of the software ArcView 3.3 by Environmental Systems Research Institute, Inc. Distribution readings and quantitative data on submerged macrophyte communities and their associated fauna were obtained from studies by Kautsky et al. and by Borgiel. Information about the macrophyte distribution in Laangoersviken, located in the northern parts of Kallrigafjaerden, was obtained from a report by Upplandsstiftelsen. Information about water depth and bottom substrate was available as a USGS DEM file, produced by the Geological Survey of Sweden. Complementary data on the covering degree of submerged vegetation were obtained from a study using an underwater video camera by Tobiasson. Quantitative data on macrophyte and faunal biomass were obtained either from the primary SKB database SICADA or directly from reports. Samples were compiled and analysed according to the dominating vegetation. The work was carried out as follows: where information about the bottom substrate was available, polygons were created by means of the substrate shape file and the depth grid from the Geological Survey of Sweden. The vegetation community and the covering degree for a certain depth and substrate combination were determined from compiled information from studies by Kautsky and by Borgiel. All observations from a certain bottom substrate were analysed to find the dominating vegetation within different depth ranges. After determining the dominating vegetation, the covering degrees of different macrophyte classes within each depth range were calculated as a mean of all readings. Areas without information about the bottom substrate, but still adjacent to areas included in the

  1. Submerged macrophyte communities in the Forsmark area. Building of a GIS application as a tool for biomass estimations

    Energy Technology Data Exchange (ETDEWEB)

    Fredriksson, Ronny [Univ. of Kalmar (Sweden)

    2005-12-15

    The aim of this study was to compile the information from previous studies to produce a GIS application that both illustrates the distribution of different vegetation communities and makes it possible to estimate the total biomass of the different vegetation communities and their associated fauna. The GIS application was created by means of the software ArcView 3.3 by Environmental Systems Research Institute, Inc. Distribution readings and quantitative data on submerged macrophyte communities and their associated fauna were obtained from studies by Kautsky et al. and by Borgiel. Information about the macrophyte distribution in Laangoersviken, located in the northern parts of Kallrigafjaerden, was obtained from a report by Upplandsstiftelsen. Information about water depth and bottom substrate was available as a USGS DEM file, produced by the Geological Survey of Sweden. Complementary data on the covering degree of submerged vegetation were obtained from a study using an underwater video camera by Tobiasson. Quantitative data on macrophyte and faunal biomass were obtained either from the primary SKB database SICADA or directly from reports. Samples were compiled and analysed according to the dominating vegetation. The work was carried out as follows: where information about the bottom substrate was available, polygons were created by means of the substrate shape file and the depth grid from the Geological Survey of Sweden. The vegetation community and the covering degree for a certain depth and substrate combination were determined from compiled information from studies by Kautsky and by Borgiel. All observations from a certain bottom substrate were analysed to find the dominating vegetation within different depth ranges. After determining the dominating vegetation, the covering degrees of different macrophyte classes within each depth range were calculated as a mean of all readings. Areas without information about the bottom substrate, but still adjacent to areas included in the

  2. Exonic Splicing Mutations Are More Prevalent than Currently Estimated and Can Be Predicted by Using In Silico Tools

    Science.gov (United States)

    Soukarieh, Omar; Gaildrat, Pascaline; Hamieh, Mohamad; Drouet, Aurélie; Baert-Desurmont, Stéphanie; Frébourg, Thierry; Tosi, Mario; Martins, Alexandra

    2016-01-01

    The identification of a causal mutation is essential for molecular diagnosis and clinical management of many genetic disorders. However, even if next-generation exome sequencing has greatly improved the detection of nucleotide changes, the biological interpretation of most exonic variants remains challenging. Moreover, particular attention is typically given to protein-coding changes often neglecting the potential impact of exonic variants on RNA splicing. Here, we used the exon 10 of MLH1, a gene implicated in hereditary cancer, as a model system to assess the prevalence of RNA splicing mutations among all single-nucleotide variants identified in a given exon. We performed comprehensive minigene assays and analyzed patient’s RNA when available. Our study revealed a staggering number of splicing mutations in MLH1 exon 10 (77% of the 22 analyzed variants), including mutations directly affecting splice sites and, particularly, mutations altering potential splicing regulatory elements (ESRs). We then used this thoroughly characterized dataset, together with experimental data derived from previous studies on BRCA1, BRCA2, CFTR and NF1, to evaluate the predictive power of 3 in silico approaches recently described as promising tools for pinpointing ESR-mutations. Our results indicate that ΔtESRseq and ΔHZEI-based approaches not only discriminate which variants affect splicing, but also predict the direction and severity of the induced splicing defects. In contrast, the ΔΨ-based approach did not show a compelling predictive power. Our data indicates that exonic splicing mutations are more prevalent than currently appreciated and that they can now be predicted by using bioinformatics methods. These findings have implications for all genetically-caused diseases. PMID:26761715

  3. Exonic Splicing Mutations Are More Prevalent than Currently Estimated and Can Be Predicted by Using In Silico Tools.

    Directory of Open Access Journals (Sweden)

    Omar Soukarieh

    2016-01-01

    Full Text Available The identification of a causal mutation is essential for molecular diagnosis and clinical management of many genetic disorders. However, even if next-generation exome sequencing has greatly improved the detection of nucleotide changes, the biological interpretation of most exonic variants remains challenging. Moreover, particular attention is typically given to protein-coding changes, often neglecting the potential impact of exonic variants on RNA splicing. Here, we used exon 10 of MLH1, a gene implicated in hereditary cancer, as a model system to assess the prevalence of RNA splicing mutations among all single-nucleotide variants identified in a given exon. We performed comprehensive minigene assays and analyzed patients' RNA when available. Our study revealed a staggering number of splicing mutations in MLH1 exon 10 (77% of the 22 analyzed variants), including mutations directly affecting splice sites and, particularly, mutations altering potential splicing regulatory elements (ESRs). We then used this thoroughly characterized dataset, together with experimental data derived from previous studies on BRCA1, BRCA2, CFTR and NF1, to evaluate the predictive power of 3 in silico approaches recently described as promising tools for pinpointing ESR-mutations. Our results indicate that ΔtESRseq and ΔHZEI-based approaches not only discriminate which variants affect splicing, but also predict the direction and severity of the induced splicing defects. In contrast, the ΔΨ-based approach did not show a compelling predictive power. Our data indicate that exonic splicing mutations are more prevalent than currently appreciated and that they can now be predicted by using bioinformatics methods. These findings have implications for all genetically-caused diseases.

  4. A Software Tool for Estimation of Burden of Infectious Diseases in Europe Using Incidence-Based Disability Adjusted Life Years.

    Science.gov (United States)

    Colzani, Edoardo; Cassini, Alessandro; Lewandowski, Daniel; Mangen, Marie-Josee J; Plass, Dietrich; McDonald, Scott A; van Lier, Alies; Haagsma, Juanita A; Maringhini, Guido; Pini, Alessandro; Kramarz, Piotr; Kretzschmar, Mirjam E

    2017-01-01

    The burden of disease framework facilitates the assessment of the health impact of diseases through the use of summary measures of population health such as Disability-Adjusted Life Years (DALYs). However, calculating, interpreting and communicating the results of studies using this methodology poses a challenge. The aim of the Burden of Communicable Disease in Europe (BCoDE) project is to summarize the impact of communicable disease in the European Union and European Economic Area Member States (EU/EEA MS). To meet this goal, a user-friendly software tool (BCoDE toolkit), was developed. This stand-alone application, written in C++, is open-access and freely available for download from the website of the European Centre for Disease Prevention and Control (ECDC). With the BCoDE toolkit, one can calculate DALYs by simply entering the age group- and sex-specific number of cases for one or more of selected sets of 32 communicable diseases (CDs) and 6 healthcare associated infections (HAIs). Disease progression models (i.e., outcome trees) for these communicable diseases were created following a thorough literature review of their disease progression pathway. The BCoDE toolkit runs Monte Carlo simulations of the input parameters and provides disease-specific results, including 95% uncertainty intervals, and permits comparisons between the different disease models entered. Results can be displayed as mean and median overall DALYs, DALYs per 100,000 population, and DALYs related to mortality vs. disability. Visualization options summarize complex epidemiological data, with the goal of improving communication and knowledge transfer for decision-making.
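
    The incidence-based DALY computation with Monte Carlo uncertainty that the toolkit performs can be sketched as follows; here the outcome tree is collapsed to a single acute outcome, and all parameter ranges are invented placeholders, not BCoDE inputs.

```python
import random

# Minimal sketch of an incidence-based DALY calculation with Monte Carlo
# uncertainty, in the spirit of the BCoDE toolkit (which is written in C++
# and uses full outcome trees). All parameters are invented placeholders.

CASES = 1200                  # notified cases for one age/sex group
MULT = (1.5, 2.5)             # under-ascertainment multiplier range
DW = (0.05, 0.15)             # disability weight range for acute illness
DURATION_YR = 0.02            # mean duration of illness (years)
CFR = 0.001                   # case fatality ratio
LIFE_EXP_REMAINING = 40.0     # residual life expectancy at death (years)

def one_draw() -> float:
    cases = CASES * random.uniform(*MULT)
    yld = cases * random.uniform(*DW) * DURATION_YR  # years lived w/ disability
    yll = cases * CFR * LIFE_EXP_REMAINING           # years of life lost
    return yld + yll

draws = sorted(one_draw() for _ in range(10_000))
mean = sum(draws) / len(draws)
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
print(f"DALYs: mean {mean:.1f} (95% UI {lo:.1f}-{hi:.1f})")
```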

  5. A Software Tool for Estimation of Burden of Infectious Diseases in Europe Using Incidence-Based Disability Adjusted Life Years.

    Directory of Open Access Journals (Sweden)

    Edoardo Colzani

    Full Text Available The burden of disease framework facilitates the assessment of the health impact of diseases through the use of summary measures of population health such as Disability-Adjusted Life Years (DALYs). However, calculating, interpreting and communicating the results of studies using this methodology poses a challenge. The aim of the Burden of Communicable Disease in Europe (BCoDE) project is to summarize the impact of communicable disease in the European Union and European Economic Area Member States (EU/EEA MS). To meet this goal, a user-friendly software tool (BCoDE toolkit), was developed. This stand-alone application, written in C++, is open-access and freely available for download from the website of the European Centre for Disease Prevention and Control (ECDC). With the BCoDE toolkit, one can calculate DALYs by simply entering the age group- and sex-specific number of cases for one or more of selected sets of 32 communicable diseases (CDs) and 6 healthcare associated infections (HAIs). Disease progression models (i.e., outcome trees) for these communicable diseases were created following a thorough literature review of their disease progression pathway. The BCoDE toolkit runs Monte Carlo simulations of the input parameters and provides disease-specific results, including 95% uncertainty intervals, and permits comparisons between the different disease models entered. Results can be displayed as mean and median overall DALYs, DALYs per 100,000 population, and DALYs related to mortality vs. disability. Visualization options summarize complex epidemiological data, with the goal of improving communication and knowledge transfer for decision-making.

  6. The combined use of Green-Ampt model and Curve Number method as an empirical tool for loss estimation

    Science.gov (United States)

    Petroselli, A.; Grimaldi, S.; Romano, N.

    2012-12-01

    The Soil Conservation Service - Curve Number (SCS-CN) method is a popular rainfall-runoff model widely used to estimate losses and direct runoff from a given rainfall event, but its use is not appropriate at sub-daily time resolution. To overcome this drawback, a mixed procedure, referred to as CN4GA (Curve Number for Green-Ampt), was recently developed that incorporates the Green-Ampt (GA) infiltration model and aims to distribute in time the information provided by the SCS-CN method. The main concept of the proposed mixed procedure is to use the initial abstraction and the total volume given by the SCS-CN to calibrate the Green-Ampt soil hydraulic conductivity parameter. The procedure is here applied to a real case study and a sensitivity analysis concerning the remaining parameters is presented; results show that the CN4GA approach is an ideal candidate for rainfall excess analysis at sub-daily time resolution, in particular for ungauged basins lacking discharge observations.
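
    The SCS-CN volume that CN4GA distributes in time follows from the standard curve-number equations; a minimal sketch (the CN value and storm depth are arbitrary examples, not data from the study):

```python
# Standard SCS-CN event runoff volume, which CN4GA uses as its calibration
# target. Units are mm.

def scs_cn_runoff(p_mm: float, cn: float, lam: float = 0.2) -> float:
    """Direct runoff Q for event rainfall P and curve number CN."""
    s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
    ia = lam * s                  # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

print(scs_cn_runoff(60.0, 75))    # e.g. 60 mm storm on CN 75 soil -> ~14.5 mm
```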

  7. The South Wilmington Area remedial cost estimating methodology (RCEM) -- A planning tool and reality check for brownfield development

    International Nuclear Information System (INIS)

    Yancheski, T.B.; Swanson, J.E.

    1996-01-01

    The South Wilmington Area (SWA), which is comprised of 200 acres of multi-use urban lowlands adjacent to the Christina River, is a brownfields area that has been targeted for redevelopment/restoration as part of a major waterfront revitalization project for the City of Wilmington, Delaware. The vision for this riverfront development, which is being promoted by a state-funded development corporation, includes plans for a new harbor, convention and entertainment facilities, upscale residences, an urban wildlife refuge, and the restoration of the Christina River. However, the environmental quality of the SWA has been seriously impacted by an assortment of historic and current heavy industrial land uses since the late 1800s, and extensive environmental cleanup of this area will be required as part of any redevelopment plan. Given that the environmental cleanup cost will be a major factor in determining the overall economic feasibility of brownfield development in the SWA, a reliable means of estimating potential preliminary remedial costs, without the expense of costly investigative and engineering studies, was needed to assist with this redevelopment initiative. The primary chemicals-of-concern (COCs) area-wide are lead and petroleum compounds; however, there are hot-spot occurrences of polynuclear aromatic hydrocarbons (PAHs), PCBs, and other heavy metals such as arsenic and mercury

  8. A method for estimating maternal and newborn lives saved from health-related investments funded by the UK government Department for International Development using the Lives Saved Tool

    Directory of Open Access Journals (Sweden)

    Ingrid K. Friberg

    2017-11-01

    Full Text Available Abstract Background In 2010, the UK Government Department for International Development (DFID) committed through its 'Framework for results for reproductive, maternal and newborn health (RMNH)' to save 50,000 maternal lives and 250,000 newborn lives by 2015. They also committed to monitoring the performance of this portfolio of investments to demonstrate transparency and accountability. Methods currently available to directly measure lives saved are cost-, time-, and labour-intensive. The gold standard for calculating the total number of lives saved would require measuring mortality with large scale population based surveys or annual vital events surveillance. Neither is currently available in all low- and middle-income countries. Estimating the independent effect of DFID support relative to all other effects on health would also be challenging. Methods The Lives Saved Tool (LiST) is an evidence based software for modelling the effect of changes in health intervention coverage on reproductive, maternal, newborn and child mortality. A multi-country LiST-based analysis protocol was developed to retrospectively assess the total annual number of maternal and newborn lives saved from DFID aid programming in low- and middle-income countries. Results Annual LiST analyses using the latest program data from DFID country offices were conducted between 2013 and 2016, estimating the annual number of maternal and neonatal lives saved across 2010–2015. For each country, independent project results were aggregated into health intervention coverage estimates, with and in the absence of DFID funding. More than 80% of reported projects were suitable for inclusion in the analysis, with 151 projects analysed in the 2016 analysis. Between 2010 and 2014, it is estimated that DFID contributed to saving the lives of 15,000 women in pregnancy and childbirth with health programming and 88,000 with family planning programming. It is estimated that DFID health programming

  9. A method for estimating maternal and newborn lives saved from health-related investments funded by the UK government Department for International Development using the Lives Saved Tool.

    Science.gov (United States)

    Friberg, Ingrid K; Baschieri, Angela; Abbotts, Jo

    2017-11-07

    In 2010, the UK Government Department for International Development (DFID) committed through its 'Framework for results for reproductive, maternal and newborn health (RMNH)' to save 50,000 maternal lives and 250,000 newborn lives by 2015. They also committed to monitoring the performance of this portfolio of investments to demonstrate transparency and accountability. Methods currently available to directly measure lives saved are cost-, time-, and labour-intensive. The gold standard for calculating the total number of lives saved would require measuring mortality with large scale population based surveys or annual vital events surveillance. Neither is currently available in all low- and middle-income countries. Estimating the independent effect of DFID support relative to all other effects on health would also be challenging. The Lives Saved Tool (LiST) is an evidence based software for modelling the effect of changes in health intervention coverage on reproductive, maternal, newborn and child mortality. A multi-country LiST-based analysis protocol was developed to retrospectively assess the total annual number of maternal and newborn lives saved from DFID aid programming in low- and middle-income countries. Annual LiST analyses using the latest program data from DFID country offices were conducted between 2013 and 2016, estimating the annual number of maternal and neonatal lives saved across 2010-2015. For each country, independent project results were aggregated into health intervention coverage estimates, with and in the absence of DFID funding. More than 80% of reported projects were suitable for inclusion in the analysis, with 151 projects analysed in the 2016 analysis. Between 2010 and 2014, it is estimated that DFID contributed to saving the lives of 15,000 women in pregnancy and childbirth with health programming and 88,000 with family planning programming. It is estimated that DFID health programming contributed to saving 187,000 newborn lives. It is

  10. Estimating the yin-yang nature of Western herbs: a potential tool based on antioxidation-oxidation theory.

    Science.gov (United States)

    Gilca, Marilena; Gaman, Laura; Lixandru, Daniela; Stoian, Irina

    2014-01-01

    One of the biggest obstacles to progress in traditional Chinese medicine (TCM) development in Western countries is the difficulty of applying the traditional concepts to Western medicinal plants, which are not traditionally described in ancient literature. During recent years, new advances in the field of understanding Yin/Yang aspects from a modern bioscientific point of view have led to the conclusion that antioxidation-oxidation concepts might mirror a Yin-Yang relationship. This study was intended to integrate the Yin-Yang theory of traditional Chinese medicine with modern antioxidation-oxidation theory, and to propose a biochemical tool based on redox parameters (e.g. antioxidant capacity, chemiluminescence (CL) signal-inducing capacity), usable for the classification of Western medicinal plants from a Yin/Yang perspective. Trolox equivalent antioxidant capacity (TEAC) of six vegetal aqueous extracts (Symphitum officinalae (radix)-SYM, Inula helenium (radix)-INU, Calendula officinalis (flores)-CAL, Angelica arhanghelica (folium)-ANG(F), Angelica arhanghelica (radix)-ANG(R), Ecbalium Elaterium (fruits)-ECB) and luminol-enhanced chemiluminescence of PMNL on addition of these vegetal extracts were measured. Percentages from the maximal or minimal values obtained were calculated for each extract (TEAC%, PMNL stimulation%, PMNL inhibition%, relative speed of action% (RSA%)), and a specific Yin-Yang significance was assigned to each relative parameter. In the end, an integration of all the relative values was done, in order to find a global "Yin" or "Yang" trait of each vegetal extract. TEAC decreased in the following order: SYM > INU > CAL > ANG(F) > ANG(R) > ECB. Three vegetal extracts (SYM > INU > ECB) decreased the luminol-enhanced chemiluminescence of PMNL, two (ANG(R) > ANG(F)) increased it, while one (CAL) had a dual effect. After the integration of the percentages, CAL was found to have a global "Yang" trait, while the rest of the plants had a global "Yin
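
    The percentage-and-sign integration described above amounts to simple signed averaging; an illustrative sketch follows (parameter values and weights are invented, and the actual study's assignment rules may differ):

```python
# Illustrative integration of relative redox parameters into a global
# Yin/Yang trait, following the percentage-and-sign logic described above.
# Values are placeholders, not the measured TEAC/chemiluminescence data.

extracts = {
    # TEAC% (antioxidant capacity, read as Yin, +) and
    # PMNL CL change% (inhibition read as Yin, +; stimulation as Yang, -)
    "SYM": {"teac_pct": 100.0, "pmnl_pct": +90.0},
    "ANG(R)": {"teac_pct": 20.0, "pmnl_pct": -80.0},
}

for name, p in extracts.items():
    score = (p["teac_pct"] + p["pmnl_pct"]) / 2.0
    trait = "Yin" if score > 0 else "Yang"
    print(f"{name}: global trait {trait} ({score:+.0f})")
```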

  11. The Asset Drivers, Well-being Interaction Matrix (ADWIM): A participatory tool for estimating future impacts on ecosystem services and livelihoods

    Directory of Open Access Journals (Sweden)

    T.D. Skewes

    2016-01-01

    Full Text Available Building an effective response for communities to climate change requires decision-support tools that deliver information which stakeholders find relevant for exploring potential short and long-term impacts on livelihoods. Established principles suggest that to successfully communicate scientific information, such tools must be transparent, replicable, relevant, credible, flexible, affordable and unbiased. In data-poor contexts typical of developing countries, they should also be able to integrate stakeholders’ knowledge and values, empowering them in the process. We present a participatory tool, the Asset Drivers Well-being Interaction Matrix (ADWIM), which estimates future impacts on ecosystem goods and services (EGS) and communities’ well-being through the cumulative effects of system stressors. ADWIM consists of two modelling steps: an expert-informed, cumulative impact assessment for EGS; which is then integrated with a stakeholder-informed EGS valuation process carried out during adaptation planning workshops. We demonstrate the ADWIM process using examples from Nusa Tenggara Barat Province (NTB) in eastern Indonesia. The semi-quantitative results provide an assessment of the relative impacts on EGS and human well-being under the ‘Business as Usual’ scenario of climate change and human population growth at different scales in NTB, information that is subsequently used for designing adaptation strategies. Based on these experiences, we discuss the relative strengths and weaknesses of ADWIM relative to principles of effective science communication and ecosystem services modelling. ADWIM’s apparent attributes as an analysis, decision support and communication tool promote its utility for participatory adaptation planning. We also highlight its relevance as a ‘boundary object’ to provide learning and reflection about the current and likely future importance of EGS to livelihoods in NTB.
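
    The two ADWIM modelling steps can be sketched as simple matrix arithmetic; stressor names, impact scores, and valuation weights below are invented placeholders, not NTB workshop data:

```python
# Sketch of the two ADWIM steps: an expert-scored stressor-by-EGS impact
# matrix is summed into a cumulative impact per ecosystem good/service (EGS),
# then weighted by stakeholder valuations to give a relative well-being
# impact. All scores are invented placeholders.

egs = ["reef fish", "coral habitat", "fresh water"]

# expert impact scores (-3 strong decline .. +3 strong increase) of each
# stressor (rows) on each EGS (columns)
impact = {
    "sea level rise":    [-1, -2,  0],
    "warming":           [-2, -3,  0],
    "population growth": [-1, -1, -2],
}

# stakeholder valuation weights of each EGS for local livelihoods
value = [0.5, 0.3, 0.2]

cumulative = [sum(row[j] for row in impact.values()) for j in range(len(egs))]
wellbeing = sum(c * v for c, v in zip(cumulative, value))
print(dict(zip(egs, cumulative)))
print("relative well-being impact:", wellbeing)
```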

  12. Mixing the Green-Ampt model and Curve Number method as an empirical tool for rainfall excess estimation in small ungauged catchments.

    Science.gov (United States)

    Grimaldi, S.; Petroselli, A.; Romano, N.

    2012-04-01

    The Soil Conservation Service - Curve Number (SCS-CN) method is a popular rainfall-runoff model that is widely used to estimate direct runoff from small and ungauged basins. The SCS-CN is a simple and valuable approach to estimate the total stream-flow volume generated by a storm rainfall, but it was developed to be used with daily rainfall data. To overcome this drawback, we propose to include the Green-Ampt (GA) infiltration model into a mixed procedure, referred to as CN4GA (Curve Number for Green-Ampt), aiming to distribute in time the information provided by the SCS-CN method so as to estimate sub-daily incremental rainfall excess. For a given storm, the computed SCS-CN total net rainfall amount is used to calibrate the soil hydraulic conductivity parameter of the Green-Ampt model. The proposed procedure was evaluated by analyzing 100 rainfall-runoff events observed in four small catchments of varying size. CN4GA appears to be an encouraging tool for predicting the net rainfall peak and duration values and has shown, at least for the test cases considered in this study, better agreement with observed hydrographs than the classic SCS-CN method.
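
    The calibration step described above can be sketched as a one-parameter search: pick the Green-Ampt conductivity whose simulated infiltration reproduces the SCS-CN rainfall excess. The sketch below uses explicit time stepping, bisection, and placeholder soil and storm parameters, and ignores ponding-time refinements of the full GA model:

```python
# Sketch of the CN4GA calibration idea: find the Green-Ampt hydraulic
# conductivity Ks (mm/h) whose cumulative infiltration over the storm leaves
# exactly the SCS-CN net rainfall as excess. All parameters are placeholders.

def ga_infiltration(ks, psi=110.0, dtheta=0.3, i_mm_h=20.0, t_end_h=3.0):
    """Cumulative Green-Ampt infiltration (mm) under constant rainfall."""
    f, dt, t = 1e-6, 0.001, 0.0
    while t < t_end_h:
        cap = ks * (1.0 + psi * dtheta / f)   # infiltration capacity
        f += min(cap, i_mm_h) * dt            # supply-limited intake
        t += dt
    return f

def calibrate_ks(target_excess_mm, rain_mm=60.0, lo=0.1, hi=50.0):
    for _ in range(60):                       # bisection on Ks
        mid = 0.5 * (lo + hi)
        excess = rain_mm - ga_infiltration(mid)
        if excess > target_excess_mm:
            lo = mid                          # too much excess -> raise Ks
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(calibrate_ks(target_excess_mm=14.5))   # target from the SCS-CN step
```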

  13. [Determine the patient's position towards psychiatric care: a simple tool to estimate the alliance and the motivation].

    Science.gov (United States)

    Versaevel, C; Samama, D; Jeanson, R; Lajugie, C; Dufeutrel, L; Defromont, L; Lebouteiller, V; Danel, T; Duhamel, A; Genin, M; Salleron, J; Cottencin, O

    2013-09-01

    For the brief systemic therapy (BST), the evaluation of the patient's position towards care is a prerequisite to psychotherapy. Three positions of the patient are described. The "tourist's" position: the patient claims to have no problem and does not suffer; someone asks him to make an appointment, sometimes with threats. The "complaint's" position: the patient claims to suffer, but attributes the responsibility for this suffering to others. These two positions are not good starting points for therapy. The "customer's" position differs from both previous positions: the "customer" considers that he has a psychological problem which depends on him, and he is motivated to resolve it. In theory, the "customer" is more motivated and the therapeutic alliance is better. It is for this reason that the BST estimates the position of the patient first, to bring the patient to the "customer's" position. The objective of this study is to assess an interview which identifies the patient's position towards care, and to validate the theoretical elaborations of the brief systemic therapy. The study concerns the follow-up of outpatients who consult a psychiatrist for the first time. The evaluation of the patients assesses their position towards care using the Tourist-Complaint-Customer (TCC) inventory, their suffering, the therapeutic alliance (Haq-2 scale) and their compliance during care. The evaluation by the psychiatrists assesses the suffering perceived, the motivation perceived and the diagnoses according to the DSM. The typology of these patients is made up of one half "complaint", one quarter "tourist" and one quarter "customer". The "customer's" position is correlated with the therapeutic alliance and the motivation perceived by the psychiatrist. The motivation perceived by the psychiatrist is correlated with the therapeutic alliance. These results correspond to the theoretical elaborations of the BST. The TCC inventory provides information on the motivation and

  14. Use of modified threat reduction assessments to estimate success of conservation measures within and adjacent to Kruger National Park, South Africa.

    Science.gov (United States)

    Anthony, Brandon P

    2008-12-01

    The importance of biodiversity as natural capital for economic development and sustaining human welfare is well documented. Nevertheless, resource degradation rates and persistent deterioration of human welfare in developing countries are increasingly worrisome. Developing effective monitoring and evaluation schemes and measuring biodiversity loss continue to pose unique challenges, particularly when there is a paucity of historical data. Threat reduction assessment (TRA) has been proposed as a method to measure conservation success and as a proxy measurement of conservation impact, monitoring threats to resources rather than changes to biological parameters themselves. This tool is considered a quick, practical alternative to more cost- and time-intensive approaches, but has inherent weaknesses. I conducted TRAs to evaluate the effectiveness of Kruger National Park (KNP) and Limpopo Province, South Africa, in mitigating threats to biodiversity from 1994 to 2004 in 4 geographical areas. I calculated TRA index values in these TRAs by using the original scoring developed by Margoluis and Salafsky (2001) and a modified scoring system that assigned negative mitigation values to incorporate new or worsening threats. Threats were standardized to allow comparisons across the sites. Modified TRA index values were significantly lower than values derived from the original scoring exercise. Five of the 11 standardized threats were present in all 4 assessment areas, 2 were restricted to KNP, 2 to Limpopo Province, and 2 only to Malamulele municipality. These results indicate, first, the need to integrate negative mitigation values into TRA scoring. By including negative values, investigators will be afforded a more accurate picture of biodiversity threats and of temporal and spatial trends across sites. Where the original TRA scoring was used to measure conservation success, reevaluation of these cases with the modified scoring is recommended. Second, practitioners must
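
    The modified scoring described above can be illustrated with a small calculation. Threat names, rankings, and mitigation percentages below are invented, and the index formula follows the usual TRA form (sum of ranking × percent of threat met, over total ranking):

```python
# Sketch of a threat reduction assessment index with the modification
# described above: mitigation scores may be negative for new or worsening
# threats. All rankings and scores are illustrative.

threats = [
    # (name, total ranking = area + intensity + urgency, fraction of threat met)
    ("poaching",        30, 0.60),
    ("land conversion", 24, 0.20),
    ("alien plants",    18, -0.40),   # worsening threat -> negative mitigation
]

raw = sum(rank * met for _, rank, met in threats)
total = sum(rank for _, rank, _ in threats)
tra_index = 100.0 * raw / total
print(f"Modified TRA index: {tra_index:.1f}%")
```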

  15. Development and validation of a food photography manual, as a tool for estimation of food portion size in epidemiological dietary surveys in Tunisia

    Directory of Open Access Journals (Sweden)

    Mongia Bouchoucha

    2016-08-01

    Full Text Available Background: Estimation of food portion sizes has always been a challenge in dietary studies on free-living individuals. The aim of this work was to develop and validate a food photography manual to improve the accuracy of the estimated size of consumed food portions. Methods: A manual was compiled from digital photos of foods commonly consumed by the Tunisian population. The food was cooked and weighed before taking digital photographs of three portion sizes. The manual was validated by comparing the method of 24-hour recall (using photos) to the reference method [food weighing (FW)]. In both the methods, the comparison focused on food intake amounts as well as nutritional issues. Validity was assessed by Bland–Altman limits of agreement. In total, 31 male and female volunteers aged 9–89 participated in the study. Results: We focused on eight food categories and compared their estimated amounts (using the 24-hour recall method) to those actually consumed (using FW). Animal products and sweets were underestimated, whereas pasta, bread, vegetables, fruits, and dairy products were overestimated. However, the difference between the two methods is not statistically significant except for pasta (p<0.05) and dairy products (p<0.05). The coefficient of correlation between the two methods is highly significant, ranging from 0.876 for pasta to 0.989 for dairy products. Nutrient intake calculated for both methods showed insignificant differences except for fat (p<0.001) and dietary fiber (p<0.05). A highly significant correlation was observed between the two methods for all micronutrients. The test agreement highlights the lack of difference between the two methods. Conclusion: The difference between the 24-hour recall method using digital photos and the weighing method is acceptable. Our findings indicate that the food photography manual can be a useful tool for quantifying food portion sizes in epidemiological dietary surveys.

  16. Development and validation of a food photography manual, as a tool for estimation of food portion size in epidemiological dietary surveys in Tunisia.

    Science.gov (United States)

    Bouchoucha, Mongia; Akrout, Mouna; Bellali, Hédia; Bouchoucha, Rim; Tarhouni, Fadwa; Mansour, Abderraouf Ben; Zouari, Béchir

    2016-01-01

    Estimation of food portion sizes has always been a challenge in dietary studies on free-living individuals. The aim of this work was to develop and validate a food photography manual to improve the accuracy of the estimated size of consumed food portions. A manual was compiled from digital photos of foods commonly consumed by the Tunisian population. The food was cooked and weighed before taking digital photographs of three portion sizes. The manual was validated by comparing the method of 24-hour recall (using photos) to the reference method [food weighing (FW)]. In both the methods, the comparison focused on food intake amounts as well as nutritional issues. Validity was assessed by Bland-Altman limits of agreement. In total, 31 male and female volunteers aged 9-89 participated in the study. We focused on eight food categories and compared their estimated amounts (using the 24-hour recall method) to those actually consumed (using FW). Animal products and sweets were underestimated, whereas pasta, bread, vegetables, fruits, and dairy products were overestimated. However, the difference between the two methods is not statistically significant except for pasta (p<0.05) and dairy products (p<0.05). The coefficient of correlation between the two methods is highly significant, ranging from 0.876 for pasta to 0.989 for dairy products. Nutrient intake calculated for both methods showed insignificant differences except for fat (p<0.001) and dietary fiber (p<0.05). A highly significant correlation was observed between the two methods for all micronutrients. The test agreement highlights the lack of difference between the two methods. The difference between the 24-hour recall method using digital photos and the weighing method is acceptable. Our findings indicate that the food photography manual can be a useful tool for quantifying food portion sizes in epidemiological dietary surveys.
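
    The Bland-Altman validation used in both versions of this study reduces to a bias and limits-of-agreement calculation; a minimal sketch on made-up paired data:

```python
# Minimal Bland-Altman limits of agreement between the photo-assisted
# 24-hour recall and weighed portions, on invented paired data (grams).

import statistics as st

recall  = [210, 150, 95, 320, 180, 240]   # photo-assisted estimates
weighed = [200, 160, 90, 300, 190, 230]   # reference weighed amounts

diffs = [r - w for r, w in zip(recall, weighed)]
bias = st.mean(diffs)
sd = st.stdev(diffs)
print(f"bias {bias:.1f} g, limits of agreement "
      f"{bias - 1.96 * sd:.1f} to {bias + 1.96 * sd:.1f} g")
```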

  17. Estimate of the technological costs of CO{sub 2} emission reductions in passenger cars. Emission reduction potentials and their costs; Technikkostenschaetzung fuer die CO{sub 2}-Emissionsminderung bei Pkw. Emissionsminderungspotenziale und ihre Kosten

    Energy Technology Data Exchange (ETDEWEB)

    Herbener, Reinhard; Jahn, Helge; Wetzel, Frank [Umweltbundesamt, Dessau-Rosslau (Germany). Fachgebiet I 3.2 - Schadstoffminderung und Energieeinsparung im Verkehr

    2008-08-06

    The Federal Environment Agency (Umweltbundesamt) set out to identify the current fuel consumption reduction potential and the costs of efficiency-enhancing measures for passenger cars. For this purpose, an extensive literature search was carried out, and experts from research institutes and from the automobile supplier industry were consulted. The results are published in tabular form. (orig.)

  18. A business intelligence approach using web search tools and online data reduction techniques to examine the value of product-enabled services

    DEFF Research Database (Denmark)

    Tanev, Stoyan; Liotta, Giacomo; Kleismantas, Andrius

    2015-01-01

    in Canada and Europe. It adopts an innovative methodology based on online textual data that could be implemented in advanced business intelligence tools aiming at the facilitation of innovation, marketing and business decision making. Combinations of keywords referring to different aspects of service value......-service innovation as a competitive advantage on the marketplace. On the other hand, the focus of EU firms on innovative hybrid offerings is not explicitly related to business differentiation and competitiveness....

  19. The VeTOOLS Project: an example of how to strengthen collaboration between scientists and Civil Protections in disaster risk reduction

    Science.gov (United States)

    Marti, Joan; Bartolini, Stefania; Becerril, Laura

    2016-04-01

    VeTOOLS is a project funded by the European Commission's Humanitarian Aid and Civil Protection department (ECHO), and aims at creating an integrated software platform specially designed to assess and manage volcanic risk. The project facilitates interaction and cooperation between scientists and Civil Protection Agencies in order to share, unify, and exchange procedures, methodologies and technologies to effectively reduce the impacts of volcanic disasters. The project aims at 1) improving and developing volcanic risk assessment and management capacities in active volcanic regions; 2) developing universal methodologies, scenario definitions, response strategies and alert protocols to cope with the full range of volcanic threats; 4) improving quantitative methods and tools for vulnerability and risk assessment; and 5) defining thresholds and protocols for civil protection. With these objectives, the VeTOOLS project addresses two of the resolutions of the Sendai Framework: i) provide guidance on methodologies and standards for risk assessments, disaster risk modelling and the use of data; ii) promote and support the availability and application of science and technology to decision-making. It thereby offers a good example of how close collaboration between science and civil protection is an effective way to contribute to DRR. European Commission ECHO Grant SI2.695524

  20. Analysis of metal artifact reduction tools for dental hardware in CT scans of the oral cavity: kVp, iterative reconstruction, dual-energy CT, metal artifact reduction software: does it make a difference?

    Energy Technology Data Exchange (ETDEWEB)

    Crop, An de; Hoof, Tom van; Herde, Katharina d'; Thierens, Hubert; Bacher, Klaus [Ghent University, Department of Basic Medical Sciences, Gent (Belgium); Casselman, Jan; Vereecke, Elke; Bossu, Nicolas [AZ Sint Jan Bruges Ostend AV, Department of Radiology, Bruges (Belgium); Dierens, Melissa [Ghent University, Dental School, Unit for Oral and Maxillofacial Imaging, Ghent (Belgium); Pamplona, Jaime [Hospital Lisboa Central, Department of Neuroradiology, Lisbon (Portugal)

    2015-08-15

    Metal artifacts may negatively affect radiologic assessment in the oral cavity. The aim of this study was to evaluate different metal artifact reduction techniques for metal artifacts induced by dental hardware in CT scans of the oral cavity. Clinical image quality was assessed using a Thiel-embalmed cadaver. A Catphan phantom and a polymethylmethacrylate (PMMA) phantom were used to evaluate physical-technical image quality parameters such as artifact area, artifact index (AI), and contrast detail (IQF{sub inv}). Metal cylinders were inserted in each phantom to create metal artifacts. CT images of both phantoms and the Thiel-embalmed cadaver were acquired on a multislice CT scanner using 80, 100, 120, and 140 kVp; model-based iterative reconstruction (Veo); and synthesized monochromatic keV images with and without metal artifact reduction software (MARs). Four radiologists assessed the clinical image quality using an image criteria score (ICS). A significant influence of increasing kVp and the use of Veo was found on clinical image quality (p = 0.007 and p = 0.014, respectively). Application of MARs resulted in a smaller artifact area (p < 0.05). However, MARs-reconstructed images resulted in lower ICS. Of all investigated techniques, Veo showed the most promise, with a significant improvement of both the clinical and physical-technical image quality without adversely affecting contrast detail. MARs reconstruction in CT images of the oral cavity to reduce dental hardware metallic artifacts is not sufficient and may even adversely influence the image quality. (orig.)
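
    The artifact index mentioned above is commonly computed as the excess standard deviation of a ROI near the metal relative to an artifact-free reference ROI; the sketch below uses that common form with illustrative HU values, since the exact definition used in the study is not given in the record:

```python
# Sketch of an artifact index (AI) for physical-technical image quality:
# AI = sqrt(sd_artifact^2 - sd_reference^2). HU values are illustrative.

import math
import statistics as st

roi_near_metal = [120, -80, 310, -150, 95, 40]   # HU in the streak region
roi_reference  = [35, 42, 38, 45, 40, 41]        # HU in an artifact-free region

ai = math.sqrt(max(0.0, st.pstdev(roi_near_metal) ** 2
                        - st.pstdev(roi_reference) ** 2))
print(f"Artifact index: {ai:.1f} HU")
```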

  1. Influenza vaccination coverage estimates in the fee-for-service Medicare beneficiary population 2006 - 2016: Using population-based administrative data to support a geographic based near real-time tool.

    Science.gov (United States)

    Shen, Angela K; Warnock, Rob; Brereton, Stephaeno; McKean, Stephen; Wernecke, Michael; Chu, Steve; Kelman, Jeffrey A

    2018-04-11

    Older adults are at great risk of developing serious complications from seasonal influenza. We explore vaccination coverage estimates in the Medicare population through the use of administrative claims data and describe a tool designed to help shape outreach efforts and inform strategies to help raise influenza vaccination rates. This interactive mapping tool uses claims data to compare vaccination levels between geographic (i.e., state, county, zip code) and demographic (i.e., race, age) groups at different points in a season. Trends can also be compared across seasons. Utilization of this tool can assist key actors interested in prevention - medical groups, health plans, hospitals, and state and local public health authorities - in supporting strategies for reaching pools of unvaccinated beneficiaries where general national population estimates of coverage are less informative. Implementing evidence-based tools can be used to address persistent racial and ethnic disparities and prevent a substantial number of influenza cases and hospitalizations.

  2. Identification of abiotic and biotic reductive dechlorination in a chlorinated ethene plume after thermal source remediation by means of isotopic and molecular biology tools

    DEFF Research Database (Denmark)

    Badin, Alice; Broholm, Mette Martina; Jacobsen, Carsten S.

    2016-01-01

    Dual C-Cl isotope analysis together with the almost absent VC 13C depletion in comparison to cDCE 13C depletion suggested that cDCE was subject to abiotic degradation due to the presence of pyrite, possible surface-bound iron (II) or reduced iron sulphides in the downgradient part of the plume. This interpretation...... reduced redox conditions which favor active reductive dechlorination and/or may lead to a series of redox reactions which may consecutively trigger biotically induced abiotic degradation. Finally, this study illustrates the valuable complementary application of compound-specific isotopic analysis combined

  3. Potential for waste reduction

    International Nuclear Information System (INIS)

    Warren, J.L.

    1990-01-01

    The author focuses on wastes considered hazardous under the Resource Conservation and Recovery Act. This chapter discusses wastes that are of interest as well as the factors affecting the quantity of waste considered available for waste reduction. Estimates are provided of the quantities of wastes generated. Estimates of the potential for waste reduction are meaningful only to the extent that one can understand the amount of waste actually being generated. Estimates of waste reduction potential are summarized from a variety of government and nongovernment sources

  4. Identification of abiotic and biotic reductive dechlorination in a chlorinated ethene plume after thermal source remediation by means of isotopic and molecular biology tools

    DEFF Research Database (Denmark)

    Badin, Alice; Broholm, Mette Martina; Jacobsen, Carsten S.

    2016-01-01

    Thermal tetrachloroethene (PCE) remediation by steam injection in a sandy aquifer led to the release of dissolved organic carbon (DOC) from aquifer sediments resulting in more reduced redox conditions, accelerated PCE biodegradation, and changes in microbial populations. These changes were...... documented by comparing data collected prior to the remediation event and eight years later. Based on the premise that dual C-Cl isotope slopes reflect ongoing degradation pathways, the slopes associated with PCE and TCE suggest the predominance of biotic reductive dechlorination near the source area. PCE...... is supported by the relative lack of Dhc in the downgradient part of the plume. The results of this study show that thermal remediation can enhance the biodegradation of chlorinated ethenes, and that this effect can be traced to the mobilisation of DOC due to steam injection. This, in turn, results in more...

  5. The effect of pharmacogenetic profiling with a clinical decision support tool on healthcare resource utilization and estimated costs in the elderly exposed to polypharmacy.

    Science.gov (United States)

    Brixner, D; Biltaji, E; Bress, A; Unni, S; Ye, X; Mamiya, T; Ashcraft, K; Biskupiak, J

    2016-01-01

    To compare healthcare resource utilization (HRU) and clinical decision-making for elderly patients based on cytochrome P450 (CYP) pharmacogenetic testing and the use of a comprehensive medication management clinical decision support tool (CDST), to a cohort of similar non-tested patients. An observational study compared a prospective cohort of patients ≥65 years subjected to pharmacogenetic testing to a propensity score (PS) matched historical cohort of untested patients in a claims database. Patients had a prescribed medication or dose change of at least one of 61 oral drugs or combinations of ≥3 drugs at enrollment. Four-month HRU outcomes examined included hospitalizations, emergency department (ED) and outpatient visits, and provider acceptance of test recommendations. Costs were estimated using national data sources. There were 205 tested patients PS matched to 820 untested patients. Hospitalization rate was 9.8% in the tested group vs. 16.1% in the untested group (RR = 0.61, 95% CI = 0.39-0.95, p = 0.027), ED visit rate was 4.4% in the tested group vs. 15.4% in the untested group (RR = 0.29, 95% CI = 0.15-0.55, p = 0.0002) and outpatient visit rate was 71.7% in the tested group vs. 36.5% in the untested group (RR = 1.97, 95% CI = 1.74-2.23). The majority of providers (95%) considered the test helpful and 46% followed CDST-provided recommendations. Patients who were CYP DNA tested and treated according to the personalized prescribing system had a significant decrease in hospitalizations and emergency department visits, resulting in potential cost savings. Providers had a high satisfaction rate with the clinical utility of the system and followed recommendations when appropriate.
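
    The reported risk ratios follow the usual two-proportion arithmetic; the sketch below back-calculates approximate counts from the stated percentages (they are not the study's raw data) and reproduces the hospitalization RR and its log-normal 95% CI:

```python
# Risk-ratio arithmetic behind the reported hospitalization outcome; counts
# are approximations back-calculated from the percentages in the abstract.

import math

a, n1 = 20, 205     # hospitalized / total, tested cohort (~9.8%)
b, n2 = 132, 820    # hospitalized / total, untested cohort (~16.1%)

rr = (a / n1) / (b / n2)
se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)   # SE of log(RR)
lo, hi = rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")   # ~0.61 (0.39-0.95)
```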

  6. Application of a Lifestyle-Based Tool to Estimate Premature Cardiovascular Disease Events in Young Adults: The Coronary Artery Risk Development in Young Adults (CARDIA) Study.

    Science.gov (United States)

    Gooding, Holly C; Ning, Hongyan; Gillman, Matthew W; Shay, Christina; Allen, Norrina; Goff, David C; Lloyd-Jones, Donald; Chiuve, Stephanie

    2017-09-01

    Few tools exist for assessing the risk for early atherosclerotic cardiovascular disease (ASCVD) events in young adults. To assess the performance of the Healthy Heart Score (HHS), a lifestyle-based tool that estimates ASCVD events in older adults, for ASCVD events occurring before 55 years of age. This prospective cohort study included 4893 US adults aged 18 to 30 years from the Coronary Artery Risk Development in Young Adults (CARDIA) study. Participants underwent measurement of lifestyle factors from March 25, 1985, through June 7, 1986, and were followed up for a median of 27.1 years (interquartile range, 26.9-27.2 years). Data for this study were analyzed from February 24 through December 12, 2016. The HHS includes age, smoking status, body mass index, alcohol intake, exercise, and a diet score composed of self-reported daily intake of cereal fiber, fruits and/or vegetables, nuts, sugar-sweetened beverages, and red and/or processed meats. The HHS in the CARDIA study was calculated using sex-specific equations produced by its derivation cohorts. The ability of the HHS to assess the 25-year risk for ASCVD (death from coronary heart disease, nonfatal myocardial infarction, and fatal or nonfatal ischemic stroke) in the total sample, in race- and sex-specific subgroups, and in those with and without clinical ASCVD risk factors at baseline. Model discrimination was assessed with the Harrell C statistic; model calibration, with Greenwood-Nam-D'Agostino statistics. The study population of 4893 participants included 2205 men (45.1%) and 2688 women (54.9%) with a mean (SD) age at baseline of 24.8 (3.6) years; 2483 (50.7%) were black; and 427 (8.7%) had at least 1 clinical ASCVD risk factor (hypertension, hyperlipidemia, or diabetes types 1 and 2). Among these participants, 64 premature ASCVD events occurred in women and 99 in men. The HHS showed moderate discrimination for ASCVD risk assessment in this diverse population of mostly healthy young adults (C statistic, 0
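
    Model discrimination via the Harrell C statistic, as used here, can be computed directly from pairwise comparisons; a minimal sketch on invented right-censored data (ties in follow-up time are ignored for brevity):

```python
# Minimal Harrell C statistic for right-censored outcomes, as used to judge
# discrimination of a risk score; all data below are invented.

def harrell_c(time, event, risk):
    """Fraction of usable pairs where the higher-risk subject fails first."""
    conc = ties = usable = 0
    n = len(time)
    for i in range(n):
        for j in range(i + 1, n):
            # order the pair so subject a has the earlier follow-up time
            a, b = (i, j) if time[i] < time[j] else (j, i)
            if not event[a]:
                continue          # earlier subject censored: pair unusable
            usable += 1
            if risk[a] > risk[b]:
                conc += 1
            elif risk[a] == risk[b]:
                ties += 1
    return (conc + 0.5 * ties) / usable

time  = [5, 12, 8, 20, 3]              # follow-up years
event = [1, 0, 1, 0, 1]                # 1 = event, 0 = censored
risk  = [0.8, 0.2, 0.3, 0.1, 0.4]      # model risk score
print(f"C statistic: {harrell_c(time, event, risk):.2f}")   # prints 0.89
```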

  7. Planning and clinical studies of a commercial orthopedic metal artifact reduction tool for CT simulations for head-and-neck radiotherapy

    International Nuclear Information System (INIS)

    Kon, Hyuck Jun; Ye, Sung Joon; Kim, Jung In; Park, Jong Min; Lee, Jae Gi; Heo, Tae Min; Kim, Kyung Su; Chun, Young Mi; Callahan, Zachariah

    2013-01-01

    In computed tomography (CT) images, the presence of high-Z materials induces typical streak artifacts, called metal artifacts, which can pervert CT Hounsfield numbers in the reconstructed images. This artifact-induced distortion of CT images can affect dose calculations based on them. In radiation therapy of head-and-neck cancer, accurate CT images are important for dose calculation because of the concave-shaped target volumes, the complex anatomy, the many sensitive normal tissues, and the air cavity structures. However, dental implants are common in head-and-neck patients, so undistorted CT images are hard to obtain. Moreover, because dental implants generally appear in the same CT slice as air cavities such as the oral and nasal cavities, they can introduce considerable distortion. In this study, we focused on evaluating the distortion of air cavities by metal artifacts and the effectiveness of the commercial orthopedic metal artifact reduction function (O-MAR) for the metal artifacts induced by dental implants. The O-MAR algorithm increases the accuracy of CT Hounsfield numbers and reduces noise. Thus, it can contribute to the entire radiation treatment planning process, especially for contouring/segmentation. Although there was no significant difference in dose distributions for most cases, the O-MAR correction was shown to have an impact on high dose regions in air cavities.

  8. Planning and clinical studies of a commercial orthopedic metal artifact reduction tool for CT simulations for head-and-neck radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Kon, Hyuck Jun; Ye, Sung Joon [Interdisciplinary Program in Radiation Applied Life Science, Seoul National University Graduate School, Seoul (Korea, Republic of); Kim, Jung In; Park, Jong Min; Lee, Jae Gi; Heo, Tae Min [Institute of Radiation Medicine, Seoul National University Medical Research Center, Seoul (Korea, Republic of); Kim, Kyung Su [Dept. of Radiation Oncology, Seoul National University College of Medicine, Seoul (Korea, Republic of); Chun, Young Mi [Philips Healthcare Korea, Seoul (Korea, Republic of); Callahan, Zachariah [Program in Biomedical Radiation Sciences, Dept. of Transdisciplinary Studies, Graduate School of Convergence Science and Technology, Seoul National University, Seoul (Korea, Republic of)

    2013-11-15

    In computed tomography (CT) images, the presence of high-Z materials induces typical streak artifacts, called metal artifacts, which can pervert CT Hounsfield numbers in the reconstructed images. This artifact-induced distortion of CT images can affect dose calculations based on them. In radiation therapy of head-and-neck cancer, accurate CT images are important for dose calculation because of the concave-shaped target volumes, the complex anatomy, the many sensitive normal tissues, and the air cavity structures. However, dental implants are common in head-and-neck patients, so undistorted CT images are hard to obtain. Moreover, because dental implants generally appear in the same CT slice as air cavities such as the oral and nasal cavities, they can introduce considerable distortion. In this study, we focused on evaluating the distortion of air cavities by metal artifacts and the effectiveness of the commercial orthopedic metal artifact reduction function (O-MAR) for the metal artifacts induced by dental implants. The O-MAR algorithm increases the accuracy of CT Hounsfield numbers and reduces noise. Thus, it can contribute to the entire radiation treatment planning process, especially for contouring/segmentation. Although there was no significant difference in dose distributions for most cases, the O-MAR correction was shown to have an impact on high dose regions in air cavities.

  9. Development of an assessment tool to measure students' perceptions of respiratory care education programs: Item generation, item reduction, and preliminary validation

    Directory of Open Access Journals (Sweden)

    Ghazi Alotaibi

    2013-01-01

    Full Text Available Objectives: Students who perceived their learning environment positively are more likely to develop effective learning strategies and adopt a deep learning approach. Currently, there is no validated instrument for measuring the educational environment of educational programs on respiratory care (RC). The aim of this study was to develop an instrument to measure students' perception of the RC educational environment. Materials and Methods: Based on the literature review and an assessment of content validity by multiple focus groups of RC educationalists, potential items of the instrument relevant to the RC educational environment construct were generated by the research group. The initial 71-item questionnaire was then field-tested on all students from the 3 RC programs in Saudi Arabia and was subjected to multi-trait scaling analysis. Cronbach's alpha was used to assess internal consistency reliabilities. Results: Two hundred and twelve students (100%) completed the survey. The initial instrument of 71 items was reduced to 65 across 5 scales. Convergent and discriminant validity assessment demonstrated that the majority of items correlated more highly with their intended scale than a competing one. Cronbach's alpha exceeded the standard criterion of >0.70 in all scales except one. There was no floor or ceiling effect for scale or overall score. Conclusions: This instrument is the first assessment tool developed to measure the RC educational environment. There was evidence of its good feasibility, validity, and reliability. This first validation of the instrument supports its use by RC students to evaluate the educational environment.

  10. Watching Stars Grow: The adaptation and creation of instructional material for the acquisition, reduction, and analysis of data using photometry tools at the WestRock Observatory.

    Science.gov (United States)

    O'Keeffe, Brendon; Johnson, Michael; Murphy Williams, Rosa Nina

    2018-06-01

    The WestRock observatory at Columbus State University provides laboratory and research opportunities to earth and space science students specializing in astrophysics and planetary geology. Through continuing improvements, the observatory has been expanding the types of research carried out by undergraduates. Photometric measurements are an essential tool for observational research, especially for objects of variable brightness.Using the American Association of Variable Star Observers (AAVSO) database, students choose variable star targets for observation. Students then perform observations to develop the ability to properly record, calibrate, and interpret the data. Results are then submitted to a large database of observations through the AAVSO.Standardized observation procedures will be developed in the form of manuals and instructional videos specific to the equipment housed in the WestRock Observatory. This procedure will be used by students conducting laboratory exercises and undergraduate research projects that utilize photometry. Such hands-on, direct observational experience will help to familiarize the students with observational techniques and contribute to an active dataset, which in turn will prepare them for future research in their field.In addition, this set of procedures and the data resulting from them will be used in the wider outreach programs of the WestRock Observatory, so that students and interested public nationwide can learn about both the process and importance of photometry in astronomical research.

  11. Poster — Thur Eve — 11: Validation of the orthopedic metallic artifact reduction tool for CT simulations at the Ottawa Hospital Cancer Centre

    International Nuclear Information System (INIS)

    Sutherland, J; Foottit, C

    2014-01-01

    Metallic implants in patients can produce image artifacts in kilovoltage CT simulation images, which can introduce noise and inaccuracies in CT number, affecting anatomical segmentation and dose distributions. The commercial orthopedic metal artifact reduction algorithm (O-MAR) (Philips Healthcare System) was recently made available on CT simulation scanners at our institution. This study validated the clinical use of O-MAR by investigating its effects on CT number and dose distributions. O-MAR corrected and uncorrected images were acquired with a Philips Brilliance Big Bore CT simulator of a cylindrical solid water phantom that contained various plugs (including metal) of known density. CT number accuracy was investigated by determining the mean and standard deviation in regions of interest (ROI) within each plug for uncorrected and O-MAR corrected images and comparing with no-metal image values. Dose distributions were calculated using the Monaco treatment planning system. Seven open fields were equally spaced about the phantom around an ROI near the center of the phantom. These were compared to a “correct” dose distribution calculated by overriding electron densities on a no-metal phantom image to produce an image containing metal but no artifacts. An overall improvement in CT number and dose distribution accuracy was achieved by applying the O-MAR correction. Mean CT numbers and standard deviations were found to be generally improved. Exceptions included lung-equivalent media, which is consistent with vendor-specified contraindications. Dose profiles were found to vary by ±4% between uncorrected and O-MAR corrected images, with O-MAR producing doses closer to ground truth.

  12. Poster — Thur Eve — 11: Validation of the orthopedic metallic artifact reduction tool for CT simulations at the Ottawa Hospital Cancer Centre

    Energy Technology Data Exchange (ETDEWEB)

    Sutherland, J; Foottit, C [The Ottawa Hospital Cancer Centre (Canada)

    2014-08-15

    Metallic implants in patients can produce image artifacts in kilovoltage CT simulation images, which can introduce noise and inaccuracies in CT number, affecting anatomical segmentation and dose distributions. The commercial orthopedic metal artifact reduction algorithm (O-MAR) (Philips Healthcare System) was recently made available on CT simulation scanners at our institution. This study validated the clinical use of O-MAR by investigating its effects on CT number and dose distributions. O-MAR corrected and uncorrected images were acquired with a Philips Brilliance Big Bore CT simulator of a cylindrical solid water phantom that contained various plugs (including metal) of known density. CT number accuracy was investigated by determining the mean and standard deviation in regions of interest (ROI) within each plug for uncorrected and O-MAR corrected images and comparing with no-metal image values. Dose distributions were calculated using the Monaco treatment planning system. Seven open fields were equally spaced about the phantom around an ROI near the center of the phantom. These were compared to a “correct” dose distribution calculated by overriding electron densities on a no-metal phantom image to produce an image containing metal but no artifacts. An overall improvement in CT number and dose distribution accuracy was achieved by applying the O-MAR correction. Mean CT numbers and standard deviations were found to be generally improved. Exceptions included lung-equivalent media, which is consistent with vendor-specified contraindications. Dose profiles were found to vary by ±4% between uncorrected and O-MAR corrected images, with O-MAR producing doses closer to ground truth.

  13. Estimating Treatment Effects from Contaminated Multi-Period Education Experiments: The Dynamic Impacts of Class Size Reductions. NBER Working Paper No. 15200

    Science.gov (United States)

    Ding, Weili; Lehrer, Steven F.

    2009-01-01

    This paper introduces an empirical strategy to estimate dynamic treatment effects in randomized trials that provide treatment in multiple stages and in which various noncompliance problems arise, such as attrition and selective transitions between treatment and control groups. Our approach is applied to the highly influential four-year randomized…

  14. EFSA Panel on Biological Hazards (BIOHAZ); Scientific Opinion on an estimation of the public health impact of setting a new target for the reduction of Salmonella in turkeys

    DEFF Research Database (Denmark)

    Hald, Tine

    The quantitative contribution of turkeys and other major animal-food sources to the burden of human salmonellosis in the European Union was estimated. A ‘Turkey Target Salmonella Attribution Model’ (TT-SAM) based on the microbial-subtyping approach was used. TT-SAM includes data from 25 EU Member States, four animal-food sources of Salmonella and 23 Salmonella serovars. The model employs 2010 EU statutory monitoring data on Salmonella in animal populations (EU baseline survey data for pigs), data on reported cases of human salmonellosis and food availability data. It estimates that 2.6 %, 10.6 %, 17.0 % and 56.8 % of the human salmonellosis cases are attributable to turkeys, broilers, laying hens (eggs) and pigs, respectively. The top-6 serovars of fattening turkeys that contribute most to human cases are S. Enteritidis, S. Kentucky, S. Typhimurium, S. Newport, S. Virchow and S. Saintpaul...

  15. Comparative estimation of soil and plant pollution in the impact area of air emissions from an aluminium plant after technogenic load reduction.

    Science.gov (United States)

    Evdokimova, Galina A; Mozgova, Natalya P

    2015-01-01

    The work provides a comparative analysis of changes in soil properties over the last 10-13 years along the pollution gradient of air emissions from the Kandalaksha aluminium plant, following the reduction in emission volumes. The content of the priority pollutant fluorine (F) in atmospheric precipitation and in the organic soil horizon of the plant impact zone decreased significantly in 2011-2013 compared to 2001. Aluminium concentrations decreased only in the immediate proximity of the plant (2 km). Fluorine, calcium (Ca) and magnesium (Mg) concentrations are higher in the liquid phase than in the solid phase, so these elements can migrate greater distances from the pollution source (up to 15-20 km). Silicon (Si), aluminium (Al), iron (Fe) and phosphorus (P) are found only in solid phases and in fall-out within 5 km. The acidity of the soil litter decreased by 2 pH units in the proximity of the plant, within 2 km. The zone of maximum soil contamination shrank from 2.5 km to 1.5 km from the emission source, and the zones of heavy and moderate pollution shrank by 5 km, in connection with the reduction of pollutant emissions at the plant. A high correlation between the fluorine concentrations in vegetation and litter was found. Higher fluorine concentrations in the soil result in its accumulation in plants. Mosses accumulate fluorine most intensively.

  16. Phantom measurements and computed estimates of breast dose with radiotherapy for Hodgkin's lymphoma: dose reduction with the use of the involved field

    International Nuclear Information System (INIS)

    Wirth, A.; Kron, T.; Sorell, G.; Cramb, J.; Wittwer, H.; Sullivan, K.

    2008-01-01

    Full text: The risk of breast cancer following radiotherapy for Hodgkin's lymphoma appears to be dose related. In this study we compared breast dose in an anthropomorphic phantom for conventional 'mantle', upper mediastinal/bilateral neck ('minimantle') and unilateral neck fields, and evaluated the accuracy of computer-planned dose estimates for out-of-field doses. For each field, computer-planned breast dose (CPD) estimates were compared with thermoluminescence dosimetry measurements at five locations within 'breast tissue'. CPD were also compared with ion chamber measurements in a slab phantom. Measured dose and CPD were within 20% of each other up to approximately 10 cm from the field edge. Beyond 10 cm, the CPD underestimated dose by a factor of 2 or more. The minimantle reduced the breast dose by a factor of approximately 10 compared with the mantle treatment. Treating the neck field alone lowered the breast dose by a further 50% or more. Modern involved-field radiotherapy for lymphoma substantially reduces breast dose compared with mantle fields. Computer dosimetry underestimated dose at larger distances from the field. This needs to be considered if computer dosimetry is used to estimate breast dose and, by extrapolation, breast cancer risk.

  17. Using learning curves on energy-efficient technologies to estimate future energy savings and emission reduction potentials in the U.S. iron and steel industry

    Energy Technology Data Exchange (ETDEWEB)

    Karali, Nihan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Park, Won Young [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McNeil, Michael A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-06-18

    Increasing concerns about non-sustainable energy use and climate change spur a growing research interest in energy efficiency potentials in critical areas such as industrial production. This paper focuses on learning-curve aspects of energy efficiency measures in the U.S. iron and steel sector. A number of early-stage efficient technologies (i.e., emerging or demonstration technologies) are technically feasible and have the potential to make a significant contribution to energy saving and CO2 emissions reduction, but are not yet economical enough to be adopted. They may, however, achieve significant cost reduction and/or performance improvement in the future under learning effects such as ‘learning-by-doing’. The investigation is carried out using ISEEM, a technology-oriented, linear optimization model. We investigated how steel demand is balanced with and without learning-curve effects, compared to a Reference scenario. The retrofit (or, in some cases, investment) costs of energy-efficient technologies decline in the scenario where the learning curve is applied. The analysis also addresses market penetration of energy-efficient technologies, energy saving, and CO2 emissions in the U.S. iron and steel sector with and without the learning impact. Accordingly, the study helps energy modelers to better represent the price barriers that limit diffusion of energy-efficiency technologies, better understand the market and learning system involved, predict achievable future learning rates more accurately, and project future savings from energy-efficiency technologies in the presence of learning. We conclude from our analysis that most of the existing energy efficiency technologies currently used in the U.S. iron and steel sector are cost effective. Penetration levels increase through the years even though there is no price reduction. However, demonstration technologies are not economically
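    The 'learning-by-doing' effect invoked in this record is conventionally modeled with a one-factor experience curve, in which unit cost falls by a fixed fraction with each doubling of cumulative production. A minimal sketch under that textbook assumption (not necessarily ISEEM's internal formulation); all parameter values are illustrative.

```python
import math

def learned_cost(c0, cum_prod, cum_prod0, learning_rate):
    """Unit cost after learning-by-doing: each doubling of cumulative
    production cuts cost by `learning_rate` (e.g. 0.15 for 15%)."""
    b = -math.log2(1.0 - learning_rate)   # learning exponent
    return c0 * (cum_prod / cum_prod0) ** (-b)

# Hypothetical retrofit costing $100/t at 1 Mt cumulative deployment,
# 15% learning rate, evaluated after two doublings (4 Mt):
print(learned_cost(100.0, 4e6, 1e6, 0.15))  # ~72.2 ($/t)
```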

  18. Estimation of Structure-Borne Noise Reduction Effect of Steel Railway Bridge Equipped with Floating Ladder Track and Floating Reinforced-Concrete Deck

    Science.gov (United States)

    Watanabe, Tsutomu; Sogabe, Masamichi; Asanuma, Kiyoshi; Wakui, Hajime

    A number of steel railway bridges have been constructed in Japan. The thin steel members used in these bridges tend to vibrate easily and generate structure-borne noise. Accordingly, the number of steel railway bridges constructed in urban areas is decreasing, from a viewpoint of environmental preservation. As a countermeasure against structure-borne noise generated from steel railway bridges, we have developed a new type of steel railway bridge equipped with a floating ladder track and a floating reinforced-concrete (RC) deck. Train-running experiments showed that the new steel railway bridge with this double floating system reduced the vibration velocity level at the main girder web by 10.5 dB(A) compared with a steel railway bridge with a directly fastened track. This reduction was achieved by the ladder track and RC deck being supported on resilient materials.

  19. Use of California biomass in the production of transportation-fuel oxygenates: Estimates for reduction in CO2 emissions and greenhouse gas potential on a life cycle basis

    International Nuclear Information System (INIS)

    Kadam, K. L.; Camobreco, V. J.; Glazebrook, B. E.

    1999-01-01

    A set of environmental flows associated with two disposal options for three types of California biomass - forest biomass, rice straw, chaparral - were studied over their life cycles, with emphasis on energy consumption and greenhouse gas emissions. The two options studied were: producing ethyl-tertiary-butyl ether (ETBE) from the biomass, versus burning the biomass and producing methyl-tertiary-butyl ether (MTBE) from natural gas. Results showed a lower (by 40 to 50 per cent) greenhouse effect impact, lower net values for carbon dioxide and fossil fuel energy consumption, and higher net values for renewable energy consumption for the ETBE option. Based on these results, the deployment of the biomass-to-ethanol ETBE option is recommended as the one that contributes most to the reduction of GHG emissions. 12 refs., 2 tabs., 5 figs

  20. FY 2000 report on the results of the development of the stand-by consumption power reduction technology. R and D of the stand-by consumption power reduction technology in relation to machine tools; 2000 nendo taikiji shohi denryoku sakugen gijutsu kaihatsu seika hokokusho. Kosaku kiki ni kanrensuru taikiji shohi denryoku sakugen gijutsu no kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    A method was studied for reducing the stand-by power consumption of machine tools while keeping present processing accuracy, processing efficiency and operability. The results were as follows. For the chip conveyor, stand-by power consumption was reduced by approximately 75% by reducing the motor capacity after reviewing the operating method and the structure. For the mist collector, a reduction of approximately 85% was confirmed by adopting an efficient operating method and a high-efficiency motor. For the oil controller, experiments were made on a frame thermal circulation system and a self-forced air cooling system; the temperature rise of these systems is lower than that of the existing controller, and power consumption could be reduced by about 14% and about 98%, respectively. For the coolant pump, the optimum inclination angle and necessary minimum flow rate of the oil pan were determined, and a reduction of 47% was confirmed. For the hydraulic pump, methods of substituting other power sources for oil pressure and of intermittent operation of the hydraulic unit were studied, and the effects of each were confirmed. Further studies covered the control disc, reduction in air consumption, etc. (NEDO)

  1. Estimating the Effect and Economic Impact of Absenteeism, Presenteeism, and Work Environment-Related Problems on Reductions in Productivity from a Managerial Perspective.

    Science.gov (United States)

    Strömberg, Carl; Aboagye, Emmanuel; Hagberg, Jan; Bergström, Gunnar; Lohela-Karlsson, Malin

    2017-09-01

    The aim of this study was to propose wage multipliers that can be used to estimate the costs of productivity loss for employers in economic evaluations, using detailed information from managers. Data were collected in a survey panel of 758 managers from different sectors of the labor market. Based on assumed scenarios of a period of absenteeism due to sickness, presenteeism and work environment-related problem episodes, and specified job characteristics (i.e., explanatory variables), managers assessed their impact on group productivity and cost (i.e., the dependent variable). In an ordered probit model, the extent of productivity loss resulting from job characteristics is predicted. The predicted values are used to derive wage multipliers based on the cost of productivity estimates provided by the managers. The results indicate that job characteristics (i.e., degree of time sensitivity of output, teamwork, or difficulty in replacing a worker) are linked to productivity loss as a result of health-related and work environment-related problems. The impact of impaired performance on productivity differs among various occupations. The mean wage multiplier is 1.97 for absenteeism, 1.70 for acute presenteeism, 1.54 for chronic presenteeism, and 1.72 for problems related to the work environment. This implies that the costs of health-related and work environment-related problems to organizations can exceed the worker's wage. The use of wage multipliers is recommended for calculating the cost of health-related and work environment-related productivity loss to properly account for actual costs. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
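    Applying the reported multipliers is simple arithmetic: the employer's cost of a lost-productivity episode is the wage bill for the lost time scaled by the multiplier. A minimal sketch with hypothetical wage and duration values; only the 1.97 multiplier comes from the record above.

```python
def productivity_loss_cost(daily_wage, days, multiplier):
    """Employer cost of a health-related productivity loss episode,
    valued with a wage multiplier as proposed in the study."""
    return daily_wage * days * multiplier

# Hypothetical example: 5 sick days at a $200 daily wage, using the mean
# absenteeism multiplier of 1.97 reported above.
print(productivity_loss_cost(200.0, 5, 1.97))  # 1970.0
```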

  2. Estimating travel reduction associated with the use of telemedicine by patients and healthcare professionals: proposal for quantitative synthesis in a systematic review

    Directory of Open Access Journals (Sweden)

    Bahaadinbeigy Kambiz

    2011-08-01

    Background: A major benefit offered by telemedicine is the avoidance of travel by patients, their carers and health care professionals. Unfortunately, there is very little published information about the extent of avoided travel. We propose to undertake a systematic review of literature which reports credible data on the reductions in travel associated with the use of telemedicine. Method: The conventional approach to quantitative synthesis of the results from multiple studies is to conduct a meta analysis. However, too much heterogeneity exists between available studies to allow a meaningful meta analysis of the avoided travel when telemedicine is used across all possible settings. We propose instead to consider all credible evidence on avoided travel through telemedicine by fitting a linear model which takes into account the relevant factors in the circumstances of the studies performed. We propose the use of stepwise multiple regression to identify which factors are significant. Discussion: Our proposed approach is illustrated by the example of teledermatology. In a preliminary review of the literature we found 20 studies in which the percentage of avoided travel through telemedicine could be inferred (a total of 5199 patients). The mean percentage avoided travel reported in the 12 store-and-forward studies was 43%. In the 7 real-time studies and in a single study with a hybrid technique, 70% of the patients avoided travel. A simplified model based on the modality of telemedicine employed (i.e. real-time or store-and-forward) explained 29% of the variance. The use of store-and-forward teledermatology alone was associated with 43% of avoided travel. The increase in the proportion of patients who avoided travel (25%) when real-time telemedicine was employed was significant (P = 0.014). Service planners can use this information to weigh up the costs and benefits of the two approaches.

  3. Estimating travel reduction associated with the use of telemedicine by patients and healthcare professionals: proposal for quantitative synthesis in a systematic review.

    Science.gov (United States)

    Wootton, Richard; Bahaadinbeigy, Kambiz; Hailey, David

    2011-08-08

    A major benefit offered by telemedicine is the avoidance of travel, by patients, their carers and health care professionals. Unfortunately, there is very little published information about the extent of avoided travel. We propose to undertake a systematic review of literature which reports credible data on the reductions in travel associated with the use of telemedicine. The conventional approach to quantitative synthesis of the results from multiple studies is to conduct a meta analysis. However, too much heterogeneity exists between available studies to allow a meaningful meta analysis of the avoided travel when telemedicine is used across all possible settings. We propose instead to consider all credible evidence on avoided travel through telemedicine by fitting a linear model which takes into account the relevant factors in the circumstances of the studies performed. We propose the use of stepwise multiple regression to identify which factors are significant. Our proposed approach is illustrated by the example of teledermatology. In a preliminary review of the literature we found 20 studies in which the percentage of avoided travel through telemedicine could be inferred (a total of 5199 patients). The mean percentage avoided travel reported in the 12 store-and-forward studies was 43%. In the 7 real-time studies and in a single study with a hybrid technique, 70% of the patients avoided travel. A simplified model based on the modality of telemedicine employed (i.e. real-time or store and forward) explained 29% of the variance. The use of store and forward teledermatology alone was associated with 43% of avoided travel. The increase in the proportion of patients who avoided travel (25%) when real-time telemedicine was employed was significant (P = 0.014). Service planners can use this information to weigh up the costs and benefits of the two approaches.
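    The "simplified model" described in both versions of this record is essentially a regression of study-level avoided-travel percentages on a modality indicator. A minimal least-squares sketch with hypothetical study data (not the 20 studies of the review):

```python
import numpy as np

# Hypothetical study-level data: modality 1 = real-time, 0 = store-and-forward;
# y = percentage of patients who avoided travel in each study.
modality = np.array([0, 0, 0, 0, 1, 1, 1])
avoided = np.array([40.0, 45.0, 42.0, 44.0, 68.0, 71.0, 70.0])

# Fit avoided% = b0 + b1 * modality by ordinary least squares.
X = np.column_stack([np.ones_like(avoided), modality])
coef, *_ = np.linalg.lstsq(X, avoided, rcond=None)
print(f"store-and-forward baseline: {coef[0]:.1f}%")
print(f"real-time increment:        {coef[1]:+.1f}%")
```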

  4. Reduction redux.

    Science.gov (United States)

    Shapiro, Lawrence

    2018-04-01

    Putnam's criticisms of the identity theory attack a straw man. Fodor's criticisms of reduction attack a straw man. Properly interpreted, Nagel offered a conception of reduction that captures everything a physicalist could want. I update Nagel, introducing the idea of overlap, and show why multiple realization poses no challenge to reduction so construed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Tanglegrams: A Reduction Tool for Mathematical Phylogenetics.

    Science.gov (United States)

    Matsen, Frederick A; Billey, Sara C; Kas, Arnold; Konvalinka, Matjaz

    2018-01-01

    Many discrete mathematics problems in phylogenetics are defined in terms of the relative labeling of pairs of leaf-labeled trees. These relative labelings are naturally formalized as tanglegrams, which have previously been an object of study in coevolutionary analysis. Although there has been considerable work on planar drawings of tanglegrams, they have not been fully explored as combinatorial objects until recently. In this paper, we describe how many discrete mathematical questions on trees "factor" through a problem on tanglegrams, and how understanding that factoring can simplify analysis. Depending on the problem, it may be useful to consider an unordered version of tanglegrams and/or their unrooted counterparts. For all of these definitions, we show how the isomorphism types of tanglegrams can be understood in terms of double cosets of the symmetric group, and we investigate their automorphisms. Understanding tanglegrams better will isolate the distinct problems on leaf-labeled pairs of trees and reveal natural symmetries of spaces associated with such problems.

  6. Model reduction tools for nonlinear structural dynamics

    NARCIS (Netherlands)

    Slaats, P.M.A.; Jongh, de J.; Sauren, A.A.H.J.

    1995-01-01

    Three mode types are proposed for reducing nonlinear dynamical system equations, resulting from finite element discretizations: tangent modes, modal derivatives, and newly added static modes. Tangent modes are obtained from an eigenvalue problem with a momentary tangent stiffness matrix. Their

  7. Active3 noise reduction

    International Nuclear Information System (INIS)

    Holzfuss, J.

    1996-01-01

    Noise reduction is a problem being encountered in a variety of applications, such as environmental noise cancellation, signal recovery and separation. Passive noise reduction is done with the help of absorbers. Active noise reduction includes the transmission of phase inverted signals for the cancellation. This paper is about a threefold active approach to noise reduction. It includes the separation of a combined source, which consists of both a noise and a signal part. With the help of interaction with the source by scanning it and recording its response, modeling as a nonlinear dynamical system is achieved. The analysis includes phase space analysis and global radial basis functions as tools for the prediction used in a subsequent cancellation procedure. Examples are given which include noise reduction of speech. copyright 1996 American Institute of Physics

  8. FY 2000 report on the results of the technology development of energy use reduction of machine tools. Development of dry cutting use abrasion resistant/lubricous coated tools; 2000 nendo energy shiyo gorika kosaku kikai nado gijutsu kaihatsu seika hokokusho. Dry sessakuyo taimamo junkatsusei hifuku kogu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    For the purpose of energy conservation and reduction of the environmental load of machine tools, dry cutting - cutting without the use of cutting oil - was studied, and the FY 2000 results were summarized. The study addressed abrasion-resistant/lubricous coated tools for dry cutting, coated with a composite film whose cutting life is only slightly shorter than that of existing tools used with coolant. In the survey of abrasion-resistant/lubricous films, it was found that, regarding adhesion to ultra-hard substrates, a DLC single-layer film consisting only of carbon showed the same excellent adhesion as films with intermediate layers. As to the synthesis of abrasion-resistant/lubricous films, a composite film (WC/C film) consisting of tungsten carbide (WC) and carbon (C) was synthesized using an arc ion plating device. The WC/C film is composed of W and C and has a structure in which W-rich and W-poor layers are alternately stacked at the nanometre level. Equipment necessary for the development of abrasion-resistant/lubricous films and film formation for drills were also studied. (NEDO)

  9. Integral Criticality Estimators in MCATK

    Energy Technology Data Exchange (ETDEWEB)

    Nolen, Steven Douglas [Los Alamos National Laboratory; Adams, Terry R. [Los Alamos National Laboratory; Sweezy, Jeremy Ed [Los Alamos National Laboratory

    2016-06-14

    The Monte Carlo Application ToolKit (MCATK) is a component-based software toolset for delivering customized particle transport solutions using the Monte Carlo method. Currently under development in the XCP Monte Carlo group at Los Alamos National Laboratory, the toolkit has the ability to estimate the k-eff and alpha eigenvalues for static geometries. This paper presents a description of the estimators and variance reduction techniques available in the toolkit and includes a preview of those slated for future releases. Along with the description of the underlying algorithms is a description of the available user inputs for controlling the iterations. The paper concludes with a comparison of the MCATK results with those provided by analytic solutions. The results match within expected statistical uncertainties and demonstrate MCATK's usefulness in estimating these important quantities.
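    For readers unfamiliar with Monte Carlo eigenvalue estimation, the crudest k-eff estimator simply tracks how the fission-source population changes between generations. A toy sketch with hypothetical generation sizes follows; MCATK's production estimators are considerably more sophisticated.

```python
import numpy as np

def keff_from_generations(neutrons_per_generation, skip=2):
    """Crude k-eff estimate from source-generation sizes in a Monte Carlo
    power iteration: k ~ ratio of successive generation populations."""
    n = np.asarray(neutrons_per_generation, dtype=float)
    ratios = n[1:] / n[:-1]
    active = ratios[skip:]              # discard un-converged generations
    return active.mean(), active.std(ddof=1) / np.sqrt(active.size)

# Hypothetical generation sizes from an eigenvalue iteration:
gens = [10000, 10400, 10150, 10230, 10310, 10280, 10350]
k, se = keff_from_generations(gens)
print(f"k-eff = {k:.4f} +/- {se:.4f}")
```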

  10. Noise Reduction, Atmospheric Pressure Admittance Estimation and Long-Period Component Extraction in Time-Varying Gravity Signals Using Ensemble Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    Linsong Wang

    2015-01-01

    Full Text Available Time-varying gravity signals, with their nonlinear, non-stationary and multi-scale characteristics, record the physical responses of various geodynamic processes and consist of a blend of signals with various periods and amplitudes, corresponding to numerous phenomena. Superconducting gravimeter (SG records are processed in this study using a multi-scale analytical method and corrected for known effects to reduce noise, to study geodynamic phenomena using their gravimetric signatures. Continuous SG (GWR-C032 gravity and barometric data are decomposed into a series of intrinsic mode functions (IMFs using the ensemble empirical mode decomposition (EEMD method, which is proposed to alleviate some unresolved issues (the mode mixing problem and the end effect of the empirical mode decomposition (EMD. Further analysis of the variously scaled signals is based on a dyadic filter bank of the IMFs. The results indicate that removing the high-frequency IMFs can reduce the natural and man-made noise in the data, which are caused by electronic device noise, Earth background noise and the residual effects of pre-processing. The atmospheric admittances based on frequency changes are estimated from the gravity and the atmospheric pressure IMFs in various frequency bands. These time- and frequency-dependent admittance values can be used effectively to improve the atmospheric correction. Using the EEMD method as a filter, the long-period IMFs are extracted from the SG time-varying gravity signals spanning 7 years. The resulting gravity residuals are well correlated with the gravity effect caused by the _ polar motion after correcting for atmospheric effects.

  11. Use of Consumer Acceptability as a Tool to Determine the Level of Sodium Reduction: A Case Study on Beef Soup Substituted With Potassium Chloride and Soy-Sauce Odor.

    Science.gov (United States)

    Lee, Cho Long; Lee, Soh Min; Kim, Kwang-Ok

    2015-11-01

    In this study, consumer acceptability was considered as a tool for reducing sodium rather than just as a final examination of the success of the substitution. The study consisted of 4 experimental steps. First, by gradually reducing the concentration of NaCl, the consumer rejection threshold (CRT) of NaCl in beef soup was determined. Then, the amount of KCl that can increase preference was examined in 2 low-sodium beef soups, with sodium concentrations slightly above or below the CRT. The relative saltiness of various KCl and NaCl/KCl mixtures was also measured. Finally, consumers evaluated acceptability and intensities of sensory characteristics for 9 beef soup samples that differed with respect to NaCl and/or KCl content, with or without the addition of a salty-congruent odor (soy-sauce odor). The results showed that in the "above CRT" system, the consumer acceptability and sensory profile of low-sodium beef soup substituted using KCl were similar to the control although saltiness was not fully recovered, whereas in the "below CRT" system, consumer acceptability was not recovered using KCl solely as a substitute. Potential for using a salty-congruent odor as a final touch to induce salty taste was observed; however, the results implied the importance of the odor having almost no artificialness and of its harmony with the final product when used as a strategy to substitute sodium. Overall, the results implied the importance of considering consumer acceptability when approaching sodium reduction, to better understand the potential of sodium substitutes and salty-congruent odors. Strategies attempting to reduce sodium content in food have mainly substituted sodium to the level that provides equivalent salty taste and then examined consumer liking. However, these approaches may fail to appeal to consumers. This study attempted to consider consumer acceptability as a tool for reducing sodium in beef soup substituted using

  12. Volume reduction by the incineration of the combustible radioactive solid samples from radioisotope usage at the utilization facility. Estimation of the distribution of low energy β-emitter using the imaging plate

    International Nuclear Information System (INIS)

    Yumoto, Yasuhiro; Hanafusa, Tadashi; Nagamatsu, Tomohiro; Okada, Shigeru

    1999-01-01

    We want to establish a system of volume reduction by incineration of the combustible radioactive solid wastes from radioisotope usage at the utilization facility. We have been performing experiments using an experimental incineration system to examine the distribution of radionuclides during incineration and to collect basic data. To reproduce realistic conditions for the incineration of low-level radioactive wastes in an experimental system, we adopted new incineration methods in this study. Low-level radioactive samples (LLRS) were set up in a stainless steel mesh container and incinerated at high temperature (over 800 °C) generated by two sets of high-calorie gas burners. The low-energy β-emitters 35S, 45Ca and 33P, and the high-energy β-emitter 32P, were used for the experiment. Their translocation percentages in exhaust air and dust were estimated using the Imaging Plate. The distribution of radionuclides during incineration was similar to that estimated by conventional methods in our study, and to that reported for incineration of liquid scintillation cocktail waste. We conclude that the use of Imaging Plates is a simple and reliable method for estimating the distribution of low-energy β-emitters in incineration gas and ash. (author)

  13. Mark-resight approach as a tool to estimate population size of one of the world’s smallest goose populations

    DEFF Research Database (Denmark)

    Clausen, Kevin Kuhlmann; Fælled, Casper Cæsar; Clausen, Preben

    2013-01-01

    The present study investigates the use of a mark–resight procedure to estimate total population size in a local goose population. Using colour-ring sightings of the increasingly scattered population of Light-bellied Brent Geese Branta bernicla hrota from their Danish staging areas, we estimate a total population size of 7845 birds (95% CI: 7252–8438). This is in good agreement with numbers obtained from total counts, emphasizing that this population, although steadily increasing, is still small compared with historic numbers.
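    In its simplest single-session form, a mark-resight estimate is a Lincoln-Petersen ratio; the study's actual procedure is more elaborate, but the bias-corrected (Chapman) estimator below, with hypothetical counts, conveys the idea.

```python
def lincoln_petersen(marked, seen, marked_seen):
    """Chapman's bias-corrected Lincoln-Petersen estimate of population size
    from a single mark-resight session."""
    return (marked + 1) * (seen + 1) / (marked_seen + 1) - 1

# Hypothetical numbers: 150 colour-ringed geese in the population, 800 birds
# checked for rings during resighting, 15 ringed birds among them.
print(round(lincoln_petersen(150, 800, 15)))  # ~7558
```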

  14. Prediction of toxicity and comparison of alternatives using WebTEST (Web-services Toxicity Estimation Software Tool)(Bled Slovenia)

    Science.gov (United States)

    A Java-based web service is being developed within the US EPA’s Chemistry Dashboard to provide real time estimates of toxicity values and physical properties. WebTEST can generate toxicity predictions directly from a simple URL which includes the endpoint, QSAR method, and ...

  15. Reduction Assessment of Agricultural Non-Point Source Pollutant Loading

    OpenAIRE

    Fu, YiCheng; Zang, Wenbin; Zhang, Jian; Wang, Hongtao; Zhang, Chunling; Shi, Wanli

    2018-01-01

    Non-point source (NPS) pollution has become a key factor affecting the watershed environment. With the development of technology, the application of models to control NPS pollution has become common practice for resource management and pollutant reduction control at the watershed scale in China. The SWAT (Soil and Water Assessment Tool) model is a semi-conceptual model, which was put forward to estimate pollutant production and the influences on water quantity-quality under different...

  16. Intervention dose estimation in health promotion programmes: a framework and a tool. Application to the diet and physical activity promotion PRALIMAP trial

    Directory of Open Access Journals (Sweden)

    Legrand Karine

    2012-09-01

    Background: Although the outcomes of health promotion and prevention programmes may depend on the level of intervention, studies and trials often fail to take it into account. The objective of this work was to develop a framework within which to consider the implementation of interventions, and to propose a tool with which to measure the quantity and the quality of activities, whether planned or not, relevant to the intervention under investigation. The framework and the tool were applied to data from the diet and physical activity promotion PRALIMAP trial. Methods: A framework allowing for calculation of an intervention dose in any health promotion programme was developed. A literature review revealed several relevant concepts that were considered in greater detail by a multidisciplinary working group. A method was devised with which to calculate the dose of intervention planned and actually received (programme-driven activities dose), as well as the amount of non-planned intervention (non-programme-driven activities dose). Results: Indicators cover the roles of all those involved (supervisors, anchor personnel as receivers and providers, targets), in each intervention-related group (IRG: the basic setting in which a given intervention is planned by the programme, which may differ in implementation level) and for every intervention period. All indicators are described in two domains (delivery, participation) and two declensions (quantity and quality). Application to PRALIMAP data revealed important inter- and intra-IRG variability in intervention dose. Conclusions: A literature analysis shows that the terminology in this area is not yet consolidated and that research is ongoing. The present work provides a methodological framework by specifying concepts, defining new constructs and developing multiple information-synthesis methods which must be introduced from the programme's conception. Application to PRALIMAP underlined the

  17. Reduction of robot base parameters

    Energy Technology Data Exchange (ETDEWEB)

    Vandanjon, P O [CEA Centre d` Etudes de Saclay, 91 - Gif-sur-Yvette (France). Dept. des Procedes et Systemes Avances; Gautier, M [Nantes Univ., 44 (France)

    1996-12-31

    This paper is a new step in the search for minimum dynamic parameters of robots. In spite of planning exciting trajectories and using base parameters, some parameters remain unidentifiable due to perturbation effects. In this paper, we propose methods to reduce the set of base parameters in order to obtain an essential set of parameters. This new set defines a simplified identification model which improves the noise immunity of the estimation process. It also contributes to reducing the computational burden of a simplified dynamic model. The proposed methods fall into two classes: methods which perform reduction and identification together, which come from the statistical field; and methods which reduce the model before identification using a priori information, which come from the numerical field, such as the QR factorization. Statistical tools and QR reduction are shown to be efficient and well suited to determining the essential parameters. They can be applied to open-loop or graph-structured rigid robots, as well as flexible-link robots. An application to the PUMA 560 robot is given. (authors). 9 refs., 4 tabs.

  18. Reduction of robot base parameters

    International Nuclear Information System (INIS)

    Vandanjon, P.O.

    1995-01-01

    This paper is a new step in the search for minimum dynamic parameters of robots. In spite of planning exciting trajectories and using base parameters, some parameters remain unidentifiable due to perturbation effects. In this paper, we propose methods to reduce the set of base parameters in order to obtain an essential set of parameters. This new set defines a simplified identification model which improves the noise immunity of the estimation process. It also contributes to reducing the computational burden of a simplified dynamic model. The proposed methods fall into two classes: methods which perform reduction and identification together, which come from the statistical field; and methods which reduce the model before identification using a priori information, which come from the numerical field, such as the QR factorization. Statistical tools and QR reduction are shown to be efficient and well suited to determining the essential parameters. They can be applied to open-loop or graph-structured rigid robots, as well as flexible-link robots. An application to the PUMA 560 robot is given. (authors). 9 refs., 4 tabs
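    The QR route mentioned in both versions of this record works because a rank-revealing pivoted QR of the observation matrix exposes which columns are numerically independent; those columns correspond to identifiable base parameters. An illustrative sketch follows (not the paper's implementation); the data are synthetic.

```python
import numpy as np
from scipy.linalg import qr

def base_parameter_columns(w, tol=1e-8):
    """Columns of the observation (regressor) matrix W that correspond to
    identifiable base parameters: columns whose pivot |R[i, i]| falls below
    tolerance in a rank-revealing QR are unidentifiable."""
    _, r, piv = qr(w, pivoting=True, mode='economic')
    diag = np.abs(np.diag(r))
    rank = int((diag > tol * diag[0]).sum())
    return sorted(piv[:rank])

# Synthetic 3-parameter example where column 2 = column 0 + column 1,
# so only two base parameters are identifiable.
w = np.random.default_rng(1).standard_normal((50, 2))
w = np.column_stack([w, w.sum(axis=1)])
print(base_parameter_columns(w))  # two of the three column indices
```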

  19. Can we predict addiction to opioid analgesics? A possible tool to estimate the risk of opioid addiction in patients with pain.

    Science.gov (United States)

    Skala, Katrin; Reichl, Lukas; Ilias, Wilfried; Likar, Rudolf; Grogl-Aringer, Gabriele; Wallner, Christina; Schlaff, Golda; Herrmann, Peter; Lesch, Otto; Walter, Henriette

    2013-01-01

    The use of opioid analgesics in the treatment of chronic pain conditions has long been controversial. They have been reported to be relatively safe when prescribed with caution, but a brief and valid instrument to estimate a person's risk of addiction is still missing. The aim of this study was to investigate a self-rating questionnaire allowing an estimation of a person's risk of addiction to opioid analgesics. Retrospective review. Four Austrian hospitals. Seven hundred forty-one patients were interviewed. Of these, 634 patients were affected with chronic pain while 107 patients had a history of opioid addiction. Patients were interviewed about alcohol and nicotine consumption and family history of psychiatric disorders. Attitudes towards medication and the origin of pain were examined. We asked patients with an opioid addiction and patients suffering from chronic pain to complete a short questionnaire intended to help screen for addiction potential. Compared to the patients suffering from chronic pain, patients with an opioid addiction significantly more often had alcohol- and nicotine-related pathologies and psychiatric comorbidity. A family history of mental illness and developmental problems were significantly more frequent in this group. Compared to those not addicted, those with an opioid addiction had significantly higher expectations concerning the potential of medication to change one's mental state; they thought that psychological factors might contribute to the pain they feel. The main limitation of this study is the use of a self-rating instrument which reduces objectivity and introduces the possibility of misreporting. Also, the 2 groups differ in number and are not homogeneous. We found differences in questionnaire responses between patients with an opioid addiction and patients suffering from chronic pain to be dependent upon the prevalence of current or former addiction, psychiatric history, attitudes towards medication, and ideas about the

  20. Wire-mesh capped deposition sensors: Novel passive tool for coarse fraction flux estimation of radon thoron progeny in indoor environments

    International Nuclear Information System (INIS)

    Mayya, Y.S.; Mishra, Rosaline; Prajith, Rama; Sapra, B.K.; Kushwaha, H.S.

    2010-01-01

    Deposition-based 222Rn and 220Rn progeny sensors act as unique, passive tools for determining long-time-averaged progeny deposition fluxes in the environment. The use of these deposition sensors as progeny concentration monitors was demonstrated in typical indoor environments as a conceptually superior alternative to gas-based indirect monitoring methods. In the present work, the dependency of these deposition monitors on various environmental parameters is minimized by capping the deposition sensor with a suitable wire mesh. These wire-mesh capped deposition sensors measure the coarse-fraction deposition flux, which is less dependent on changes in environmental parameters like ventilation rate and turbulence. The calibration of these wire-mesh capped coarse-fraction progeny sensors was carried out in laboratory-controlled experiments. The sensors were deployed both in indoor and in occupational environments having widely different ventilation rates. The obtained coarse-fraction deposition velocities were fairly constant in these environments, which further confirmed that the signal on the wire-mesh capped sensors shows the least dependency on changes in environmental parameters. This technique has the potential to serve as a passive particle sizer in the general context of nanoparticles, using progeny species as surrogates. On the whole, there is a strong case for developing a passive system that responds only to the coarse fraction, providing alternative tools for dosimetry and environmental fine-particle research. Research highlights: wire-mesh capped deposition sensors measure the coarse-fraction deposition flux; the coarse-fraction deposition flux is less dependent on environmental conditions; the wire-mesh capped deposition sensor can act as a passive particle sizer.

  1. Optimizing community case management strategies to achieve equitable reduction of childhood pneumonia mortality: An application of Equitable Impact Sensitive Tool (EQUIST) in five low- and middle-income countries.

    Science.gov (United States)

    Waters, Donald; Theodoratou, Evropi; Campbell, Harry; Rudan, Igor; Chopra, Mickey

    2012-12-01

    The aim of this study was to populate the Equitable Impact Sensitive Tool (EQUIST) framework with all necessary data and conduct the first implementation of EQUIST, studying the cost-effectiveness of community case management of childhood pneumonia in 5 low- and middle-income countries in relation to equity impact. Wealth-quintile-specific data were gathered or modelled for all contributory determinants of the EQUIST framework, namely: under-five mortality rate, cost of intervention, intervention effectiveness, current coverage of intervention and relative disease distribution. These were then combined statistically to calculate the final outcome of the EQUIST model for community case management of childhood pneumonia, US$ per life saved, under several different approaches to scaling up. The current 'mainstream' approach to scaling up interventions is never the most cost-effective. Community case management appears to strongly support an 'equity-promoting' approach to scaling up, displaying the highest levels of cost-effectiveness when interventions are targeted at the poorest quintile of each study country, although absolute cost differences vary by context. The relationship between cost-effectiveness and equity impact is complex, with many determinants to consider. One important way to increase intervention cost-effectiveness in poorer quintiles is to improve the efficiency and quality of delivery. More data are needed in all areas to increase the accuracy of EQUIST-based estimates.
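    Stripped of its equity machinery, the quintile-level arithmetic behind a "US$ per life saved" figure reduces to: deaths averted = deaths x disease share x intervention effectiveness x coverage gain, then cost divided by deaths averted. A minimal sketch of that simplified calculation with hypothetical inputs (the real EQUIST model has additional determinants):

```python
def cost_per_life_saved(u5_deaths, disease_share, effectiveness,
                        coverage_now, coverage_new, cost_of_scale_up):
    """Simplified EQUIST-style quintile calculation: deaths averted when an
    intervention's coverage rises, and the resulting cost per life saved."""
    averted = (u5_deaths * disease_share * effectiveness
               * (coverage_new - coverage_now))
    return cost_of_scale_up / averted

# Hypothetical poorest-quintile inputs: 20,000 under-five deaths, 15% due to
# pneumonia, 70% effective case management, coverage raised from 20% to 50%,
# scale-up costing US$ 1.5 million.
print(f"US$ {cost_per_life_saved(20000, 0.15, 0.70, 0.20, 0.50, 1.5e6):,.0f} per life saved")
```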

  2. Quantitative tools for addressing hospital readmissions

    Directory of Open Access Journals (Sweden)

    Lagoe Ronald J

    2012-11-01

    Background: Increased interest in health care cost containment is focusing attention on reduction of hospital readmissions. Major payors have already developed financial penalties for providers that generate excess readmissions. This subject has benefitted from the development of resources such as the Potentially Preventable Readmissions software, which has encouraged hospitals to renew efforts to improve these outcomes. The aim of this study was to describe quantitative tools such as definitions, risk estimation, and tracking of patients for reducing hospital readmissions. Findings: This study employed the Potentially Preventable Readmissions software to develop quantitative tools for addressing hospital readmissions. These tools included two definitions of readmissions that support identification and management of patients; analytical approaches for estimating the risk of readmission for individual patients by age, discharge status of the initial admission, and severity of illness; and patient-specific spreadsheets for tracking target populations and for evaluating the impact of interventions. Conclusions: The study demonstrated that quantitative tools, including definitions of readmissions, estimation of the risk of readmission, and patient-specific spreadsheets, can contribute to the improvement of patient outcomes in hospitals.

  3. The death of the Job plot, transparency, open science and online tools, uncertainty estimation methods and other developments in supramolecular chemistry data analysis.

    Science.gov (United States)

    Brynn Hibbert, D; Thordarson, Pall

    2016-10-25

    Data analysis is central to understanding phenomena in host-guest chemistry. We describe here recent developments in this field, starting with the revelation that the popular Job plot method is inappropriate for most problems in host-guest chemistry and that the focus should instead be on systematically fitting data and testing all reasonable binding models. We then discuss approaches for estimating uncertainties in binding studies, using case studies and simulations to highlight key issues. Related to this is the need for ready access to data and transparency in the methodology or software used, and we demonstrate an example webportal () that aims to address this issue. We conclude with a list of best-practice protocols for data analysis in supramolecular chemistry that could easily be translated to other related problems in chemistry, including measuring rate constants or drug IC50 values.
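    "Systematically fitting data" here means fitting the full binding isotherm instead of relying on a Job plot. A minimal sketch for a 1:1 host-guest NMR titration, where the bound-complex concentration follows from mass balance; the shift data and concentrations are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def delta_obs(g0, k, dmax, h0=1e-3):
    """Observed NMR shift for 1:1 host-guest binding (exact isotherm).
    g0: total guest conc. (M), k: Ka (M^-1), dmax: shift at saturation,
    h0: total host conc. (M). [HG] is the root of the mass-balance quadratic."""
    s = g0 + h0 + 1.0 / k
    hg = (s - np.sqrt(s ** 2 - 4.0 * h0 * g0)) / 2.0
    return dmax * hg / h0

# Hypothetical titration data (host held at 1 mM):
g0 = np.array([0.2e-3, 0.5e-3, 1e-3, 2e-3, 5e-3, 10e-3])
shift = np.array([0.048, 0.110, 0.190, 0.292, 0.404, 0.451])  # ppm

(k_fit, dmax_fit), cov = curve_fit(delta_obs, g0, shift, p0=[1000.0, 0.5])
k_err = np.sqrt(np.diag(cov))[0]
print(f"Ka = {k_fit:.0f} +/- {k_err:.0f} M^-1, dmax = {dmax_fit:.2f} ppm")
```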

  4. Estimation and optimization of flank wear and tool lifespan in finish turning of AISI 304 stainless steel using desirability function approach

    Directory of Open Access Journals (Sweden)

    Lakhdar Bouzid

    2018-10-01

    The wear of cutting tools remains a major obstacle: its effects are antagonistic not only to tool lifespan and productivity, but also to surface quality. The present work deals with machinability studies on flank wear, surface roughness and lifespan in finish turning of AISI 304 stainless steel using multilayer Ti(C,N)/Al2O3/TiN coated carbide inserts. The machining experiments are conducted based on response surface methodology (RSM). Combined effects of three cutting parameters, namely cutting speed, feed rate and cutting time, on the two performance outputs (i.e., VB and Ra), and combined effects of two cutting parameters, namely cutting speed and feed rate, on lifespan (T), are explored employing analysis of variance (ANOVA). The relationship between the variables and the technological parameters is determined using a quadratic regression model, and optimal cutting conditions for each performance level are established through desirability function approach (DFA) optimization. The results show that flank wear is influenced principally by cutting time and, at the second level, by cutting speed. Cutting time is also the dominant factor affecting workpiece surface roughness, followed by feed rate, while lifespan is influenced by cutting speed. The optimum levels of input parameters for composite desirability were found to be Vc1-f1-t1 for VB and Ra, and Vc1-f1 for T, with a maximum percentage error of 6.38%.
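    DFA optimization maps each response onto a 0-1 desirability and maximizes the geometric mean of the individual desirabilities. A minimal sketch of the Derringer formulation for two smaller-the-better responses such as VB and Ra; the bounds, weights and example values are illustrative, not the paper's.

```python
import numpy as np

def desirability_smaller_is_better(y, low, high, weight=1.0):
    """Derringer desirability for a smaller-the-better response:
    1 at or below `low`, 0 at or above `high`, a power curve in between."""
    d = np.clip((high - y) / (high - low), 0.0, 1.0)
    return d ** weight

def composite_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    ds = np.asarray(ds, dtype=float)
    return ds.prod() ** (1.0 / ds.size)

# Hypothetical responses at one cutting condition: VB = 0.12 mm, Ra = 0.9 um.
d_vb = desirability_smaller_is_better(0.12, 0.05, 0.30)
d_ra = desirability_smaller_is_better(0.90, 0.40, 2.00)
print(f"composite desirability = {composite_desirability([d_vb, d_ra]):.3f}")
```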

  5. Comparison of the near field/far field model and the advanced reach tool (ART) model V1.5: exposure estimates to benzene during parts washing with mineral spirits.

    Science.gov (United States)

    LeBlanc, Mallory; Allen, Joseph G; Herrick, Robert F; Stewart, James H

    2018-03-01

    The Advanced Reach Tool V1.5 (ART) is a mathematical model for occupational exposures conceptually based on, but implemented differently than, the "classic" Near Field/Far Field (NF/FF) exposure model. The NF/FF model conceptualizes two distinct exposure "zones": the near field, within approximately 1 m of the breathing zone, and the far field, consisting of the rest of the room in which the exposure occurs. ART has been reported to provide "realistic and reasonable worst case" estimates of the exposure distribution. In this study, benzene exposure during the use of a metal parts washer was modeled using ART V1.5 and compared to actual measured worker samples and to NF/FF model results from three previous studies. Next, the exposure concentrations expected to be exceeded 25%, 10% and 5% of the time for the exposure scenario were calculated using ART. Lastly, ART exposure estimates were compared with and without Bayesian adjustment. The modeled parts-washing benzene exposure scenario included distinct tasks, e.g. spraying, brushing, rinsing and soaking/drying. Because ART can directly incorporate specific tasks that are part of the exposure scenario, the present analysis identified each task's determinants of exposure and performance time, thus extending the work of the three previous studies, in which parts washing was modeled as a single event. The ART 50th percentile exposure estimate for benzene (0.425 ppm) more closely approximated the reported measured mean value of 0.50 ppm than the NF/FF model estimates of 0.33 ppm, 0.070 ppm or 0.2 ppm obtained from other modeling studies of this exposure scenario. The ART model with Bayesian analysis provided the closest estimate to the measured value (0.50 ppm). ART (with Bayesian adjustment) was then used to assess the 75th, 90th and 95th percentile exposures, predicting that on randomly selected days during this parts-washing exposure scenario, 25% of the benzene exposures would be above 0.70 ppm; 10
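    At steady state the classic NF/FF model reduces to two algebraic expressions: the far field sees the generation rate diluted by room ventilation, and the near field adds a local term governed by the air exchange rate beta between the two zones. A minimal sketch with hypothetical inputs (not the study's parameter values):

```python
def nf_ff_steady_state(g_mg_min, q_m3_min, beta_m3_min):
    """Steady-state two-zone (near-field/far-field) concentrations.
    g: contaminant generation rate (mg/min), q: room ventilation rate
    (m3/min), beta: near-field/far-field air exchange rate (m3/min)."""
    c_ff = g_mg_min / q_m3_min            # far-field concentration (mg/m3)
    c_nf = c_ff + g_mg_min / beta_m3_min  # near field adds the local term
    return c_nf, c_ff

# Hypothetical parts-washer inputs: G = 100 mg/min benzene, Q = 30 m3/min,
# beta = 5 m3/min.
c_nf, c_ff = nf_ff_steady_state(100.0, 30.0, 5.0)
print(f"NF = {c_nf:.1f} mg/m3, FF = {c_ff:.1f} mg/m3")
```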

  6. Wear-Induced Changes in FSW Tool Pin Profile: Effect of Process Parameters

    Science.gov (United States)

    Sahlot, Pankaj; Jha, Kaushal; Dey, G. K.; Arora, Amit

    2018-06-01

    Friction stir welding (FSW) of high melting point metallic (HMPM) materials has limited application due to tool wear and relatively short tool life. Tool wear changes the profile of the tool pin and adversely affects weld properties. A quantitative understanding of tool wear and the tool pin profile is crucial to developing the process for joining HMPM materials. Here we present a quantitative wear study of an H13 steel tool pin profile for FSW of CuCrZr alloy. The tool pin profile is analyzed at multiple traverse distances for welding with various tool rotational and traverse speeds. The results indicate that the measured wear depth is small near the pin root and increases significantly towards the tip. Near the pin tip, wear depth increases with increasing tool rotational speed, whereas near the pin root the change in wear depth is minimal. Wear depth also increases with decreasing tool traverse speed. Tool pin wear from the bottom results in pin length reduction, which is greater at higher tool rotational speeds and longer traverse distances. The changes in pin profile due to wear result in a root defect at long traverse distances. This quantitative understanding would be helpful to estimate tool wear and to optimize process parameters and tool pin shape during FSW of HMPM materials.

  7. Reduction corporoplasty.

    Science.gov (United States)

    Hakky, Tariq S; Martinez, Daniel; Yang, Christopher; Carrion, Rafael E

    2015-01-01

    Here we present the first video demonstration of reduction corporoplasty in the management of phallic disfigurement in a 17-year-old man with a history of sickle cell disease and priapism. Surgical management of aneurysmal dilation of the corpora has yet to be defined in the literature. We performed bilateral elliptical incisions over the lateral corpora as management of aneurysmal dilation of the corpora to correct phallic disfigurement. The patient tolerated the procedure well and had resolution of his corporal disfigurement. Reduction corporoplasty using bilateral lateral elliptical incisions in the management of aneurysmal dilation of the corpora is a safe and feasible operation in the management of phallic disfigurement.

  8. Estimated Visceral Adipose Tissue, but Not Body Mass Index, Is Associated with Reductions in Glomerular Filtration Rate Based on Cystatin C in the Early Stages of Chronic Kidney Disease

    Directory of Open Access Journals (Sweden)

    Ana Karina Teixeira da Cunha França

    2014-01-01

    Information on the association between obesity and the initial phases of chronic kidney disease (CKD) is still limited, principally regarding the influence of visceral adipose tissue. We investigated whether visceral adipose tissue is more associated with reductions in glomerular filtration rate (GFR) than total and abdominal obesity in hypertensive individuals with stage 1-2 CKD. A cross-sectional study was implemented involving 241 hypertensive patients undergoing treatment at a primary health care facility. GFR was estimated using equations based on creatinine and cystatin C levels. Explanatory variables included body mass index (BMI), waist circumference (WC), and estimated visceral adipose tissue (eVAT). The mean age was 59.6±9.2 years and 75.9% were female. According to BMI, 28.2% of subjects were obese. The prevalence of increased WC and eVAT was 63.9% and 58.5%, respectively. Assessment of GFR by BMI, WC and eVAT categories showed that only women with increased eVAT (≥150 cm2) had a lower mean GFR by the Larsson (P=0.016), Levey 2 (P=0.005) and Levey 3 (P=0.008) equations. The same result was not observed when the MDRD equation was employed. No association was found between BMI, WC, eVAT and GFR using only serum creatinine. In the early stages of CKD, increased eVAT in hypertensive women was associated with decreased GFR based on cystatin C.

  9. Development of Prediction Tool for Sound Absorption and Sound Insulation for Sound Proof Properties

    OpenAIRE

    Yoshio Kurosawa; Takao Yamaguchi

    2015-01-01

    High-frequency automotive interior noise above 500 Hz considerably affects automotive passenger comfort. To reduce this noise, sound insulation material is often laminated on body panels or interior trim panels. For more effective noise reduction, the sound reduction properties of this laminated structure need to be estimated. We have developed a new calculation tool that can roughly calculate the sound absorption and insulation properties of laminated structures and handy

  10. Peak-Temperature (Tp) estimates with Raman micro-spectroscopy on carbonaceous material (RSCM) as a tool for distinguishing tectometamorphic regimes in the Tauern Window (Eastern Alps, Austria)

    Science.gov (United States)

    Scharf, A.; Ziemann, M. A.; Handy, M. R.

    2012-04-01

    Raman micro-spectroscopy of CM in 201 samples from the eastern part of the Tauern Window reveals the overprinting of HP subduction metamorphism, post-nappe HT metamorphism and late orogenic crustal attenuation during exhumation. The following patterns in our CM data lend insight into this evolution, especially when considered in the context of the distribution of mineral parageneses, radiometric ages and structures in the Tauern Window: (1) a continuous increase in Tp (330-500°C) across nappe boundaries between two oceanic units (Valais, Piemont) in the NE part of the Tauern Window indicates that temperatures equilibrated after accretion and nappe stacking. The Tp gradient preserved in this area is ca. 10°C/km; (2) a higher Tp gradient (20-25°C/km) in the footwall of a major top-SE extensional shear zone affecting the same units at the E end of the Tauern Window reveals that the previously equilibrated Tp gradient was attenuated during doming and exhumation; (3) identical Tp estimates (500°C) - within error and for a given calibration (ref. below) - are recorded at the top and bottom of a moderately E-dipping basement nappe (Storz Nappe) within a foreland-dipping duplex (the Venediger Nappe Complex, VNC) forming the basement core of the Tauern Window. The Tp value at the top of this nappe occurs at the base of the attenuated Tp gradient described in (2), whereas the Tp at the bottom of the nappe is typical of the high Tp values (530-640°C) in the core of the duplex that is exposed in a post-nappe dome (Hochalm) in the SE part of the Tauern Window. We interpret Tp values (530°C) in samples from the central part of the Tauern Window that contain relict HP assemblages and are unaffected by doming as the maximum temperature of subduction-related metamorphism. Existing radiometric data in the area, as well as from related units in other parts of the Tauern Window, indicate that the thermal peak of HP metamorphism occurred at 38-40 Ma (Kurz et al. 2008, refs therein), followed by HT

  11. Paper-based microfluidic devices on the crime scene: A simple tool for rapid estimation of post-mortem interval using vitreous humour.

    Science.gov (United States)

    Garcia, Paulo T; Gabriel, Ellen F M; Pessôa, Gustavo S; Santos Júnior, Júlio C; Mollo Filho, Pedro C; Guidugli, Ruggero B F; Höehr, Nelci F; Arruda, Marco A Z; Coltro, Wendell K T

    2017-06-29

    This paper describes for the first time the use of paper-based analytical devices at crime scenes to estimate the post-mortem interval (PMI), based on the colorimetric determination of Fe2+ in vitreous humour (VH) samples. Experimental parameters such as the paper substrate, the microzone diameter, the sample volume and the 1,10-phenanthroline (o-phen) concentration were optimised in order to ensure the best analytical performance. Grade 1 CHR paper, a microzone diameter of 5 mm, a sample volume of 4 μL and an o-phen concentration of 0.05 mol/L were chosen as the optimum experimental conditions. A good linear response was observed for Fe2+ concentrations between 2 and 10 mg/L, and the calculated limit of detection (LOD) and limit of quantification (LOQ) were 0.3 and 0.9 mg/L, respectively. The specificity of the Fe2+ colorimetric response was tested in the presence of the main interfering agents and no significant differences were found. After selecting the ideal experimental conditions, four VH samples were investigated on paper-based devices. The Fe2+ concentration levels found for samples #1, #2, #3 and #4 were 0.5 ± 0.1, 0.7 ± 0.1, 1.2 ± 0.1 and 15.1 ± 0.1 mg/L, respectively. These values are in good agreement with those calculated by ICP-MS. It is important to note that the concentration levels measured using both techniques are proportional to the PMI. The limitation of the proposed analytical device is that it is restricted to a PMI greater than 1 day. The capability of providing an immediate answer about the PMI on the crime scene without any sophisticated instrumentation is a great achievement in modern instrumentation for forensic chemistry. The strategy proposed in this study could be helpful in many criminal investigations. Copyright © 2017 Elsevier B.V. All rights reserved.
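    The quantitation step behind such a device is a plain linear calibration: known Fe(II) standards define a signal-concentration line from which an unknown VH sample is read. A minimal sketch with hypothetical microzone intensities (not the paper's data):

```python
import numpy as np

# Hypothetical colorimetric calibration: mean colour intensity of the paper
# microzone vs Fe(II) standard concentration (mg/L), fitted linearly.
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
signal = np.array([0.110, 0.208, 0.315, 0.402, 0.512])

slope, intercept = np.polyfit(conc, signal, 1)

def fe_concentration(s):
    """Invert the calibration line for an unknown sample's signal."""
    return (s - intercept) / slope

print(f"{fe_concentration(0.26):.2f} mg/L")  # an unknown VH sample, ~5 mg/L
```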

  12. Omitted variable bias in crash reduction factors.

    Science.gov (United States)

    2015-09-01

    Transportation planners and traffic engineers are increasingly turning to crash reduction factors to evaluate changes in road : geometric and design features in order to reduce crashes. Crash reduction factors are typically estimated based on segment...

  13. Control Strategy Tool (CoST)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios, their effects on emissions and estimated costs. This tool...

  14. Reduction Corporoplasty

    Directory of Open Access Journals (Sweden)

    Tariq S. Hakky

    2015-04-01

    Objective: Here we present the first video demonstration of reduction corporoplasty in the management of phallic disfigurement in a 17-year-old man with a history of sickle cell disease and priapism. Introduction: Surgical management of aneurysmal dilation of the corpora has yet to be defined in the literature. Materials and Methods: We performed bilateral elliptical incisions over the lateral corpora as management of aneurysmal dilation of the corpora to correct phallic disfigurement. Results: The patient tolerated the procedure well and had resolution of his corporal disfigurement. Conclusions: Reduction corporoplasty using bilateral lateral elliptical incisions in the management of aneurysmal dilation of the corpora is a safe and feasible operation in the management of phallic disfigurement.

  15. Estimation of the Thickness and the Material Combination of the Thermal Stress Control Layer (TSCL) for the Stellite21 Hardfaced STD61 Hot Working Tool Steel Using Three-Dimensional Finite Element Analysis

    International Nuclear Information System (INIS)

    Park, Na-Ra; Ahn, Dong-Gyu; Oh, Jin-Woo

    2014-01-01

    Research on a thermal stress control layer (TSCL) has been undertaken to reduce residual stress and strain in the vicinity of the joined region between the hardfacing layer and the base part. The goal of this paper is to estimate the material combination and the thickness of the TSCL for the Stellite21-hardfaced STD61 hot working tool steel via three-dimensional finite element analysis (FEA). The TSCL is created as a combination of Stellite21 and STD61, with thickness ranging from 0.5 mm to 1.5 mm. The influence of the material combination and the thickness of the TSCL on the temperature, thermal stress and thermal strain distributions of the hardfaced part has been investigated. The investigation revealed that a proper material combination for the TSCL is 50 % Stellite21 and 50 % STD61, and its appropriate thickness is 1.0 mm

  16. Estimated potential of energy saving and reduction of the demand commercial buildings illumination; Potencial estimado de ahorro de energia y reduccion de la demanda en iluminacion de edificios comerciales

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Gomez, Victor Hugo; Morillon Galvez, David [Posgrado en Energetica de la DEPFI-UNAM, Mexico, D. F. (Mexico)

    1999-07-01

    In this paper the estimated energy saving potential in illumination, the energy end use and the technology used in commercial buildings of different uses are analyzed. The estimate is based on information from demonstration cases of the Fideicomiso para el Ahorro de Energia (FIDE); energy savings and demand reduction are presented for a sample of 29 buildings, including shopping malls, hospitals, schools, hotels, restaurants and public buildings in which energy saving programs have been carried out, with measures such as cleaning of the luminaries and their replacement with more efficient ones. The average saving obtained is 21.81%, in the following areas: illumination, air conditioning and others. In addition, in a sample of 4 buildings, it was observed that before the energy saving programs were applied, two of them did not comply with the norm NOM-007-ENER-1995 (electric power density in interior lighting systems, W/m{sup 2}), and afterwards they did comply with the values and criteria of the norm.

  17. Climate Action Planning Tool | NREL

    Science.gov (United States)

    NREL's Climate Action Planning Tool provides a quick, basic estimate of how various technology options can contribute to an overall climate action plan for your research campus.

  18. Snubber reduction

    International Nuclear Information System (INIS)

    Olson, D.E.; Singh, A.K.

    1986-01-01

    Many safety-related piping systems in nuclear power plants have been oversupported. Since snubbers make up a large percentage of the pipe supports or restraints used in a plant, a plant's snubber population is much larger than required to adequately restrain the piping. This has resulted in operating problems and unnecessary expenses for maintenance and inservice inspections (ISIs) of snubbers. This paper presents an overview of snubber reduction, including: the incentives for removing snubbers, a historical perspective on how piping became oversupported, why it is possible to remove snubbers, and the costs and benefits of doing so

  19. Experimental evaluation of tool run-out in micro milling

    Science.gov (United States)

    Attanasio, Aldo; Ceretti, Elisabetta

    2018-05-01

    This paper deals with the micro milling cutting process, focusing on tool run-out measurement. In fact, among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting edge phase measurement is based on force signal analysis. The developed procedure has been tested on data from micro milling experiments performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.
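
    The abstract lists the tool diameter and the machined channel width among the inputs. As a loudly simplified illustration of why those two measurements carry run-out information (a first-order geometric argument, not the paper's analytical model), radial run-out makes one cutting edge sweep a larger radius, so the slot comes out wider than the nominal tool diameter:

```python
# First-order geometric sketch (an assumption, not the paper's model):
# with radial run-out r, the longer cutting edge sweeps a radius D/2 + r,
# so the milled channel width is roughly W = D + 2r.
def runout_from_channel(tool_diameter_um: float, channel_width_um: float) -> float:
    """Estimate radial run-out (um) from the slot-width overcut."""
    return max(0.0, (channel_width_um - tool_diameter_um) / 2.0)

# Hypothetical micro mill: a 500 um tool cutting a 508 um wide channel.
print(runout_from_channel(500.0, 508.0))  # -> 4.0 um of radial run-out
```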

  20. Slope-Area Computation Program Graphical User Interface 1.0—A Preprocessing and Postprocessing Tool for Estimating Peak Flood Discharge Using the Slope-Area Method

    Science.gov (United States)

    Bradley, D. Nathan

    2012-01-01

    The slope-area method is a technique for estimating the peak discharge of a flood after the water has receded (Dalrymple and Benson, 1967). This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks (HWMs) on trees or buildings. These indicators of flood stage are combined with measurements of the cross-sectional geometry of the stream, estimates of channel roughness, and a mathematical model that balances the total energy of the flow between cross sections. This is in contrast to a “direct” measurement of discharge during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a gage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the stream gage, resulting in more accurate computation of high flows. The Slope-Area Computation program (SAC; Fulford, 1994) is an implementation of the slope-area method that computes a peak-discharge estimate from inputs of water-surface slope (from surveyed HWMs), channel geometry, and estimated channel roughness. SAC is a command line program written in Fortran that reads input data from a formatted text file and prints results to another formatted text file. Preparing the input file can be time-consuming and prone to errors. This document describes the SAC graphical user interface (GUI), a cross-platform “wrapper” application that prepares the SAC input file, executes the program, and helps the user interpret the output. The SAC GUI is an update and enhancement of the slope-area method (SAM; Hortness, 2004; Berenbrock, 1996), an earlier spreadsheet tool used to aid field personnel in the completion of a slope-area measurement. The SAC GUI reads survey data
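
    The hydraulic core of the slope-area method is Manning's equation. The single-cross-section sketch below shows only that core formula with hypothetical surveyed values; the full Dalrymple and Benson method balances energy between several cross sections, which is the bookkeeping SAC automates.

```python
import math

def manning_discharge(area_m2: float, wetted_perimeter_m: float,
                      slope: float, n: float) -> float:
    """Single-section Manning estimate Q = (1/n) * A * R^(2/3) * S^(1/2), SI units."""
    hydraulic_radius = area_m2 / wetted_perimeter_m
    return (1.0 / n) * area_m2 * hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope)

# Hypothetical survey: 85 m^2 flow area, 42 m wetted perimeter, water-surface
# slope 0.002 from high-water marks, and an estimated roughness n = 0.035.
print(f"{manning_discharge(85.0, 42.0, 0.002, 0.035):.0f} m^3/s")
```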

  1. Forensic surface metrology: tool mark evidence.

    Science.gov (United States)

    Gambino, Carol; McLaughlin, Patrick; Kuo, Loretta; Kammerman, Frani; Shenkin, Peter; Diaczuk, Peter; Petraco, Nicholas; Hamby, James; Petraco, Nicholas D K

    2011-01-01

    Over the last several decades, forensic examiners of impression evidence have come under scrutiny in the courtroom due to analysis methods that rely heavily on subjective morphological comparisons. Currently, there is no universally accepted system that generates numerical data to independently corroborate visual comparisons. Our research attempts to develop such a system for tool mark evidence, proposing a methodology that objectively evaluates the association of striated tool marks with the tools that generated them. In our study, 58 primer shear marks on 9 mm cartridge cases, fired from four Glock model 19 pistols, were collected using high-resolution white light confocal microscopy. The resulting three-dimensional surface topographies were filtered to extract all "waviness surfaces"-the essential "line" information that firearm and tool mark examiners view under a microscope. Extracted waviness profiles were processed with principal component analysis (PCA) for dimension reduction. Support vector machines (SVM) were used to make the profile-gun associations, and conformal prediction theory (CPT) for establishing confidence levels. At the 95% confidence level, CPT coupled with PCA-SVM yielded an empirical error rate of 3.5%. Complementary bootstrap-based estimates of the error rate were 0%, indicating that the error rate for the algorithmic procedure is likely to remain low on larger data sets. Finally, suggestions are made for practical courtroom application of CPT for assigning levels of confidence to SVM identifications of tool marks recorded with confocal microscopy. Copyright © 2011 Wiley Periodicals, Inc.
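
    The PCA-for-dimension-reduction plus SVM classification step can be sketched with scikit-learn. Everything below is a stand-in: the arrays are random surrogates for the 58 extracted waviness profiles, and the conformal-prediction confidence layer is omitted.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Surrogate data: 58 "waviness profiles" of 1200 points, labelled by gun 0..3.
rng = np.random.default_rng(0)
X = rng.normal(size=(58, 1200))
y = np.arange(58) % 4

# Dimension reduction with PCA followed by an SVM classifier, mirroring the
# PCA-SVM stage described above.
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="linear"))
print(cross_val_score(clf, X, y, cv=5).mean())  # chance level on random data
```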

  2. VBioindex: A Visual Tool to Estimate Biodiversity

    Directory of Open Access Journals (Sweden)

    Dong Su Yu

    2015-09-01

    Full Text Available Biological diversity, also known as biodiversity, is an important criterion for measuring the value of an ecosystem. As biodiversity is closely related to human welfare and quality of life, many efforts to restore and maintain the biodiversity of species have been made by government agencies and non-governmental organizations, thereby drawing a substantial amount of international attention. In the fields of biological research, biodiversity is widely measured using traditional statistical indices such as the Shannon-Wiener index, species richness, evenness, and relative dominance of species. However, some biologists and ecologists have difficulty using these indices because they require advanced mathematical knowledge and computational techniques. Therefore, we developed VBioindex, a user-friendly program that is capable of measuring the Shannon-Wiener index, species richness, evenness, and relative dominance. VBioindex provides an easy-to-use interface and visually represents the results in the form of a simple chart. In addition, VBioindex offers functions for long-term investigations of datasets using time-series analyses.
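
    The indices VBioindex reports are short formulas. The independent sketch below (not VBioindex's own code) computes them from a vector of per-species abundance counts:

```python
import numpy as np

def biodiversity_indices(counts):
    """Shannon-Wiener index, richness, Pielou evenness and relative dominance."""
    counts = np.asarray(counts, dtype=float)
    counts = counts[counts > 0]
    p = counts / counts.sum()
    shannon = -np.sum(p * np.log(p))         # H' = -sum(p_i * ln p_i)
    richness = len(counts)                   # S: number of observed species
    evenness = shannon / np.log(richness)    # J = H' / ln(S)
    dominance = p.max()                      # share of the most abundant species
    return shannon, richness, evenness, dominance

# Hypothetical survey: five species with abundances 55, 30, 10, 4 and 1.
print(biodiversity_indices([55, 30, 10, 4, 1]))
```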

  3. Radon reduction

    International Nuclear Information System (INIS)

    Hamilton, M.A.

    1990-01-01

    During a radon gas screening program, elevated levels of radon gas were detected in homes on Mackinac Island, Mich. Six homes on foundations with crawl spaces were selected for a research project aimed at reducing radon gas concentrations, which ranged from 12.9 to 82.3 pCi/l. Using isolation and ventilation techniques, and variations thereof, radon concentrations were reduced to less than 1 pCi/l. This paper reports that these reductions were achieved using 3.5 mil cross laminated or 10 mil high density polyethylene plastic as a barrier without sealing to the foundation or support piers, solid and/or perforated plastic pipe and mechanical fans. Wind turbines were found to be ineffective at reducing concentrations to acceptable levels. Homeowners themselves installed all materials

  4. Tool-specific performance of vibration-reducing gloves for attenuating fingers-transmitted vibration

    Science.gov (United States)

    Welcome, Daniel E.; Dong, Ren G.; Xu, Xueyan S.; Warren, Christopher; McDowell, Thomas W.

    2016-01-01

    BACKGROUND Fingers-transmitted vibration can cause vibration-induced white finger. The effectiveness of vibration-reducing (VR) gloves for reducing hand transmitted vibration to the fingers has not been sufficiently examined. OBJECTIVE The objective of this study is to examine tool-specific performance of VR gloves for reducing finger-transmitted vibrations in three orthogonal directions (3D) from powered hand tools. METHODS A transfer function method was used to estimate the tool-specific effectiveness of four typical VR gloves. The transfer functions of the VR glove fingers in three directions were either measured in this study or during a previous study using a 3D laser vibrometer. More than seventy vibration spectra of various tools or machines were used in the estimations. RESULTS When assessed based on frequency-weighted acceleration, the gloves provided little vibration reduction. In some cases, the gloves amplified the vibration by more than 10%, especially the neoprene glove. However, the neoprene glove did the best when the assessment was based on unweighted acceleration. The neoprene glove was able to reduce the vibration by 10% or more of the unweighted vibration for 27 out of the 79 tools. If the dominant vibration of a tool handle or workpiece was in the shear direction relative to the fingers, as observed in the operation of needle scalers, hammer chisels, and bucking bars, the gloves did not reduce the vibration but increased it. CONCLUSIONS This study confirmed that the effectiveness for reducing vibration varied with the gloves and the vibration reduction of each glove depended on tool, vibration direction to the fingers, and finger location. VR gloves, including certified anti-vibration gloves do not provide much vibration reduction when judged based on frequency-weighted acceleration. However, some of the VR gloves can provide more than 10% reduction of the unweighted vibration for some tools or workpieces. Tools and gloves can be matched for
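
    The transfer-function logic of the study, and why weighted and unweighted assessments disagree, can be sketched in a few lines. All numbers below are placeholders: the band values stand in for a measured tool spectrum and glove transmissibility, and the weights only mimic the shape of the ISO 5349 Wh weighting, whose real values are not reproduced here.

```python
import numpy as np

# Hypothetical 1/3-octave bands: tool-handle acceleration, glove
# transmissibility |H|, and placeholder Wh-like frequency weights.
freq    = np.array([31.5, 63.0, 125.0, 250.0, 500.0, 1000.0])   # Hz
a_tool  = np.array([4.0,  6.0,  5.0,   3.0,   2.0,   1.0])      # m/s^2
h_glove = np.array([1.05, 0.98, 0.90,  0.75,  0.60,  0.50])
weights = np.array([1.0,  0.8,  0.5,   0.25,  0.12,  0.06])     # placeholders

a_finger = h_glove * a_tool            # spectrum transmitted through the glove

def total(a, w=None):
    """Root-sum-of-squares band total, optionally frequency-weighted."""
    a = a if w is None else w * a
    return np.sqrt(np.sum(a ** 2))

# Weighting emphasises the low bands, where gloves attenuate least, so the
# weighted assessment shows far less benefit than the unweighted one.
print(f"unweighted reduction: {1 - total(a_finger) / total(a_tool):.1%}")
print(f"weighted reduction:   {1 - total(a_finger, weights) / total(a_tool, weights):.1%}")
```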

  5. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools made a significant contribution to the great progress in development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A comfortable number of powerful simulation tools is available. The users have to choose the best suitable for their application. Here a simple rule applies: The best available simulation tool is the tool the user is already used to (provided, it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved—even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  6. Chatter and machine tools

    CERN Document Server

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  7. The development and discussion of computerized visual perception assessment tool for Chinese characters structures - Concurrent estimation of the overall ability and the domain ability in item response theory approach.

    Science.gov (United States)

    Wu, Huey-Min; Lin, Chin-Kai; Yang, Yu-Mao; Kuo, Bor-Chen

    2014-11-12

    Visual perception is the fundamental skill required for a child to recognize words, and to read and write. No visual perception assessment tool based on Chinese characters had been developed for preschool children in Taiwan. The purposes were to develop a computerized visual perception assessment tool for Chinese character structures and to explore the psychometric characteristics of the assessment tool. This study adopted purposive sampling. The study evaluated 551 kindergarten-age children (293 boys, 258 girls) ranging from 46 to 81 months of age. The test instrument used in this study consisted of three subtests and 58 items, including tests of basic strokes, single-component characters, and compound characters. Based on the results of model fit analysis, a higher-order item response theory model was used to estimate performance in visual perception, basic strokes, single-component characters, and compound characters simultaneously. Analyses of variance were used to detect significant differences between age groups and gender groups. The difficulty of identifying items in the visual perception test ranged from -2 to 1. The visual perception ability of 4- to 6-year-old children ranged from -1.66 to 2.19. Gender did not have significant effects on overall performance. However, there were significant differences among the different age groups. The performance of 6-year-olds was better than that of 5-year-olds, which was better than that of 4-year-olds. This study obtained detailed diagnostic scores by using a higher-order item response theory model to understand the visual perception of basic strokes, single-component characters, and compound characters. Further statistical analysis showed that, for basic strokes and compound characters, girls performed better than did boys; there also were differences within each age group. For single-component characters, there was no difference in performance between boys and girls. However, again the performance of 6-year-olds was better than
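
    The building block of any IRT-based scoring, including the higher-order model used here, is the item response function. A generic two-parameter logistic sketch (not the study's full higher-order model) shows how the ability and difficulty scales quoted above interact:

```python
import numpy as np

def p_correct(theta: float, a: float, b: float) -> float:
    """Two-parameter logistic item response function:
    P(correct | theta) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Item difficulties above span roughly -2..1 and abilities -1.66..2.19;
# discrimination a = 1.0 and difficulty b = -0.5 are illustrative values.
for theta in (-1.66, 0.0, 2.19):
    print(theta, round(p_correct(theta, a=1.0, b=-0.5), 3))
```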

  8. Determination of reduction yield of lithium metal reduction process

    International Nuclear Information System (INIS)

    Choi, In Kyu; Cho, Young Hwan; Kim, Taek Jin; Jee, Kwang Young

    2004-01-01

    Metal reduction of spent oxide fuel is the first step for the effective storage of spent fuel in Korea, as well as for the transmutation of long-lived radionuclides. During the reduction of uranium oxide to uranium metal by lithium metal, lithium oxide is stoichiometrically produced. By determining the concentration of lithium oxide in lithium chloride, we can estimate how much uranium oxide is converted to uranium metal. The previous method for determining the lithium oxide concentration in lithium chloride is tedious and time-consuming. This paper describes an on-line monitoring method for lithium oxide during the reduction process
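
    A hedged sketch of the stoichiometric bookkeeping: assuming the overall reaction UO2 + 4 Li -> U + 2 Li2O, every mole of uranium oxide reduced to metal leaves two moles of lithium oxide in the salt, so the measured Li2O concentration tracks the reduction yield. The masses below are hypothetical.

```python
# Approximate molar masses, g/mol.
M_UO2, M_LI2O = 270.03, 29.88

def reduction_yield(m_uo2_charged_g: float, m_li2o_measured_g: float) -> float:
    """Fraction of charged UO2 converted to U metal, inferred from Li2O.

    Assumes the overall reaction UO2 + 4 Li -> U + 2 Li2O, i.e. two moles
    of Li2O produced per mole of UO2 reduced.
    """
    mol_li2o = m_li2o_measured_g / M_LI2O
    mol_uo2_reduced = mol_li2o / 2.0
    return mol_uo2_reduced / (m_uo2_charged_g / M_UO2)

print(f"{reduction_yield(1000.0, 180.0):.1%}")  # hypothetical masses -> ~81%
```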

  9. Time improvement of photoelectric effect calculation for absorbed dose estimation

    International Nuclear Information System (INIS)

    Massa, J M; Wainschenker, R S; Doorn, J H; Caselli, E E

    2007-01-01

    Ionizing radiation therapy is a very useful tool in cancer treatment. It is very important to determine the absorbed dose in human tissue to accomplish an effective treatment. A mathematical model based on affected areas is the most suitable tool to estimate the absorbed dose. Lately, Monte Carlo based techniques have become the most reliable, but they are time-consuming. Absorbed dose calculation programs using different strategies have to choose between estimation quality and computing time. This paper describes an optimized method for calculating the photoelectron polar angle in the photoelectric effect, which is significant for estimating deposited energy in human tissue. In the case studies, the time cost reduction nearly reached 86%, meaning that the time needed for the calculation is approximately 1/7th of that of the non-optimized approach. This has been done keeping precision invariant

  10. Estimating rare events in biochemical systems using conditional sampling

    Science.gov (United States)

    Sundar, V. S.

    2017-01-01

    The paper focuses on the development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
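
    A generic, self-contained sketch of subset simulation in standard normal space is given below (in the spirit of the method, not the paper's chemical-kinetics implementation): the rare-event probability is accumulated as a product of conditional probabilities, each level regrown by a Metropolis chain restricted to the current intermediate region.

```python
import numpy as np

rng = np.random.default_rng(1)

def subset_simulation(g, d, p0=0.1, n=1000, max_levels=10):
    """Estimate P(g(x) > 0) for x ~ N(0, I_d) by subset simulation."""
    x = rng.normal(size=(n, d))
    gx = np.apply_along_axis(g, 1, x)
    prob = 1.0
    for _ in range(max_levels):
        b = np.quantile(gx, 1.0 - p0)      # intermediate threshold
        if b >= 0.0:                       # target event {g > 0} reached
            return prob * np.mean(gx > 0.0)
        prob *= p0                         # conditional probability factor
        seeds = x[gx > b]
        x_new, g_new = [], []
        while len(x_new) < n:
            xi = seeds[rng.integers(len(seeds))]
            cand = xi + 0.5 * rng.normal(size=d)
            # Metropolis accept w.r.t. the standard normal target, and
            # reject any move that leaves the conditional region {g > b}.
            if (np.log(rng.random()) < 0.5 * (xi @ xi - cand @ cand)
                    and g(cand) > b):
                xi = cand
            x_new.append(xi)
            g_new.append(g(xi))
        x, gx = np.array(x_new), np.array(g_new)
    return prob * np.mean(gx > 0.0)

# Toy rare event: P(X > 4) for X ~ N(0, 1); the exact value is about 3.2e-5.
print(subset_simulation(lambda v: v[0] - 4.0, d=1))
```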

  11. Modern contraceptive use, unmet need, and demand satisfied among women of reproductive age who are married or in a union in the focus countries of the Family Planning 2020 initiative: a systematic analysis using the Family Planning Estimation Tool.

    Science.gov (United States)

    Cahill, Niamh; Sonneveldt, Emily; Stover, John; Weinberger, Michelle; Williamson, Jessica; Wei, Chuchu; Brown, Win; Alkema, Leontine

    2018-03-03

    The London Summit on Family Planning in 2012 inspired the Family Planning 2020 (FP2020) initiative and the 120×20 goal of having an additional 120 million women and adolescent girls become users of modern contraceptives in 69 of the world's poorest countries by the year 2020. Working towards achieving 120 × 20 is crucial for ultimately achieving the Sustainable Development Goals of universal access and satisfying demand for reproductive health. Thus, a performance assessment is required to determine countries' progress. An updated version of the Family Planning Estimation Tool (FPET) was used to construct estimates and projections of the modern contraceptive prevalence rate (mCPR), unmet need for, and demand satisfied with modern methods of contraception among women of reproductive age who are married or in a union in the focus countries of the FP2020 initiative. We assessed current levels of family planning indicators and changes between 2012 and 2017. A counterfactual analysis was used to assess if recent levels of mCPR exceeded pre-FP2020 expectations. In 2017, the mCPR among women of reproductive age who are married or in a union in the FP2020 focus countries was 45·7% (95% uncertainty interval [UI] 42·4-49·1), unmet need for modern methods was 21·6% (19·7-23·9), and the demand satisfied with modern methods was 67·9% (64·4-71·1). Between 2012 and 2017 the number of women of reproductive age who are married or in a union who use modern methods increased by 28·8 million (95% UI 5·8-52·5). At the regional level, Asia has seen the mCPR among women of reproductive age who are married or in a union grow from 51·0% (95% UI 48·5-53·4) to 51·8% (47·3-56·5) between 2012 and 2017, which is slow growth, particularly when compared with a change from 23·9% (22·9-25·0) to 28·5% (26·8-30·2) across Africa. At the country level, based on a counterfactual analysis, we found that 61% of the countries that have made a commitment to FP2020 exceeded pre

  12. Dose Reduction Techniques

    International Nuclear Information System (INIS)

    WAGGONER, L.O.

    2000-01-01

    As radiation safety specialists, one of the things we are required to do is evaluate tools, equipment, materials and work practices and decide whether the use of these products or work practices will reduce radiation dose or risk to the environment. There is a tendency for many workers that work with radioactive material to accomplish radiological work the same way they have always done it rather than look for new technology or change their work practices. New technology is being developed all the time that can make radiological work easier and result in less radiation dose to the worker or reduce the possibility that contamination will be spread to the environment. As we discuss the various tools and techniques that reduce radiation dose, keep in mind that the radiological controls should be reasonable. We can not always get the dose to zero, so we must try to accomplish the work efficiently and cost-effectively. There are times we may have to accept there is only so much you can do. The goal is to do the smart things that protect the worker but do not hinder him while the task is being accomplished. In addition, we should not demand that large amounts of money be spent for equipment that has marginal value in order to save a few millirem. We have broken the handout into sections that should simplify the presentation. Time, distance, shielding, and source reduction are methods used to reduce dose and are covered in Part I on work execution. We then look at operational considerations, radiological design parameters, and discuss the characteristics of personnel who deal with ALARA. This handout should give you an overview of what it takes to have an effective dose reduction program

  13. Dose Reduction Techniques

    Energy Technology Data Exchange (ETDEWEB)

    WAGGONER, L.O.

    2000-05-16

    As radiation safety specialists, one of the things we are required to do is evaluate tools, equipment, materials and work practices and decide whether the use of these products or work practices will reduce radiation dose or risk to the environment. There is a tendency for many workers that work with radioactive material to accomplish radiological work the same way they have always done it rather than look for new technology or change their work practices. New technology is being developed all the time that can make radiological work easier and result in less radiation dose to the worker or reduce the possibility that contamination will be spread to the environment. As we discuss the various tools and techniques that reduce radiation dose, keep in mind that the radiological controls should be reasonable. We can not always get the dose to zero, so we must try to accomplish the work efficiently and cost-effectively. There are times we may have to accept there is only so much you can do. The goal is to do the smart things that protect the worker but do not hinder him while the task is being accomplished. In addition, we should not demand that large amounts of money be spent for equipment that has marginal value in order to save a few millirem. We have broken the handout into sections that should simplify the presentation. Time, distance, shielding, and source reduction are methods used to reduce dose and are covered in Part I on work execution. We then look at operational considerations, radiological design parameters, and discuss the characteristics of personnel who deal with ALARA. This handout should give you an overview of what it takes to have an effective dose reduction program.

  14. Authoring Tools

    Science.gov (United States)

    Treviranus, Jutta

    Authoring tools that are accessible and that enable authors to produce accessible Web content play a critical role in web accessibility. Widespread use of authoring tools that comply to the W3C Authoring Tool Accessibility Guidelines (ATAG) would ensure that even authors who are neither knowledgeable about nor particularly motivated to produce accessible content do so by default. The principles and techniques of ATAG are discussed. Some examples of accessible authoring tools are described including authoring tool content management components such as TinyMCE. Considerations for creating an accessible collaborative environment are also covered. As part of providing accessible content, the debate between system-based personal optimization and one universally accessible site configuration is presented. The issues and potential solutions to address the accessibility crisis presented by the advent of rich internet applications are outlined. This challenge must be met to ensure that a large segment of the population is able to participate in the move toward the web as a two-way communication mechanism.

  15. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    On designing a tool steel, its composition and heat treatment parameters are chosen to provide a hardened and tempered martensitic matrix in which carbides are evenly distributed. In this condition the matrix has an optimum combination of hardness and toughness, the primary carbides provide...... resistance against abrasive wear and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore, some steel types contain cobalt. Addition of alloying elements...... serves primarily two purposes: (i) to improve the hardenability and (ii) to provide harder and thermally more stable carbides than cementite. Assuming proper heat treatment, the properties of a tool steel depend on which alloying elements are added and their respective concentrations....

  16. Management Tools

    Science.gov (United States)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that can be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage and professional services.

  17. Professional liability insurance in Obstetrics and Gynaecology: estimate of the level of knowledge about malpractice insurance policies and definition of an informative tool for the management of the professional activity

    Directory of Open Access Journals (Sweden)

    Scurria Serena

    2011-12-01

    Full Text Available Abstract Background In recent years, due to the increasingly hostile environment in the medical malpractice field and related lawsuits in Italy, physicians began informing themselves regarding their comprehensive medical malpractice coverage. Methods In order to estimate the level of knowledge of medical professionals on liability insurance coverage for healthcare malpractice, a sample of 60 hospital health professionals of the obstetrics and gynaecology area of Messina (Sicily, Italy) was recruited. A survey was administered to evaluate their knowledge as to the meaning of professional liability insurance coverage and, above all, of the most frequent policy forms ("loss occurrence", "claims made" and "I-II risk"). Professionals were classified according to age and professional title, and descriptive statistics were calculated for all professional groups and answers. Results Most of the surveyed professionals were unaware or had very bad knowledge of the professional liability insurance coverage negotiated by the general manager, so most of the personnel believed it useful to subscribe to individual "private" policies. Several subjects declared they were aware of the possibility of obtaining extended coverage for gross negligence, and substantially all the surveyed had never seen the loss occurrence and claims made forms of the policy. Moreover, the sample was practically unaware of the related issues about insurance coverage for damages related to breaches of informed consent. The results revealed the relative lack of knowledge--among the operators in the field of obstetrics and gynaecology--of the effective coverage provided by the policies signed by the hospital managers for damages in medical malpractice. The authors thus proposed a useful information tool to help professionals working in obstetrics and gynaecology regarding aspects of insurance coverage provided on the basis of Italian civil law. Conclusion Italy must introduce a compulsory

  18. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    International Nuclear Information System (INIS)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic; Mollicone, Danilo; Federici, Sandro

    2008-01-01

    A common paradigm when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC) is that high uncertainties in input data-i.e., area change and C stock change/area-may seriously undermine the credibility of the estimates and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools-already existing in UNFCCC decisions and IPCC guidance documents-may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation

  19. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    Energy Technology Data Exchange (ETDEWEB)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic [Institute for Environment and Sustainability, Joint Research Centre of the European Commission, I-21020 Ispra (Italy); Mollicone, Danilo [Department of Geography, University of Alcala de Henares, Madrid (Spain); Federici, Sandro

    2008-07-15

    A common paradigm when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC) is that high uncertainties in input data-i.e., area change and C stock change/area-may seriously undermine the credibility of the estimates and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools-already existing in UNFCCC decisions and IPCC guidance documents-may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation.

  20. Design tools

    Science.gov (United States)

    Anton TenWolde; Mark T. Bomberg

    2009-01-01

    Overall, despite the lack of exact input data, the use of design tools, including models, is much superior to the simple following of rules of thumbs, and a moisture analysis should be standard procedure for any building envelope design. Exceptions can only be made for buildings in the same climate, similar occupancy, and similar envelope construction. This chapter...

  1. Twitter as a Potential Disaster Risk Reduction Tool. Part III: Evaluating Variables that Promoted Regional Twitter Use for At-risk Populations During the 2013 Hattiesburg F4 Tornado.

    Science.gov (United States)

    Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M; Subbarao, Italo

    2015-06-29

    The study goals were to identify the variables most commonly associated with successfully tweeted messages and to determine which variables have the most influence in promoting exponential dissemination of information (viral spreading of the message) and trending (becoming popular) in the disaster-affected region. Part II describes the detailed extraction and triangulation filtration methodological approach to acquiring Twitter data for the 2013 Hattiesburg Tornado. The data were then divided into two 48-hour windows before and after the tornado impact, with a 2-hour pre-tornado buffer to capture tweets just prior to impact. Criteria-based analysis was completed for tweets and users. The top 100 pre-tornado and post-tornado retweeted users were compared to establish the variability among the top retweeted users during the 4-day span. Pre-tornado variables that correlated with higher retweet rates included total user tweets (0.324) and total times a message was retweeted (0.530). Post-tornado variables that correlated with higher retweet rates included total hashtags in a retweet (0.538) and the hashtags #Tornado (0.378) and #Hattiesburg (0.254). Overall hashtag usage increased significantly during the storm: pre-storm there were 5,763 tweets with a hashtag, and post-storm there were 13,598. Twitter's unique features allow it to be considered a unique social media tool applicable for emergency managers and public health officials for rapid and accurate two-way communication. Additionally, understanding how variables can be properly manipulated plays a key role in understanding how to use this social media platform for effective, accurate, and rapid mass information communication.

  2. GumTree: Data reduction

    Energy Technology Data Exchange (ETDEWEB)

    Rayner, Hugh [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]. E-mail: hrz@ansto.gov.au; Hathaway, Paul [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Hauser, Nick [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Fei, Yang [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Franceschini, Ferdi [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Lam, Tony [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    2006-11-15

    Access to software tools for interactive data reduction, visualisation and analysis during a neutron scattering experiment enables instrument users to make informed decisions regarding the direction and success of their experiment. ANSTO aims to enhance the experiment experience of its facility's users by integrating these data reduction tools with the instrument control interface for immediate feedback. GumTree is a software framework and application designed to support an Integrated Scientific Experimental Environment, for concurrent access to instrument control, data acquisition, visualisation and analysis software. The Data Reduction and Analysis (DRA) module is a component of the GumTree framework that allows users to perform data reduction, correction and basic analysis within GumTree while an experiment is running. It is highly integrated with GumTree, able to pull experiment data and metadata directly from the instrument control and data acquisition components. The DRA itself uses components common to all instruments at the facility, providing a consistent interface. It features familiar ISAW-based 1D and 2D plotting, an OpenGL-based 3D plotter and peak fitting performed by fityk. This paper covers the benefits of integration, the flexibility of the DRA module, ease of use for the interface and audit trail generation.

  3. GumTree: Data reduction

    International Nuclear Information System (INIS)

    Rayner, Hugh; Hathaway, Paul; Hauser, Nick; Fei, Yang; Franceschini, Ferdi; Lam, Tony

    2006-01-01

    Access to software tools for interactive data reduction, visualisation and analysis during a neutron scattering experiment enables instrument users to make informed decisions regarding the direction and success of their experiment. ANSTO aims to enhance the experiment experience of its facility's users by integrating these data reduction tools with the instrument control interface for immediate feedback. GumTree is a software framework and application designed to support an Integrated Scientific Experimental Environment, for concurrent access to instrument control, data acquisition, visualisation and analysis software. The Data Reduction and Analysis (DRA) module is a component of the GumTree framework that allows users to perform data reduction, correction and basic analysis within GumTree while an experiment is running. It is highly integrated with GumTree, able to pull experiment data and metadata directly from the instrument control and data acquisition components. The DRA itself uses components common to all instruments at the facility, providing a consistent interface. It features familiar ISAW-based 1D and 2D plotting, an OpenGL-based 3D plotter and peak fitting performed by fityk. This paper covers the benefits of integration, the flexibility of the DRA module, ease of use for the interface and audit trail generation

  4. Health gain by salt reduction in europe: a modelling study.

    Directory of Open Access Journals (Sweden)

    Marieke A H Hendriksen

    Full Text Available Excessive salt intake is associated with hypertension and cardiovascular diseases. Salt intake exceeds the World Health Organization population nutrition goal of 5 grams per day in the European region. We assessed the health impact of salt reduction in nine European countries (Finland, France, Ireland, Italy, Netherlands, Poland, Spain, Sweden and United Kingdom). Through literature research we obtained current salt intake and systolic blood pressure levels of the nine countries. The population health modeling tool DYNAMO-HIA, including country-specific disease data, was used to predict the changes in prevalence of ischemic heart disease and stroke for each country, estimating the effect of salt reduction through its effect on blood pressure levels. A 30% salt reduction would reduce the prevalence of stroke by 6.4% in Finland up to 13.5% in Poland. Ischemic heart disease would be decreased by 4.1% in Finland up to 8.9% in Poland. If salt intake were reduced to the WHO population nutrient goal, the prevalence of stroke would be reduced by 10.1% in Finland up to 23.1% in Poland. Ischemic heart disease would decrease by 6.6% in Finland up to 15.5% in Poland. The number of postponed deaths would be 102,100 (0.9%) in France and 191,300 (2.3%) in Poland. A reduction of salt intake to 5 grams per day is expected to substantially reduce the burden of cardiovascular disease and mortality in several European countries.

  5. EFSA Panel on Biological Hazards (BIOHAZ); Scientific Opinion on a quantitative estimation of the public health impact of setting a new target for the reduction of Salmonella in broilers

    DEFF Research Database (Denmark)

    Hald, Tine

    This assessment relates the percentage of broiler-associated human salmonellosis cases to different Salmonella prevalences in broiler flocks in the European Union. It considers the contribution and relevance of different Salmonella serovars found in broilers to human salmonellosis. The model......-SAM model) employs data from the EU Baseline Surveys and EU statutory monitoring on Salmonella in animal-food sources, data on incidence of human salmonellosis and food availability data. It is estimated that around 2.4%, 65%, 28% and 4.5% of the human salmonellosis cases are attributable to broilers......, laying hens (eggs), pigs and turkeys respectively. Of the broiler-associated human salmonellosis cases, around 42% and 23% are estimated to be due to the serovars Salmonella Enteritidis and Salmonella Infantis respectively, while other serovars individually contributed less than 5%. Different scenarios

  6. Finding optimal exact reducts

    KAUST Repository

    AbouEisha, Hassan M.

    2014-01-01

    The problem of attribute reduction is an important problem related to feature selection and knowledge discovery. The problem of finding reducts with minimum cardinality is NP-hard. This paper suggests a new algorithm for finding exact reducts

  7. Reduction of soil erosion on forest roads

    Science.gov (United States)

    Edward R. Burroughs; John G. King

    1989-01-01

    Presents the expected reduction in surface erosion from selected treatments applied to forest road traveledways, cutslopes, fillslopes, and ditches. Estimated erosion reduction is expressed as functions of ground cover, slope gradient, and soil properties whenever possible. A procedure is provided to select rock riprap size for protection of the road ditch.

  8. Methane emission reduction: an application of FUND

    NARCIS (Netherlands)

    Tol, R.S.J.; Heintz, R.J.; Lammers, P.E.M.

    2003-01-01

    Methane is, after carbon dioxide, the most important anthropogenic greenhouse gas. Governments plan to abate methane emissions. A crude set of estimates of reduction costs is included in FUND, an integrated assessment model of climate change. In a cost-benefit analysis, methane emission reduction is

  9. Breast Reduction Surgery

    Science.gov (United States)

    ... considering breast reduction surgery, consult a board-certified plastic surgeon. It's important to understand what breast reduction surgery entails — including possible risks and complications — as ...

  10. Parameter Estimation

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Heitzig, Martina; Cameron, Ian

    2011-01-01

    In this chapter the importance of parameter estimation in model development is illustrated through various applications related to reaction systems. In particular, rate constants in a reaction system are obtained through parameter estimation methods. These approaches often require the application of optimisation techniques coupled with dynamic solution of the underlying model. Linear and nonlinear approaches to parameter estimation are investigated. There is also the application of maximum likelihood principles in the estimation of parameters, as well as the use of orthogonal collocation to generate a set of algebraic equations as the basis for parameter estimation. These approaches are illustrated using estimations of kinetic constants from reaction system models.
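
    As a minimal concrete instance of the estimation problems this chapter treats, the sketch below recovers a first-order rate constant from hypothetical batch-reactor data by nonlinear least squares (equivalent to maximum likelihood under Gaussian noise); the chapter's orthogonal-collocation machinery is not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical batch data for a first-order reaction A -> B, where
# C_A(t) = C_A0 * exp(-k * t); the noisy values were generated with k ~ 0.3.
t = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])        # time, h
c = np.array([1.00, 0.74, 0.55, 0.30, 0.17, 0.09])  # concentration, mol/L

def model(t, k, c0):
    return c0 * np.exp(-k * t)

# Nonlinear least-squares estimate of the rate constant and initial concentration.
(k_hat, c0_hat), cov = curve_fit(model, t, c, p0=[0.1, 1.0])
k_se = np.sqrt(np.diag(cov))[0]
print(f"k = {k_hat:.3f} +/- {k_se:.3f} 1/h, C_A0 = {c0_hat:.3f} mol/L")
```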

  11. Los Alamos Waste Management Cost Estimation Model

    International Nuclear Information System (INIS)

    Matysiak, L.M.; Burns, M.L.

    1994-03-01

    This final report completes the Los Alamos Waste Management Cost Estimation Project, and includes the documentation of the waste management processes at Los Alamos National Laboratory (LANL) for hazardous, mixed, low-level radioactive solid and transuranic waste, development of the cost estimation model and a user reference manual. The ultimate goal of this effort was to develop an estimate of the life cycle costs for the aforementioned waste types. The Cost Estimation Model is a tool that can be used to calculate the costs of waste management at LANL for the aforementioned waste types, under several different scenarios. Each waste category at LANL is managed in a separate fashion, according to Department of Energy requirements and state and federal regulations. The cost of the waste management process for each waste category has not previously been well documented. In particular, the costs associated with the handling, treatment and storage of the waste have not been well understood. It is anticipated that greater knowledge of these costs will encourage waste generators at the Laboratory to apply waste minimization techniques to current operations. Expected benefits of waste minimization are a reduction in waste volume, decrease in liability and lower waste management costs

  12. Model reduction of parametrized systems

    CERN Document Server

    Ohlberger, Mario; Patera, Anthony; Rozza, Gianluigi; Urban, Karsten

    2017-01-01

    The special volume offers a global guide to new concepts and approaches concerning the following topics: reduced basis methods, proper orthogonal decomposition, proper generalized decomposition, approximation theory related to model reduction, learning theory and compressed sensing, stochastic and high-dimensional problems, system-theoretic methods, nonlinear model reduction, reduction of coupled problems/multiphysics, optimization and optimal control, state estimation and control, reduced order models and domain decomposition methods, Krylov-subspace and interpolatory methods, and applications to real industrial and complex problems. The book represents the state of the art in the development of reduced order methods. It contains contributions from internationally respected experts, guaranteeing a wide range of expertise and topics. Further, it reflects an important effort, carried out over the last 12 years, to build a growing research community in this field. Though not a textbook, some of the chapters ca...

  13. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been...

  14. Dietary protein reduction on microbial protein, amino acids digestibility, and body retention in beef cattle. I. Digestibility sites and ruminal synthesis estimated by purine bases and 15N as markers.

    Science.gov (United States)

    Mariz, Lays Débora Silva; Amaral, Paloma de Melo; Valadares Filho, Sebastião de Campos; Santos, Stefanie Alvarenga; Marcondes, Marcos Inácio; Prados, Laura Franco; Carneiro Pacheco, Marcos Vinícius; Zanetti, Diego; de Castro Menezes, Gustavo Chamon; Faciola, Antonio P

    2018-06-04

    The objectives of this study were to evaluate the effect of reducing dietary CP contents on 1) total and partial nutrient digestion and nitrogen balance and 2) microbial crude protein (MCP) synthesis and true MCP digestibility in the small intestine obtained with 15N and purine bases (PB) in beef cattle. Eight bulls (4 Nellore and 4 Crossbred Angus × Nellore) cannulated in the rumen and ileum were distributed in duplicated 4 × 4 Latin squares. The diets consisted of increasing CP contents: 100, 120, or 140 g CP/kg DM offered ad libitum, and a restricted intake (RI) diet with 120 g CP/kg DM. The experiment lasted four 17-d periods, with 10 d for adaptation to diets and another 7 for data collection. Omasal digesta flow was obtained using Co-EDTA and indigestible NDF (iNDF) as markers, and to estimate ileal digesta flow only iNDF was used. From days 11 to 17 of each experimental period, ruminal infusions of Co-EDTA (5.0 g/d) and 15N (7.03 g of ammonium sulfate enriched with 10% of 15N atoms) were performed. There was no effect of CP contents (linear effect, P = 0.55 and quadratic effect, P = 0.11) on ruminal OM digestibility. Intake of CP increased linearly with dietary CP content. Microbial efficiency (relative to ruminally degradable OM and true ruminally degradable OM) had a quadratic tendency (P = 0.07 and P = 0.08, respectively) with increasing CP and was numerically greatest at 120 g CP/kg DM. The adjusted equations for estimating true intestinal digestibility of MCP (Y1) and total CP (Y2) were, respectively, as follows: Y1 = -16.724 (SEM = 40.06) + 0.86X (SEM = 0.05) and Y2 = -43.81 (SEM = 49.19) + 0.75X (SEM = 0.05). It was concluded that diets with 120 g/kg of CP optimize microbial synthesis and efficiency and ruminal ash and protein NDF digestibility, resulting in a better use of N compounds in the rumen. The PB technique can be used as an alternative to 15N to estimate microbial synthesis.

  15. Infinitary Combinatory Reduction Systems: Normalising Reduction Strategies

    NARCIS (Netherlands)

    Ketema, J.; Simonsen, Jakob Grue

    2010-01-01

    We study normalising reduction strategies for infinitary Combinatory Reduction Systems (iCRSs). We prove that all fair, outermost-fair, and needed-fair strategies are normalising for orthogonal, fully-extended iCRSs. These facts properly generalise a number of results on normalising strategies in

  16. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia; Harmandaris, Vagelis; Katsoulakis, Markos A.; Plechac, Petr

    2015-01-01

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics

  17. Tools for Managing Repository Objects

    OpenAIRE

    Banker, Rajiv D.; Isakowitz, Tomas; Kauffman, Robert J.; Kumar, Rachna; Zweig, Dani

    1993-01-01

    working Paper Series: STERN IS-93-46 The past few years have seen the introduction of repository-based computer aided software engineering (CASE) tools which may finally enable us to develop software which is reliable and affordable. With the new tools come new challenges for management: Repository-based CASE changes software development to such an extent that traditional approaches to estimation, performance, and productivity assessment may no longer suffice - if they ever...

  18. A primary estimation of PCDD/Fs release reduction from non-wood pulp and paper industry in China based on the investigation of pulp bleaching with chlorine converting to chlorine dioxide.

    Science.gov (United States)

    Xiao, Qingcong; Song, Xiaoqian; Li, Wenchao; Zhang, Yuanna; Wang, Hongchen

    2017-10-01

    Chlorine bleaching technology (C process, CEH process, H process and their combinations), which has been identified as a primary formation source of PCDD/Fs, is still widely used by the vast majority of Chinese non-wood pulp and paper mills (non-wood PMs). The purpose of this study was to provide information and data support for further eliminating dioxins from non-wood PMs in China, and especially to evaluate the PCDD/Fs release reduction for those mills converting their pulp bleaching processes from CEH to ECF. The PCDD/Fs concentrations of the bleached pulp and bleaching wastewater with ECF bleaching were in the ranges of 0.13-0.8 ng TEQ kg-1 and 0.15-1.9 pg TEQ L-1, respectively, which were far lower than those with the CEH process, indicating that the ECF process is an effective alternative bleaching technology to replace CEH in Chinese non-wood PMs to reduce dioxin release. The release factor via flue gas of the alkali recovery boiler in Chinese non-wood PMs was first reported to be 0.092 μg TEQ Ad t-1 in this study. On the assumption that the pulp bleaching processes of all Chinese non-wood PMs were converted from CEH to ECF, the annual release of PCDD/Fs via the bleaching wastewater and bleached pulp would be reduced by 79.1%, with a total of 1.60 g TEQ. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Dimension-reduction algorithm for DOA and polarization parameter estimation based on a polarization sensitive array

    Institute of Scientific and Technical Information of China (English)

    曾富红; 曲志昱; 司伟建

    2017-01-01

    To solve the problem that the polarization MUSIC algorithm has a large computational complexity, a rank-loss MUSIC algorithm suitable for polarization sensitive arrays is proposed. On the basis of the polarization MUSIC algorithm, the principle of matrix rank loss is used to reduce the dimensionality of the spectral function to a two-dimensional spectral function that depends only on the spatial parameters, which greatly decreases the computation required for the spectral peak search while preserving the direction of arrival (DOA) estimation accuracy. After the DOA of the incident signal is obtained, the polarization parameters can be calculated directly from a formula at low computational cost. Simulation results show that the rank-loss MUSIC algorithm has high estimation precision and better real-time performance than the polarization MUSIC algorithm; under the same incident signals containing polarization information, its computational complexity is about four orders of magnitude (10^4) lower.
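
    For orientation, the sketch below implements only classical narrowband MUSIC for a uniform linear array; the rank-loss algorithm described above additionally exploits a polarization-sensitive steering model and computes the polarization parameters in closed form after the 2-D search, which this sketch does not do.

```python
import numpy as np

rng = np.random.default_rng(0)

M, N = 8, 200                                   # sensors, snapshots
doas = np.deg2rad([-20.0, 35.0])                # true source directions

def steering(theta):
    """Steering vectors of a half-wavelength-spaced uniform linear array."""
    theta = np.atleast_1d(theta)
    return np.exp(1j * np.pi * np.arange(M)[:, None] * np.sin(theta))

# Two unit-power complex sources plus white noise.
A = steering(doas)
S = (rng.normal(size=(2, N)) + 1j * rng.normal(size=(2, N))) / np.sqrt(2)
X = A @ S + 0.1 * (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N)))

R = X @ X.conj().T / N                          # sample covariance
_, V = np.linalg.eigh(R)                        # eigenvalues in ascending order
En = V[:, : M - 2]                              # noise subspace for 2 sources

grid = np.deg2rad(np.linspace(-90.0, 90.0, 1801))
# MUSIC pseudo-spectrum 1 / ||En^H a(theta)||^2 peaks at the source DOAs.
pm = 1.0 / np.linalg.norm(En.conj().T @ steering(grid), axis=0) ** 2

# Report the two largest local maxima of the pseudo-spectrum.
local = np.where((pm[1:-1] > pm[:-2]) & (pm[1:-1] > pm[2:]))[0] + 1
top2 = local[np.argsort(pm[local])[-2:]]
print(sorted(np.rad2deg(grid[top2]).round(1)))  # close to [-20.0, 35.0]
```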

  20. Welfare Effects of Tariff Reduction Formulas

    DEFF Research Database (Denmark)

    Guldager, Jan G.; Schröder, Philipp J.H.

    WTO negotiations rely on tariff reduction formulas. It has been argued that formula approaches are of increasing importance in trade talks because of the large number of countries involved, the wider dispersion in initial tariffs (e.g. tariff peaks) and gaps between bound and applied tariff rates.... No single formula dominates for all conditions. The ranking of the three tools depends on the degree of product differentiation in the industry and the achieved reduction in the average tariff....
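
    The abstract does not spell out which formulas are compared, so as a hedged illustration only, here is the widely cited Swiss formula, one standard member of this family; it compresses tariff peaks toward the coefficient a:

        def swiss_formula(t_old, a=8.0):
            # New tariff under the Swiss formula; high initial tariffs are cut hardest.
            return a * t_old / (a + t_old)

        print([round(swiss_formula(t), 1) for t in (5.0, 15.0, 50.0, 100.0)])
        # -> [3.1, 5.2, 6.9, 7.4]: a 100% peak falls below 8%, a 5% tariff barely moves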

  1. Reduction in language testing

    DEFF Research Database (Denmark)

    Dimova, Slobodanka; Jensen, Christian

    2013-01-01

    This study represents an initial exploration of raters' comments and actual realisations of form reductions in L2 test speech performances. Performances of three L2 speakers were selected as case studies and illustrations of how reductions are evaluated by the raters. The analysis is based on audio/video recorded speech samples and written reports produced by two experienced raters after testing. Our findings suggest that reduction or reduction-like pronunciation features are found in tested L2 speech, but whenever raters identify and comment on such reductions, they tend to assess reductions negatively.

  2. Estimation of spectral kurtosis

    Science.gov (United States)

    Sutawanir

    2017-03-01

    Rolling bearings are among the most important elements in rotating machinery. Bearings frequently fall out of service for various reasons: heavy loads, unsuitable lubrication, ineffective sealing. Bearing faults may cause a decrease in performance. Analysis of bearing vibration signals has attracted attention in the field of monitoring and fault diagnosis, since these signals carry rich information for early detection of bearing failures. Spectral kurtosis, SK, is a frequency-domain parameter indicating how the impulsiveness of a signal varies with frequency. Faults in rolling bearings give rise to a series of short impulse responses as the rolling elements strike faults, making SK potentially useful for determining frequency bands dominated by bearing fault signals. SK can provide a measure of the distance of the analyzed bearing from a healthy one, and it supplies information beyond that given by the power spectral density (psd). This paper explores the estimation of spectral kurtosis using the short-time Fourier transform, known as the spectrogram. The estimation of SK is similar to the estimation of the psd: it is a model-free, plug-in estimator. Some numerical studies using simulations are discussed to support the methodology. Spectral kurtoses of some stationary signals are obtained analytically and used in the simulation study. Kurtosis in the time domain has been a popular tool for detecting non-normality; spectral kurtosis extends it to the frequency domain. The relationship between time-domain and frequency-domain analysis is established through the power spectrum-autocovariance Fourier-transform pair. The Fourier transform is the main tool for estimation in the frequency domain, with the power spectral density estimated through the periodogram. In this paper, the short-time Fourier transform estimate of the spectral kurtosis is reviewed, and a bearing fault (inner ring and outer ring) is simulated. The bearing response, power spectrum, and spectral kurtosis are plotted to
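
    A minimal spectrogram-based plug-in estimator follows the normalized fourth-moment definition SK(f) = E|X(t,f)|⁴ / (E|X(t,f)|²)² − 2, which is zero for stationary Gaussian noise. A sketch in Python using SciPy, with the window length left as a free tuning parameter:

        import numpy as np
        from scipy import signal

        def spectral_kurtosis(x, fs, nperseg=256):
            # Short-time Fourier transform of the vibration record
            f, _, Z = signal.stft(x, fs=fs, nperseg=nperseg)
            s2 = np.mean(np.abs(Z) ** 2, axis=1)   # second empirical spectral moment
            s4 = np.mean(np.abs(Z) ** 4, axis=1)   # fourth empirical spectral moment
            return f, s4 / s2 ** 2 - 2.0           # SK(f); large where impulses dominate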

  3. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Hahmann, Andrea N.; Nielsen, T. S.

    This poster describes the status as of April 2012 of the Public Service Obligation (PSO) funded project PSO 10464 "Integrated Wind Power Planning Tool". The project goal is to integrate a mesoscale numerical weather prediction (NWP) model with a statistical tool in order to better predict short-term power variation from offshore wind farms, as well as to conduct forecast error assessment studies in preparation for later implementation of such a feature in an existing simulation model. The addition of a forecast error estimation feature will further increase the value of this tool, as it...

  4. Semantic Mediation Tool for Risk Reduction, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project focuses on providing an infrastructure to aid the building of ontologies from existing NASA applications, in a manner that leads to long-term risk...

  5. Lean-Six Sigma: tools for rapid cycle cost reduction.

    Science.gov (United States)

    Caldwell, Chip

    2006-10-01

    Organizational costs can be grouped as process cost, cost of quality, and cost of poor quality. Providers should train managers in the theory and application of Lean-Six Sigma, including the seven categories of waste and how to remove them. Healthcare financial executives should work with managers in eliminating waste to improve service and reduce costs.

  6. Estimating Risk Parameters

    OpenAIRE

    Aswath Damodaran

    1999-01-01

    Over the last three decades, the capital asset pricing model has occupied a central and often controversial place in most corporate finance analysts’ tool chests. The model requires three inputs to compute expected returns – a riskfree rate, a beta for an asset and an expected risk premium for the market portfolio (over and above the riskfree rate). Betas are estimated, by most practitioners, by regressing returns on an asset against a stock index, with the slope of the regression being the b...
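
    The three-input computation is a one-liner once beta has been regressed; a minimal sketch with hypothetical monthly returns and assumed rate and premium values:

        import numpy as np

        r_asset = np.array([0.020, -0.010, 0.030, 0.015, -0.020, 0.040])   # hypothetical
        r_market = np.array([0.015, -0.005, 0.020, 0.010, -0.015, 0.030])  # hypothetical
        beta = np.polyfit(r_market, r_asset, 1)[0]   # slope of asset-on-market regression
        riskfree, premium = 0.04, 0.055              # assumed inputs
        expected_return = riskfree + beta * premium  # CAPM expected return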

  7. MCNP variance reduction overview

    International Nuclear Information System (INIS)

    Hendricks, J.S.; Booth, T.E.

    1985-01-01

    The MCNP code is rich in variance reduction features. Standard variance reduction methods found in most Monte Carlo codes are available as well as a number of methods unique to MCNP. We discuss the variance reduction features presently in MCNP as well as new ones under study for possible inclusion in future versions of the code

  8. Modern Reduction Methods

    CERN Document Server

    Andersson, Pher G

    2008-01-01

    With its comprehensive overview of modern reduction methods, this book features high quality contributions allowing readers to find reliable solutions quickly and easily. The monograph treats the reduction of carbonyls, alkenes, imines and alkynes, as well as reductive aminations and cross- and Heck couplings, before finishing off with sections on kinetic resolutions and hydrogenolysis. An indispensable lab companion for every chemist.

  9. Sulfate reduction in freshwater peatlands

    Energy Technology Data Exchange (ETDEWEB)

    Oequist, M.

    1996-12-31

    This text consists of two parts: Part A is a literature review on microbial sulfate reduction with emphasis on freshwater peatlands, and Part B presents the results from a study of the relative importance of sulfate reduction and methane formation for the anaerobic decomposition in a boreal peatland. The relative importance of sulfate reduction and methane production for the anaerobic decomposition was studied in a small raised bog situated in the boreal zone of southern Sweden. Depth distributions of sulfate reduction and methane production rates were measured in peat sampled from three sites (A, B, and C) forming a minerotrophic-ombrotrophic gradient. SO₄²⁻ concentrations in the three profiles were of equal magnitude and ranged from 50 to 150 μM. In contrast, rates of sulfate reduction were vastly different: maximum rates in the three profiles were obtained at a depth of ca. 20 cm below the water table; in A it was 8 μM h⁻¹, while in B and C they were 1 and 0.05 μM h⁻¹, respectively. Methane production rates, however, were more uniform across the three nutrient regimes: maximum rates in A (ca. 1.5 μg d⁻¹ g⁻¹) were found 10 cm below the water table, in B (ca. 1.0 μg d⁻¹ g⁻¹) in the vicinity of the water table, and in C (0.75 μg d⁻¹ g⁻¹) 20 cm below the water table. In all profiles both sulfate reduction and methane production rates were negligible above the water table. The areal estimates of methane production for the profiles were 22.4, 9.0 and 6.4 mmol m⁻² d⁻¹, while the estimates for sulfate reduction were 26.4, 2.5, and 0.1 mmol m⁻² d⁻¹, respectively. The calculated turnover times at the sites were 1.2, 14.2, and 198.7 days, respectively. The study shows that sulfate-reducing bacteria are important for the anaerobic degradation in the studied peatland, especially in the minerotrophic sites, while methanogenic bacteria dominate in ombrotrophic sites. Examination paper.
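
    The turnover times follow directly from dividing a standing sulfate pool by the areal reduction rate. A sketch with back-calculated pool sizes (assumed here, since depth-integrated pools are not quoted in the abstract) that approximately reproduce the reported values:

        pools_mmol_m2 = [32.0, 36.0, 20.0]    # assumed depth-integrated SO4 pools, mmol m-2
        rates_mmol_m2_d = [26.4, 2.5, 0.1]    # areal sulfate reduction rates from the study
        turnover_days = [p / r for p, r in zip(pools_mmol_m2, rates_mmol_m2_d)]
        # -> approx [1.2, 14.4, 200.0] days (study reports 1.2, 14.2, 198.7)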

  10. Sulfate reduction in freshwater peatlands

    International Nuclear Information System (INIS)

    Oequist, M.

    1996-01-01

    This text consists of two parts: Part A is a literature review on microbial sulfate reduction with emphasis on freshwater peatlands, and Part B presents the results from a study of the relative importance of sulfate reduction and methane formation for the anaerobic decomposition in a boreal peatland. The relative importance of sulfate reduction and methane production for the anaerobic decomposition was studied in a small raised bog situated in the boreal zone of southern Sweden. Depth distributions of sulfate reduction and methane production rates were measured in peat sampled from three sites (A, B, and C) forming a minerotrophic-ombrotrophic gradient. SO₄²⁻ concentrations in the three profiles were of equal magnitude and ranged from 50 to 150 μM. In contrast, rates of sulfate reduction were vastly different: maximum rates in the three profiles were obtained at a depth of ca. 20 cm below the water table; in A it was 8 μM h⁻¹, while in B and C they were 1 and 0.05 μM h⁻¹, respectively. Methane production rates, however, were more uniform across the three nutrient regimes: maximum rates in A (ca. 1.5 μg d⁻¹ g⁻¹) were found 10 cm below the water table, in B (ca. 1.0 μg d⁻¹ g⁻¹) in the vicinity of the water table, and in C (0.75 μg d⁻¹ g⁻¹) 20 cm below the water table. In all profiles both sulfate reduction and methane production rates were negligible above the water table. The areal estimates of methane production for the profiles were 22.4, 9.0 and 6.4 mmol m⁻² d⁻¹, while the estimates for sulfate reduction were 26.4, 2.5, and 0.1 mmol m⁻² d⁻¹, respectively. The calculated turnover times at the sites were 1.2, 14.2, and 198.7 days, respectively. The study shows that sulfate-reducing bacteria are important for the anaerobic degradation in the studied peatland, especially in the minerotrophic sites, while methanogenic bacteria dominate in ombrotrophic sites. Examination paper. 67 refs, 6 figs, 3 tabs

  11. REDUCTION CAPACITY OF SALTSTONE AND SALTSTONE COMPONENTS

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, K.; Kaplan, D.

    2009-11-30

    The duration that saltstone retains its ability to immobilize some key radionuclides, such as technetium (Tc), plutonium (Pu), and neptunium (Np), depends on its capacity to maintain a low redox status (or low oxidation state). The reduction capacity is a measure of the mass of reductants present in the saltstone; the reductants are the active ingredients that immobilize Tc, Pu, and Np. Once the reductants are exhausted, the saltstone loses its ability to immobilize these radionuclides. The reduction capacity values reported here are based on the Ce(IV)/Fe(II) system. The Portland cement (198 μeq/g) and especially the fly ash (299 μeq/g) had a measurable amount of reduction capacity, but the blast furnace slag (820 μeq/g) not surprisingly accounted for most of the reduction capacity. The blast furnace slag contains ferrous iron and sulfides, which are strong reducing and precipitating species for a large number of solids. Three saltstone samples containing 45% slag and one sample containing 90% slag had essentially the same reduction capacity as pure slag. There appears to be some critical concentration between 10% and 45% slag in the saltstone formulation that is needed to create the maximum reduction capacity. Values from this work supported those previously reported, namely that the reduction capacity of SRS saltstone is about 820 μeq/g; this value is recommended for estimating how long the Saltstone Disposal Facility will retain its ability to immobilize radionuclides.
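
    The critical-concentration observation is easiest to see against a naive mass-weighted mixing estimate. A sketch assuming an illustrative 45% slag / 45% fly ash / 10% cement dry blend (the actual formulation fractions are not given in this abstract):

        components = {                      # mass fraction (assumed), capacity in ueq/g
            "blast furnace slag": (0.45, 820.0),
            "fly ash":            (0.45, 299.0),
            "portland cement":    (0.10, 198.0),
        }
        linear_mix = sum(f * c for f, c in components.values())   # ~523 ueq/g
        # The measured 45%-slag saltstone was ~820 ueq/g, well above this linear
        # estimate -- consistent with the reported critical slag threshold.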

  12. Advanced pressure tube sampling tools

    International Nuclear Information System (INIS)

    Wittich, K.C.; King, J.M.

    2002-01-01

    Deuterium concentration is an important parameter that must be assessed to evaluate the fitness for service of CANDU pressure tubes. In-reactor pressure tube sampling allows accurate deuterium concentration assessments to be made without the expenses associated with fuel channel removal. This technology, which AECL has developed over the past fifteen years, has become the standard method for deuterium concentration assessment. AECL is developing a multi-head tool that would reduce in-reactor handling overhead by allowing one tool to sequentially sample at all four axial pressure tube locations before removal from the reactor. Four sets of independent cutting heads, like those on the existing sampling tools, facilitate this, incorporating proven technology demonstrated in over 1400 in-reactor samples taken to date. The multi-head tool is delivered by AECL's Advanced Delivery Machine or other similar delivery machines. Further, AECL has developed an automated sample handling system that receives and processes the tool once it is out of the reactor. This system retrieves samples from the tool; dries, weighs, and places them in labelled vials; and directs the vials into shielded shipping flasks. The multi-head wet sampling tool and the automated sample handling system are based on proven technology and offer continued savings and dose reduction to utilities in a competitive electricity market. (author)

  13. Use of remote sensing tools for severity analysis and greenhouse gases estimation in large forest fires. Case study of La Rufina forest fire, VI Region of L. G. B. O´Higgins, Chile

    Directory of Open Access Journals (Sweden)

    P. Vidal

    2017-12-01

    Wildfires destroy thousands of hectares of vegetation every year in Chile, a phenomenon that has steadily increased over time, both in terms of the number of fires and the area affected. From 1985 to 2016, 1,476 wildfires of severe intensity (> 200 ha) occurred, burning a total of about 1,243,407 ha of vegetation, an average of roughly 40,000 ha affected per year. Depending on the type and intensity of the fire, vegetation is affected at different levels of severity, a variation that is crucial for estimating the greenhouse gas (GHG) emissions from the event. The purpose of this research was to analyze the burn severity of the La Rufina wildfire of 1999, in the VI Region of L. G. B. O'Higgins in Chile, south of the capital Santiago, using Landsat 5 TM and Landsat 7 ETM+ imagery, and to include in the analysis an estimate of the greenhouse gases emitted in relation to the vegetation and burn severity. Burn severity was estimated through the differenced Normalized Burn Ratio (dNBR), and GHG emissions with the equation proposed by the IPCC in 2006, adjusted with the combustion efficiency coefficients proposed by De Santis et al. (2010). The results show that around 16,783 ha were affected by fires of different severity, and that native forest and tree plantations were affected with high severity. The tonnes of GHG emitted for each level of burn severity and type of vegetation were estimated, carbon dioxide (CO2) being the main GHG emitted to the atmosphere in the fire. The highest emissions occurred in grassland and scrubland areas burned with high severity, with values of 186 and 170 t/ha, respectively.
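
    The severity index itself is simple band algebra. A sketch computing NBR and dNBR from pre- and post-fire reflectance arrays; the band choices follow the usual Landsat TM/ETM+ convention (NIR band 4, SWIR band 7), and the breakpoints shown are commonly cited indicative values, not necessarily the thresholds used in this study:

        import numpy as np

        def nbr(nir, swir):
            # Normalized Burn Ratio from NIR and SWIR reflectance
            return (nir - swir) / (nir + swir + 1e-9)

        def burn_severity(nir_pre, swir_pre, nir_post, swir_post):
            d = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)   # dNBR
            # 0 = unburned/low ... 4 = high severity (indicative breakpoints)
            return d, np.digitize(d, [0.10, 0.27, 0.44, 0.66])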

  14. Application of AFINCH as a tool for evaluating the effects of streamflow-gaging-network size and composition on the accuracy and precision of streamflow estimates at ungaged locations in the southeast Lake Michigan hydrologic subregion

    Science.gov (United States)

    Koltun, G.F.; Holtschlag, David J.

    2010-01-01

    Bootstrapping techniques employing random subsampling were used with the AFINCH (Analysis of Flows In Networks of CHannels) model to gain insights into the effects of variation in streamflow-gaging-network size and composition on the accuracy and precision of streamflow estimates at ungaged locations in the 0405 (Southeast Lake Michigan) hydrologic subregion. AFINCH uses stepwise-regression techniques to estimate monthly water yields from catchments based on geospatial-climate and land-cover data in combination with available streamflow and water-use data. Calculations are performed on a hydrologic-subregion scale for each catchment and stream reach contained in a National Hydrography Dataset Plus (NHDPlus) subregion. Water yields from contributing catchments are multiplied by catchment areas and the resulting flow values are accumulated to compute streamflows in stream reaches, which are referred to as flow lines. AFINCH imposes constraints on water yields to ensure that observed streamflows are conserved at gaged locations. Data from the 0405 hydrologic subregion (referred to as Southeast Lake Michigan) were used for the analyses. Daily streamflow data were measured in the subregion for 1 or more years at a total of 75 streamflow-gaging stations during the analysis period, which spanned water years 1971–2003. The number of streamflow gages in operation each year during the analysis period ranged from 42 to 56 and averaged 47. Six sets (one set for each censoring level), each composed of 30 random subsets of the 75 streamflow gages, were created by censoring (removing) approximately 10, 20, 30, 40, 50, and 75 percent of the streamflow gages (the actual percentage of operating streamflow gages censored for each set varied from year to year, and within the year from subset to subset, but averaged approximately the indicated percentages). Streamflow estimates for six flow lines each were aggregated by censoring level, and results were analyzed to assess (a) how the
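
    The subsampling design described above is easy to reproduce. A sketch generating the six censoring levels with 30 random gage subsets each, using hypothetical gage identifiers:

        import random

        def censored_subsets(gages, censor_frac, n_subsets=30, seed=1):
            # Randomly remove a fraction of gages, repeated n_subsets times
            rng = random.Random(seed)
            keep = max(1, round(len(gages) * (1.0 - censor_frac)))
            return [sorted(rng.sample(gages, keep)) for _ in range(n_subsets)]

        gages = [f"G{i:02d}" for i in range(75)]        # hypothetical station IDs
        levels = (0.10, 0.20, 0.30, 0.40, 0.50, 0.75)
        subsets = {f: censored_subsets(gages, f) for f in levels}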

  15. LDRD report nonlinear model reduction

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, D.; Heinstein, M.

    1997-09-01

    The very general problem of model reduction of nonlinear systems was made tractable by focusing on the very large subclass consisting of linear subsystems connected by nonlinear interfaces. Such problems constitute a large part of the nonlinear structural problems encountered in addressing the Sandia missions. A synthesis approach to this class of problems was developed consisting of: detailed modeling of the interface mechanics; collapsing the interface simulation results into simple nonlinear interface models; constructing system models by assembling model approximations of the linear subsystems and the nonlinear interface models. These system models, though nonlinear, would have very few degrees of freedom. A paradigm problem, that of machine tool vibration, was selected for application of the reduction approach outlined above. Research results achieved along the way as well as the overall modeling of a specific machine tool have been very encouraging. In order to confirm the interface models resulting from simulation, it was necessary to develop techniques to deduce interface mechanics from experimental data collected from the overall nonlinear structure. A program to develop such techniques was also pursued with good success.

  16. A Modern Tool of Estimation for the Oil Industry: Mathematical Models of Cost

    Directory of Open Access Journals (Sweden)

    Fournier G.

    2006-11-01

    ... the costs and schedules of conception and manufacturing for the considered product. Now, technological evolution, which is faster and faster, often forbids the use of classical estimation methods. As a matter of fact, either they need too detailed a description of the activities necessary to design the product (analytical or detailed approach), or they make it necessary to dispose of too many neighbouring reference points (analogical approach). Therefore, it is fitting to dispose of a complementary tool. Mathematical models of cost have been developed in order to answer this need. They are based on a functional description and make use of "universal relationships" that originate in the following principle: the cost of an equipment is linked to its internal thermodynamics. In other words, other things being equal, it is all the more expensive as it manipulates a lot of energy per mass unit, or as it requires less mass to manipulate a given amount of energy. Practically, if one still makes reference to former products, it is not compulsory for the latter to be analogous with the one to be valorized; they are used for calibrating the "corporate culture" of the company. The purpose of the work broached at the Institut Français du Pétrole (IFP) is to show that this kind of method may be very useful for the petroleum industry. Two actual examples have been dealt with (the Packinox plate heat exchangers and automotive engines) and are the subject of the last part of this article. They follow a more theoretical discussion that presents the different approaches and emphasizes their respective advantages and drawbacks.
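
    The "internal thermodynamics" principle translates naturally into a power-law parametric model calibrated on a company's own past projects. A minimal sketch with invented calibration points, fitting cost = a · (specific energy)^b in log space:

        import numpy as np

        spec_energy = np.array([5.0, 12.0, 30.0, 80.0])   # kW per tonne handled (invented)
        cost = np.array([0.8, 1.6, 3.5, 8.0])             # relative cost units (invented)

        b, log_a = np.polyfit(np.log(spec_energy), np.log(cost), 1)   # log-log regression
        new_design_cost = np.exp(log_a) * 50.0 ** b       # estimate for a 50 kW/t design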

  17. Approximate zero-variance Monte Carlo estimation of Markovian unreliability

    International Nuclear Information System (INIS)

    Delcoux, J.L.; Labeau, P.E.; Devooght, J.

    1997-01-01

    Monte Carlo simulation has become an important tool for the estimation of reliability characteristics, since conventional numerical methods are no longer efficient when the size of the system to solve increases. However, evaluating by simulation the probability of occurrence of very rare events means playing a very large number of histories of the system, which leads to unacceptable computation times. Acceleration and variance reduction techniques have to be worked out. We show in this paper how to write the equations of Markovian reliability as a transport problem, and how the well-known zero-variance scheme can be adapted to this application. But such a method is always specific to the estimation of one quantity, while a Monte Carlo simulation allows one to perform simultaneous estimations of diverse quantities. Therefore, the estimation of one of them could be made more accurate while degrading at the same time the variance of other estimations. We propound here a method to reduce simultaneously the variance for several quantities, by using probability laws that would lead to zero variance in the estimation of a mean of these quantities. Just like
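
    The core idea of biasing the sampling laws toward zero variance can be shown with a scalar rare-event example; the sketch below estimates P(Z > 4.5) for standard normal Z by shifting the sampling density and reweighting with the likelihood ratio. This is a stand-in for the idea, not the paper's Markovian transport scheme:

        import numpy as np

        rng = np.random.default_rng(0)
        n, thresh = 100_000, 4.5

        # Sample from N(thresh, 1) so the rare region is hit often, then undo
        # the bias with the likelihood ratio phi(y) / phi_shifted(y).
        y = rng.normal(thresh, 1.0, n)
        weights = np.exp(-thresh * y + thresh ** 2 / 2.0)
        estimate = np.mean((y > thresh) * weights)   # ~3.4e-6, near the exact tail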