WorldWideScience

Sample records for reduction estimation tool

  1. Spreadsheet tool for estimating noise reduction costs

    International Nuclear Information System (INIS)

    Frank, L.; Senden, V.; Leszczynski, Y.

    2009-01-01

The Northeast Capital Industrial Association (NCIA) represents industry in Alberta's industrial heartland. The organization is in the process of developing a regional noise management plan (RNMP) for their member companies. The RNMP includes the development of a noise reduction cost spreadsheet tool that reviews the practical noise control treatments available for individual plant equipment, including the ranges of noise attenuation achievable, and produces a budgetary prediction of the installed cost of those treatments. This paper discussed the noise reduction cost spreadsheet tool, with particular reference to noise control best practices and to the tool's development, including prerequisites, assembly of the required data, approach, and the unit pricing database. Use and optimization of the noise reduction cost spreadsheet tool were also discussed. It was concluded that the noise reduction cost spreadsheet tool is an easy, interactive tool for estimating the implementation costs of different noise control strategies and mitigation options, and that it was very helpful in gaining insight for noise control planning purposes. 2 tabs.

  2. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g., attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
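The definition above (risk = probability of a successful attack × value of the resulting loss) supports a compact worked example of the cost-benefit comparison such a tool is meant to enable. The Python sketch below is illustrative only; every number in it is hypothetical.

```python
# Minimal sketch of the risk definition quoted above:
# risk = P(successful attack) x value of the resulting loss.
# All numbers are hypothetical illustrations, not values from the report.

def annual_risk(attacks_per_year: float, p_success: float, loss_usd: float) -> float:
    """Expected annual loss from cyber attack on a control system."""
    return attacks_per_year * p_success * loss_usd

baseline  = annual_risk(attacks_per_year=2.0, p_success=0.30, loss_usd=5_000_000)
mitigated = annual_risk(attacks_per_year=2.0, p_success=0.05, loss_usd=5_000_000)

mitigation_cost = 250_000  # hypothetical annualized cost of the security control
benefit = baseline - mitigated
print(f"risk reduction: ${benefit:,.0f}/yr; benefit/cost = {benefit / mitigation_cost:.1f}")
```

A quantitative tool of the kind the authors describe would replace these point estimates with values derived from underlying engines such as attack trees or attack graphs.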

  3. Modelling stillbirth mortality reduction with the Lives Saved Tool

    Directory of Open Access Journals (Sweden)

    Hannah Blencowe

    2017-11-01

Abstract. Background: The worldwide burden of stillbirths is large, with an estimated 2.6 million babies stillborn in 2015, including 1.3 million dying during labour. The Every Newborn Action Plan set a stillbirth target of ≤12 per 1000 in all countries by 2030. Planning tools will be essential as countries set policy and plan investment to scale up interventions to meet this target. This paper summarises the approach taken for modelling the impact of scaling up health interventions on stillbirths in the Lives Saved Tool (LiST), and potential future refinements. Methods: The specific application to stillbirths of the general method for modelling the impact of interventions in LiST is described. The evidence for the effectiveness of potential interventions to reduce stillbirths is reviewed, and the assumptions about the affected fraction of stillbirths that could potentially benefit from these interventions are presented. The current assumptions and their effects on stillbirth reduction are described and potential future improvements discussed. Results: High-quality evidence is not available for all parameters in the LiST stillbirth model. Cause-specific mortality data are not available for stillbirths; therefore, stillbirths are modelled in LiST using an attributable-fraction approach by timing of stillbirth (antepartum/intrapartum). Of 35 potential interventions to reduce stillbirths identified, eight are currently modelled in LiST. These include childbirth care, induction for prolonged pregnancy, multiple micronutrient and balanced energy supplementation, malaria prevention, and detection and management of hypertensive disorders of pregnancy, diabetes and syphilis. For three of the interventions (childbirth care, and detection and management of hypertensive disorders of pregnancy and of diabetes), the estimate of effectiveness is based on expert opinion through a Delphi process. Only for malaria is coverage information available, with coverage
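The attributable-fraction calculation described in the Methods can be illustrated in a few lines. In the sketch below the burden figures come from the abstract (2.6 million stillbirths, roughly half intrapartum), while the affected fraction, effect size, and coverage gain are hypothetical placeholders, not LiST parameter values.

```python
# LiST-style impact sketch: stillbirths averted =
#   burden (by timing) x affected fraction x effect size x coverage increase.
# Burden figures are from the abstract; the other parameters are hypothetical.

intrapartum_stillbirths = 1_300_000  # of an estimated 2.6 million total in 2015
affected_fraction = 0.6   # hypothetical share addressable by childbirth care
effect_size = 0.45        # hypothetical intervention effectiveness
coverage_gain = 0.30      # hypothetical scale-up in coverage

averted = intrapartum_stillbirths * affected_fraction * effect_size * coverage_gain
print(f"intrapartum stillbirths averted: {averted:,.0f}")
```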

  4. Development of a simple estimation tool for LMFBR construction cost

    International Nuclear Information System (INIS)

    Yoshida, Kazuo; Kinoshita, Izumi

    1999-01-01

A simple tool for estimating the construction costs of liquid-metal-cooled fast breeder reactors (LMFBRs), 'Simple Cost', was developed in this study. Simple Cost is based on a new estimation formula that reduces the amount of design data required to estimate construction costs. Consequently, Simple Cost can be used to estimate the construction costs of innovative LMFBR concepts for which detailed design has not been carried out. The results of test calculations show that Simple Cost provides cost estimates equivalent to those obtained with conventional methods within the range of plant power from 325 to 1500 MWe. Sensitivity analyses for typical design parameters were conducted using Simple Cost. The effects of four major parameters - reactor vessel diameter, core outlet temperature, sodium handling area and number of secondary loops - on the construction costs of LMFBRs were evaluated quantitatively. The results show that the reduction of sodium handling area is particularly effective in reducing construction costs. (author)

  5. Toxicity Estimation Software Tool (TEST)

    Science.gov (United States)

The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure-Activity Relationship (QSAR) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...

  6. Tools to support GHG emissions reduction : a regional effort, part 1 - carbon footprint estimation and decision support.

    Science.gov (United States)

    2010-09-01

Tools are proposed for carbon footprint estimation of transportation construction projects and decision support for construction firms that must make equipment choice and usage decisions that affect profits, project duration and greenhouse gas em...

  7. Tools for estimating VMT reductions from built environment changes.

    Science.gov (United States)

    2013-06-01

Built environment characteristics are associated with walking, bicycling, transit use, and vehicle miles traveled (VMT). Developing built environments supportive of walking, bicycling, and transit use can help meet state VMT reduction goals. But ...

  8. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    Science.gov (United States)

    Roush, G. B.

    1994-01-01

The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Tailoring COSTMODL to any organization's particular environment can yield a significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse-sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
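Among the models the abstract lists, Basic COCOMO has a well-known published form: effort = a·KLOC^b person-months and schedule = c·effort^d months. The sketch below uses the standard organic-mode constants to show the kind of calculation COSTMODL automates; it is not COSTMODL's own code.

```python
# Basic COCOMO, organic mode (published textbook constants), one of the five
# estimation algorithms the abstract says COSTMODL incorporates.

def basic_cocomo_organic(kloc: float) -> tuple[float, float]:
    effort_pm = 2.4 * kloc ** 1.05          # effort in person-months
    schedule_mo = 2.5 * effort_pm ** 0.38   # development schedule in months
    return effort_pm, schedule_mo

effort, schedule = basic_cocomo_organic(32.0)   # hypothetical 32 KLOC product
print(f"effort ~ {effort:.0f} person-months over ~{schedule:.1f} months, "
      f"average staffing ~ {effort / schedule:.1f}")
```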

  9. Estimating Tool-Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool.

    Science.gov (United States)

    Zhao, Baoliang; Nelson, Carl A

    2016-10-01

Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch of the surgical site, surgeons may be prone to exerting larger forces and causing tissue damage. To quantify tool-tissue interaction forces, researchers have tried to attach different kinds of sensors to the surgical tools. This sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder the normal function of the tools; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method of estimating tool-tissue interaction forces using the driving motors' currents, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool-tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may enable a haptic surgical robot that is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool-tissue interaction forces in real time, thereby increasing surgical efficiency and safety.
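The sensorless idea in this abstract maps measured motor current through the drive train to a force at the tool tip. A hedged sketch of that mapping follows; the torque constant, gear ratio, efficiency, and moment arm are invented placeholders, and the paper's actual model and calibration are not reproduced here.

```python
# Current-based force estimation sketch: motor torque (torque constant x current)
# is reflected through the transmission to a force at the grasper jaw.
# All parameters are hypothetical, not the prototype's values.

def estimate_tip_force(current_a: float, k_t: float = 0.021,
                       gear_ratio: float = 64.0, efficiency: float = 0.85,
                       moment_arm_m: float = 0.012) -> float:
    motor_torque = k_t * current_a                    # N*m at the motor shaft
    joint_torque = motor_torque * gear_ratio * efficiency
    return joint_torque / moment_arm_m                # N at the jaw

print(f"estimated grasp force at 0.35 A: {estimate_tip_force(0.35):.1f} N")
```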

  10. GIS Tools to Estimate Average Annual Daily Traffic

    Science.gov (United States)

    2012-06-01

This project presents five tools that were created for a geographical information system to estimate Annual Average Daily Traffic using linear regression. Three of the tools can be used to prepare spatial data for linear regression. One tool can be...

  11. TEST (Toxicity Estimation Software Tool) Ver 4.1

    Science.gov (United States)

The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to allow users to easily estimate toxicity and physical properties using a variety of QSAR methodologies. T.E.S.T. allows a user to estimate toxicity without requiring any external programs. Users can input a chem...

  12. An Overview Of Tool For Response Action Cost Estimating (TRACE)

    International Nuclear Information System (INIS)

    Ferries, S.R.; Klink, K.L.; Ostapkowicz, B.

    2012-01-01

Tools and techniques that provide improved performance and reduced costs are important to government programs, particularly in current times. An opportunity for improvement was identified for the preparation of cost estimates used to support the evaluation of response action alternatives. As a result, CH2M HILL Plateau Remediation Company has developed the Tool for Response Action Cost Estimating (TRACE). TRACE is a multi-page Microsoft Excel® workbook developed to introduce efficiencies into the timely and consistent production of cost estimates for response action alternatives. This tool combines costs derived from extensive site-specific runs of commercially available remediation cost models with site-specific and estimator-researched and derived costs, providing the best estimating sources available. TRACE also provides for common quantity and key parameter links across multiple alternatives, maximizing the ease of updating estimates and performing sensitivity analyses, and ensuring consistency.

  13. Selection of portable tools for use in a size reduction facility

    International Nuclear Information System (INIS)

    Hawley, L.N.

    1986-07-01

A range of portable tools is identified for development and eventual use within a remote operations facility for the size reduction of plutonium-contaminated materials. The process of selection defines the work to be performed within the facility and matches this to the general categories of suitable tools. Specific commercial tools are then selected or, where none exist, proposals are made for the development of special tools. (author)

  14. The cardiovascular event reduction tool (CERT)--a simplified cardiac risk prediction model developed from the West of Scotland Coronary Prevention Study (WOSCOPS).

    Science.gov (United States)

    L'Italien, G; Ford, I; Norrie, J; LaPuerta, P; Ehreth, J; Jackson, J; Shepherd, J

    2000-03-15

The clinical decision to treat hypercholesterolemia is premised on an awareness of patient risk, and cardiac risk prediction models offer a practical means of determining such risk. However, these models are based on observational cohorts where estimates of the treatment benefit are largely inferred. The West of Scotland Coronary Prevention Study (WOSCOPS) provides an opportunity to develop a risk-benefit prediction model from the actual observed primary event reduction seen in the trial. Five-year Cox model risk estimates were derived from all WOSCOPS subjects (n = 6,595 men, aged 45 to 64 years at baseline) using factors previously shown to be predictive of definite fatal coronary heart disease or nonfatal myocardial infarction. Model risk factors included age, diastolic blood pressure, total cholesterol/high-density lipoprotein ratio (TC/HDL), current smoking, diabetes, family history of fatal coronary heart disease, nitrate use or angina, and treatment (placebo/40-mg pravastatin). All risk factors were expressed as categorical variables to facilitate risk assessment. Risk estimates were incorporated into a simple, hand-held slide rule or risk tool. Risk estimates were identified for 5-year age bands (45 to 65 years), 4 categories of TC/HDL ratio (the highest being > or = 7.5), 2 levels of diastolic blood pressure (< 90 and > or = 90 mm Hg), from 0 to 3 additional risk factors (current smoking, diabetes, family history of premature fatal coronary heart disease, nitrate use or angina), and pravastatin treatment. Five-year risk estimates ranged from 2% in very low-risk subjects to 61% in very high-risk subjects. Risk reduction due to pravastatin treatment averaged 31%. Thus, the Cardiovascular Event Reduction Tool (CERT) is a risk prediction model derived from the WOSCOPS trial. Its use will help physicians identify patients who will benefit from cholesterol reduction.
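A Cox-type categorical model of the kind CERT encodes can be sketched compactly: 5-year risk = 1 − S0(5)^exp(Σβᵢxᵢ). In the hedged Python sketch below, the baseline survival and coefficients are hypothetical placeholders rather than the fitted WOSCOPS values; only the treatment coefficient is chosen so that exp(−0.37) ≈ 0.69, mirroring the ~31% average risk reduction quoted above.

```python
# Cox-type categorical risk calculation of the kind CERT encodes:
#   5-year risk = 1 - S0(5) ** exp(sum_i beta_i * x_i).
# Baseline survival and coefficients are hypothetical placeholders,
# NOT the fitted WOSCOPS values.
import math

S0_5YR = 0.95  # hypothetical 5-year baseline survival

def five_year_risk(betas: dict, x: dict) -> float:
    lp = sum(betas[k] * x[k] for k in betas)  # linear predictor
    return 1.0 - S0_5YR ** math.exp(lp)

betas = {"age_band": 0.35, "tc_hdl_cat": 0.40, "dbp_high": 0.25,
         "extra_risk_factors": 0.45, "pravastatin": -0.37}
x = {"age_band": 2, "tc_hdl_cat": 3, "dbp_high": 1,
     "extra_risk_factors": 1, "pravastatin": 0}
print(f"5-year risk, untreated: {five_year_risk(betas, x):.1%}")
x["pravastatin"] = 1
print(f"5-year risk, treated:   {five_year_risk(betas, x):.1%}")
```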

  15. Careers Education: An Effective Tool for Poverty Reduction | Okafor ...

    African Journals Online (AJOL)

Careers Education: An Effective Tool for Poverty Reduction. The research was carried out based mainly on secondary sources of data. Among other things, the study ...

  16. MURMoT. Design and Application of Microbial Uranium Reduction Monitoring Tools

    Energy Technology Data Exchange (ETDEWEB)

Loeffler, Frank E. [Univ. of Tennessee, Knoxville, TN (United States)]

    2014-12-31

    Uranium (U) contamination in the subsurface is a major remediation challenge at many DOE sites. Traditional site remedies present enormous costs to DOE; hence, enhanced bioremediation technologies (i.e., biostimulation and bioaugmentation) combined with monitoring efforts are being considered as cost-effective corrective actions to address subsurface contamination. This research effort improved understanding of the microbial U reduction process and developed new tools for monitoring microbial activities. Application of these tools will promote science-based site management decisions that achieve contaminant detoxification, plume control, and long-term stewardship in the most efficient manner. The overarching hypothesis was that the design, validation and application of a suite of new molecular and biogeochemical tools advance process understanding, and improve environmental monitoring regimes to assess and predict in situ U immobilization. Accomplishments: This project (i) advanced nucleic acid-based approaches to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-detoxifying bacteria; (ii) developed proteomics workflows for detection of metal reduction biomarker proteins in laboratory cultures and contaminated site groundwater; (iii) developed and demonstrated the utility of U isotopic fractionation using high precision mass spectrometry to quantify U(VI) reduction for a range of reduction mechanisms and environmental conditions; and (iv) validated the new tools using field samples from U-contaminated IFRC sites, and demonstrated their prognostic and diagnostic capabilities in guiding decision making for environmental remediation and long-term site stewardship.

  17. An Accurate FFPA-PSR Estimator Algorithm and Tool for Software Effort Estimation

    Directory of Open Access Journals (Sweden)

    Senthil Kumar Murugesan

    2015-01-01

Software companies are now keen to provide secure software with respect to the accuracy and reliability of their products, especially in relation to software effort estimation. Therefore, there is a need to develop a hybrid tool which provides all the necessary features. This paper proposes a hybrid estimator algorithm and model which incorporates quality metrics, a reliability factor, and a security factor with fuzzy-based function point analysis. Initially, this method uses a fuzzy-based estimate to control the uncertainty in the software size with the help of a triangular fuzzy set at the early development stage. Secondly, the function point analysis is extended by the security and reliability factors in the calculation. Finally, the performance metrics are added to the effort estimation for accuracy. The experimentation is done with different project data sets on the hybrid tool, and the results are compared with existing models. It shows that the proposed method not only improves the accuracy but also increases the reliability, as well as the security, of the product.
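The triangular-fuzzy size estimate mentioned above can be made concrete with a small sketch. The defuzzification below uses the centroid of a triangular fuzzy number; the size values and the productivity figure are illustrative assumptions, not parameters from the paper.

```python
# Triangular fuzzy software-size estimate: size is (low, likely, high); the
# centroid defuzzifies it before the effort calculation. Values are illustrative.

def defuzzify_triangular(low: float, likely: float, high: float) -> float:
    """Centroid of a triangular fuzzy number."""
    return (low + likely + high) / 3.0

size_fp = defuzzify_triangular(low=280, likely=340, high=460)  # function points
effort_pm = size_fp / 18.0  # hypothetical productivity: 18 FP per person-month
print(f"defuzzified size: {size_fp:.0f} FP -> effort ~ {effort_pm:.1f} person-months")
```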

  18. Use of GIS in the estimation and development of risk reduction technology

    International Nuclear Information System (INIS)

    Ha, Jae Joo

    1998-03-01

The occurrence probability of a severe accident in a nuclear power plant is very small because the safety of the plant and the public is considered in the design and operation of a nuclear power plant. However, if a severe accident occurs, the establishment of a strategy for reducing the resulting damage is essential because its effect on humans and the environment is very large. The important criterion which determines the severity of an accident is risk, which is defined as the product of its frequency and its consequence. The establishment of countermeasures to estimate and reduce risks quantitatively can be a very powerful tool to minimize the effect of an accident on humans and the environment. Research on the establishment of a framework which integrates a geographic information system (GIS), a database management system (DBMS), and a decision-making support system (DMSS) is being pursued very actively. Based on these systems, we can accomplish the estimation and display of risks and the development of reduction methodologies, which are essential parts of the accident management of a nuclear power plant. The GIS plays a role in helping users systematize and comprehend the spatial relationships of the information needed for decision making. Through the DBMS, we can establish and manage spatial and attribute data, and use them in queries and selections. The DMSS is a computer-based information system which supports making the necessary decisions easily. In this study, we reviewed the fundamental concepts of a GIS and examined the methodology for its use in the estimation and display of risks. Also, we established the fundamental GIS platform of the Yonggwang site and the database systems necessary for the estimation of risks. (author). 17 refs., 9 tabs., 34 figs

  19. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    Science.gov (United States)

    1989-09-01

Thesis: An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System, by Caroline L. Hanson, Major, USAF. Air Force Institute of Technology, Wright-Patterson AFB, Ohio. Report No. AFIT/GCA/LSQ/89S-5.

  20. The Lives Saved Tool (LiST) as a model for diarrhea mortality reduction

    Science.gov (United States)

    2014-01-01

    Background Diarrhea is a leading cause of morbidity and mortality among children under five years of age. The Lives Saved Tool (LiST) is a model used to calculate deaths averted or lives saved by past interventions and for the purposes of program planning when costly and time consuming impact studies are not possible. Discussion LiST models the relationship between coverage of interventions and outputs, such as stunting, diarrhea incidence and diarrhea mortality. Each intervention directly prevents a proportion of diarrhea deaths such that the effect size of the intervention is multiplied by coverage to calculate lives saved. That is, the maximum effect size could be achieved at 100% coverage, but at 50% coverage only 50% of possible deaths are prevented. Diarrhea mortality is one of the most complex causes of death to be modeled. The complexity is driven by the combination of direct prevention and treatment interventions as well as interventions that operate indirectly via the reduction in risk factors, such as stunting and wasting. Published evidence is used to quantify the effect sizes for each direct and indirect relationship. Several studies have compared measured changes in mortality to LiST estimates of mortality change looking at different sets of interventions in different countries. While comparison work has generally found good agreement between the LiST estimates and measured mortality reduction, where data availability is weak, the model is less likely to produce accurate results. LiST can be used as a component of program evaluation, but should be coupled with more complete information on inputs, processes and outputs, not just outcomes and impact. Summary LiST is an effective tool for modeling diarrhea mortality and can be a useful alternative to large and expensive mortality impact studies. Predicting the impact of interventions or comparing the impact of more than one intervention without having to wait for the results of large and expensive
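The coverage-times-effect-size logic described above reduces to one line of arithmetic. The sketch below applies it with hypothetical numbers (none are LiST inputs) to show how deaths averted scale with a coverage increase.

```python
# LiST-style calculation sketched from the description above:
#   deaths averted = cause deaths x effect size x change in coverage.
# All numbers are hypothetical, not LiST inputs.

def deaths_averted(cause_deaths: float, effect_size: float,
                   cov_old: float, cov_new: float) -> float:
    return cause_deaths * effect_size * (cov_new - cov_old)

averted = deaths_averted(cause_deaths=10_000, effect_size=0.40,
                         cov_old=0.50, cov_new=0.75)
print(f"diarrhea deaths averted: {averted:.0f}")  # 40% effective, +25 points coverage
```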

  1. A MORET tool to assist code bias estimation

    International Nuclear Information System (INIS)

    Fernex, F.; Richet, Y.; Letang, E.

    2003-01-01

This new Graphical User Interface (GUI), developed in JAVA, is one of the post-processing tools for the MORET4 code. It aims to help users estimate the magnitude of the k-eff bias due to the code in order to better define the upper safety limit. Moreover, it allows visualizing the distance between an actual configuration case and evaluated critical experiments. This tool depends on a database of validated experiments, on sets of physical parameters and on various statistical tools allowing interpolation of the calculation bias over the database or display of the projections of experiments on a reduced base of parameters. The development of this tool is still in progress. (author)

  2. Estimate of Possible CO2 Emission Reduction in Slovenia

    International Nuclear Information System (INIS)

    Plavcak, V.-P.; Jevsek, F.; Tirsek, A.

    1998-01-01

The first estimate of the possible CO2 emission reduction, according to the obligations of the Kyoto Protocol, is prepared. The results show that the required 8% reduction of greenhouse gases in Slovenia in the period from 2008 to 2012, relative to the year 1986, will require a thorough analytical treatment not only in the electric power sector but also in the transport and industry sectors, which are the main polluters. (author)

  3. Energy Saving Melting and Revert Reduction Technology (E-SMARRT): Design Support for Tooling Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Dongtao

    2011-09-23

High pressure die casting is an intrinsically efficient net shape process and improvements in energy efficiency are strongly dependent on design and process improvements that reduce scrap rates so that more of the total consumed energy goes into acceptable, usable castings. Computer simulation has become widely used within the industry but use is not universal. Further, many key design decisions must be made before the simulation can be run, and expense in terms of money and time often limits the number of decision iterations that can be explored. This work continues several years of work creating simple, very fast design tools that can assist with the early-stage design decisions so that the benefits of simulation can be maximized and, more importantly, so that the chances of first-shot success are maximized. First-shot success and better-running processes contribute to less scrap and significantly better energy utilization by the process. This new technology was predicted to result in an average energy savings of 1.83 trillion BTUs/year over a 10-year period. The current (2011) annual energy saving estimate over a ten-year period, based on commercial introduction in 2012 and a market penetration of 30% by 2015, is 1.89 trillion BTUs/year by 2022. Along with these energy savings, reduction of scrap and improvement in yield will result in a reduction of the environmental emissions associated with the melting and pouring of the metal which will be saved as a result of this technology. The average annual estimate of CO2 reduction per year through 2022 is 0.037 Million Metric Tons of Carbon Equivalent (MM TCE).

  4. Estimation of toxicity using a Java based software tool

    Science.gov (United States)

A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  5. Estimating CO2 Emission Reduction of Non-capture CO2 Utilization (NCCU) Technology

    International Nuclear Information System (INIS)

    Lee, Ji Hyun; Lee, Dong Woog; Gyu, Jang Se; Kwak, No-Sang; Lee, In Young; Jang, Kyung Ryoung; Shim, Jae-Goo; Choi, Jong Shin

    2015-01-01

The potential CO2 emission reduction of non-capture CO2 utilization (NCCU) technology was evaluated. NCCU is a sodium bicarbonate production technology based on the carbonation reaction of the CO2 contained in flue gas. For estimating the CO2 emission reduction, a process simulation using a process simulator (PRO/II), based on a chemical plant which could handle 100 tons of CO2 per day, was performed. Also, for the estimation of the indirect CO2 reduction, the Solvay process, which is a conventional technology for the production of sodium carbonate/sodium bicarbonate, was studied. The results of the analysis showed that in the case of the Solvay process, overall CO2 emission was estimated as 48,862 tons per year based on the energy consumption for the production of NaHCO3 (7.4 GJ/t NaHCO3). For the NCCU technology, the direct CO2 reduction through CO2 carbonation was estimated as 36,500 tons per year and the indirect CO2 reduction through the lower energy consumption was 46,885 tons per year, which leads to 83,385 tons per year in total. From these results, it could be concluded that sodium bicarbonate production through the carbonation reaction of the CO2 contained in flue gas is energy efficient and could be one of the promising technologies for low CO2 emissions.
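The abstract's bottom line can be checked with trivial arithmetic: the direct reduction is the captured 100 t/day over a year, and the total adds the indirect energy-related reduction. The sketch below simply restates the abstract's own figures.

```python
# Restating the abstract's arithmetic: direct reduction from CO2 carbonation
# plus indirect reduction from lower energy use than the Solvay baseline.

co2_handled_per_day = 100            # tons CO2/day in the modeled plant
direct = co2_handled_per_day * 365   # = 36,500 t/yr, as stated above
indirect = 46_885                    # t/yr from reduced energy consumption
solvay_emission = 48_862             # t/yr emitted by the Solvay route, for comparison
print(f"direct: {direct:,} t/yr; total: {direct + indirect:,} t/yr")  # 83,385 t/yr
```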

  6. RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

    Energy Technology Data Exchange (ETDEWEB)

    Robert S. Balch; Ron Broadhead

    2005-03-01

Incomplete or sparse data such as geologic or formation characteristics introduce a high level of risk for oil exploration and development projects. "Expert" systems developed and used in several disciplines and industries have demonstrated beneficial results when working with sparse data. State-of-the-art expert exploration tools, relying on a database, and computer maps generated by neural networks and user inputs, have been developed through the use of "fuzzy" logic, a mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk has been reduced with the use of these properly verified and validated "Fuzzy Expert Exploration (FEE) Tools." Through the course of this project, FEE Tools and supporting software were developed for two producing formations in southeast New Mexico. Tools of this type can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In today's oil industry environment, many smaller exploration companies lack the resources of a pool of expert exploration personnel. Downsizing, volatile oil prices, and scarcity of domestic exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The FEE Tools benefit a diverse group in the U.S., allowing a more efficient use of scarce funds, and potentially reducing dependence on foreign oil and providing lower product prices for consumers.

  7. Bats: A new tool for AMS data reduction

    International Nuclear Information System (INIS)

    Wacker, L.; Christl, M.; Synal, H.-A.

    2010-01-01

A data evaluation program was developed at ETH Zurich to meet the requirements of the new compact AMS systems MICADAS and TANDY in addition to the large EN-Tandem accelerator. The program, called 'BATS', is designed to automatically calculate standard and blank corrected results for measured samples. After almost one year of routine operation with the MICADAS C-14 system, BATS has proven to be an easy-to-use data reduction tool that requires minimal user input. Here we present the fundamental principle and the algorithms used in BATS for standard-sized radiocarbon measurements.
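The standard and blank correction BATS automates has, at its simplest, a one-line form: blank-subtract the measured isotope ratio and normalize it to the standard. The sketch below shows that simplified normalization with invented ratio values; BATS' actual algorithms (drift correction, error propagation) are more involved.

```python
# Simplified standard/blank normalization of a measured 14C/12C ratio.
# Ratio values are invented; BATS also handles drift and error propagation.

def normalize_f14c(r_sample: float, r_blank: float, r_standard: float,
                   f_standard: float = 1.0) -> float:
    return f_standard * (r_sample - r_blank) / (r_standard - r_blank)

f14c = normalize_f14c(r_sample=0.652e-12, r_blank=0.004e-12, r_standard=1.042e-12)
print(f"F14C = {f14c:.3f}")
```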

  8. Mapping grey matter reductions in schizophrenia: an anatomical likelihood estimation analysis of voxel-based morphometry studies.

    Science.gov (United States)

    Fornito, A; Yücel, M; Patti, J; Wood, S J; Pantelis, C

    2009-03-01

Voxel-based morphometry (VBM) is a popular tool for mapping neuroanatomical changes in schizophrenia patients. Several recent meta-analyses have identified the brain regions in which patients most consistently show grey matter reductions, although they have not examined whether such changes reflect differences in grey matter concentration (GMC) or grey matter volume (GMV). These measures assess different aspects of grey matter integrity, and may therefore reflect different pathological processes. In this study, we used the Anatomical Likelihood Estimation procedure to analyse significant differences reported in 37 VBM studies of schizophrenia patients, incorporating data from 1646 patients and 1690 controls, and compared the findings of studies using either GMC or GMV to index grey matter differences. Analysis of all studies combined indicated that grey matter reductions in a network of frontal, temporal, thalamic and striatal regions are among the most frequently reported in the literature. GMC reductions were generally larger and more consistent than GMV reductions, and were more frequent in the insula, medial prefrontal, medial temporal and striatal regions. GMV reductions were more frequent in dorso-medial frontal cortex, and lateral and orbital frontal areas. These findings support the primacy of frontal, limbic, and subcortical dysfunction in the pathophysiology of schizophrenia, and suggest that the grey matter changes observed with MRI may not necessarily result from a unitary pathological process.

  9. Reduction of variance in spectral estimates for correction of ultrasonic aberration.

    Science.gov (United States)

    Astheimer, Jeffrey P; Pilkington, Wayne C; Waag, Robert C

    2006-01-01

    A variance reduction factor is defined to describe the rate of convergence and accuracy of spectra estimated from overlapping ultrasonic scattering volumes when the scattering is from a spatially uncorrelated medium. Assuming that the individual volumes are localized by a spherically symmetric Gaussian window and that centers of the volumes are located on orbits of an icosahedral rotation group, the factor is minimized by adjusting the weight and radius of each orbit. Conditions necessary for the application of the variance reduction method, particularly for statistical estimation of aberration, are examined. The smallest possible value of the factor is found by allowing an unlimited number of centers constrained only to be within a ball rather than on icosahedral orbits. Computations using orbits formed by icosahedral vertices, face centers, and edge midpoints with a constraint radius limited to a small multiple of the Gaussian width show that a significant reduction of variance can be achieved from a small number of centers in the confined volume and that this reduction is nearly the maximum obtainable from an unlimited number of centers in the same volume.
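The core idea, that averaging spectra estimated from N independent scattering volumes shrinks the estimator variance, can be demonstrated generically. The toy Monte Carlo below shows the roughly 1/N decay; it is not the paper's icosahedral-orbit construction, just the underlying principle applied to an uncorrelated medium.

```python
# Toy demonstration: averaging periodograms from N independent realizations of an
# uncorrelated (white) scattering model reduces estimator variance roughly as 1/N.
import numpy as np

rng = np.random.default_rng(0)
N_SAMPLES, N_TRIALS, BIN = 256, 500, 10

def averaged_periodogram(n_avg: int) -> np.ndarray:
    x = rng.standard_normal((n_avg, N_SAMPLES))          # uncorrelated medium
    return (np.abs(np.fft.rfft(x, axis=1)) ** 2 / N_SAMPLES).mean(axis=0)

for n in (1, 4, 16, 64):
    estimates = np.array([averaged_periodogram(n)[BIN] for _ in range(N_TRIALS)])
    print(f"N={n:2d}: variance of spectral estimate = {estimates.var():.4f}")
```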

  10. Estimation of tool wear during CNC milling using neural network-based sensor fusion

    Science.gov (United States)

    Ghosh, N.; Ravi, Y. B.; Patra, A.; Mukhopadhyay, S.; Paul, S.; Mohanty, A. R.; Chattopadhyay, A. B.

    2007-01-01

Cutting tool wear degrades the product quality in manufacturing processes. Online monitoring of the tool wear value is therefore needed to prevent degradation in machining quality. Unfortunately there is no direct way of measuring the tool wear online. Therefore one has to adopt an indirect method wherein the tool wear is estimated from several sensors measuring related process variables. In this work, a neural network-based sensor fusion model has been developed for tool condition monitoring (TCM). Features extracted from a number of machining zone signals, namely cutting forces, spindle vibration, spindle current, and sound pressure level, have been fused to estimate the average flank wear of the main cutting edge. Novel strategies such as signal-level segmentation for temporal registration, feature-space filtering, outlier removal, and estimation-space filtering have been proposed. The proposed approach has been validated by both laboratory and industrial implementations.
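A minimal stand-in for the sensor-fusion regressor described above can be built with scikit-learn. The sketch below fuses four synthetic feature channels into a wear estimate; the data generation is invented and stands in for the paper's force, vibration, current, and sound features.

```python
# Neural-network sensor fusion sketch: features from several sensor channels
# feed one regressor that estimates flank wear. Data are synthetic stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))   # [force, vibration, current, sound] features
wear = 0.1 + 0.05 * X[:, 0] + 0.03 * X[:, 1] + 0.02 * X[:, 2] \
       + 0.01 * rng.normal(size=200)

Xs = StandardScaler().fit_transform(X)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                     random_state=0).fit(Xs[:150], wear[:150])
print(f"held-out R^2: {model.score(Xs[150:], wear[150:]):.2f}")
```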

  11. Real-Time Estimation for Cutting Tool Wear Based on Modal Analysis of Monitored Signals

    Directory of Open Access Journals (Sweden)

    Yongjiao Chi

    2018-05-01

There is a growing body of literature recognizing the importance of product safety and of quality problems during processing. An accidental breakdown of cutting tools may lead to project delay and cost overrun, and tool wear is crucial to processing precision in mechanical manufacturing; this study therefore contributes to this growing area of research by monitoring tool condition and estimating wear. In this research, an effective method for tool wear estimation was constructed, in which the signal features of the machining process were extracted by ensemble empirical mode decomposition (EEMD) and used to estimate the tool wear. Based on signal analysis, vibration signals that had a better linear relationship with the tool wearing process were decomposed, and then the intrinsic mode functions (IMFs), the frequency spectra of the IMFs, and the features relating to amplitude changes of the frequency spectra were obtained. The trend by which tool wear changes with the features was fitted by a Gaussian fitting function to estimate the tool wear. An experimental investigation was used to verify the effectiveness of this method, and the results illustrated the correlation between tool wear and the modal features of the monitored signals.
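The final step described above, fitting a Gaussian function to the feature-versus-wear trend, is easy to sketch with SciPy. The data below are synthetic and the EEMD feature extraction is not reproduced; only the curve-fitting step is shown.

```python
# Gaussian fit of a monitored-signal feature against tool wear (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma):
    return a * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

wear = np.linspace(0.0, 0.3, 30)  # flank wear, mm (synthetic)
feature = gaussian(wear, a=2.0, mu=0.35, sigma=0.2) \
          + 0.02 * np.random.default_rng(2).normal(size=wear.size)

popt, _ = curve_fit(gaussian, wear, feature, p0=(1.5, 0.3, 0.25))
print("fitted (a, mu, sigma):", np.round(popt, 3))
```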

  12. Estimated emission reductions from California's enhanced Smog Check program.

    Science.gov (United States)

    Singer, Brett C; Wenzel, Thomas P

    2003-06-01

    The U.S. Environmental Protection Agency requires that states evaluate the effectiveness of their vehicle emissions inspection and maintenance (I/M) programs. This study demonstrates an evaluation approach that estimates mass emission reductions over time and includes the effect of I/M on vehicle deterioration. It includes a quantitative assessment of benefits from pre-inspection maintenance and repairs and accounts for the selection bias effect that occurs when intermittent high emitters are tested. We report estimates of one-cycle emission benefits of California's Enhanced Smog Check program, ca. 1999. Program benefits equivalent to metric tons per day of prevented emissions were calculated with a "bottom-up" approach that combined average per vehicle reductions in mass emission rates (g/gal) with average per vehicle activity, resolved by model year. Accelerated simulation mode test data from the statewide vehicle information database (VID) and from roadside Smog Check testing were used to determine 2-yr emission profiles of vehicles passing through Smog Check and infer emission profiles that would occur without Smog Check. The number of vehicles participating in Smog Check was also determined from the VID. We estimate that in 1999 Smog Check reduced tailpipe emissions of HC, CO, and NO(x) by 97, 1690, and 81 t/d, respectively. These correspond to 26, 34, and 14% of the HC, CO, and NO(x) that would have been emitted by vehicles in the absence of Smog Check. These estimates are highly sensitive to assumptions about vehicle deterioration in the absence of Smog Check. Considering the estimated uncertainty in these assumptions yields a range for calculated benefits: 46-128 t/d of HC, 860-2200 t/d of CO, and 60-91 t/d of NO(x). Repair of vehicles that failed an initial, official Smog Check appears to be the most important mechanism of emission reductions, but pre-inspection maintenance and repair also contributed substantially. Benefits from removal of nonpassing
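The "bottom-up" calculation the authors describe multiplies per-vehicle emission-rate reductions (g/gal) by per-vehicle fuel use and fleet counts, resolved by model year. The sketch below shows the structure of that sum with entirely hypothetical fleet numbers.

```python
# Bottom-up benefit: sum over model-year groups of
#   vehicles x emission-rate reduction (g/gal) x fuel use (gal/day),
# converted to metric tons/day. All numbers are hypothetical.

fleet = [  # (group, vehicles, HC reduction g/gal, gal/day per vehicle)
    ("pre-1990", 1_500_000, 6.0, 1.6),
    ("1990-95",  3_000_000, 3.0, 1.4),
    ("1996+",    5_000_000, 0.8, 1.2),
]
tons_per_day = sum(n * dg * gal for _, n, dg, gal in fleet) / 1e6  # g -> t
print(f"HC benefit: {tons_per_day:.1f} t/d")
```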

  13. Temporal rainfall estimation using input data reduction and model inversion

    Science.gov (United States)

    Wright, A. J.; Vrugt, J. A.; Walker, J. P.; Pauwels, V. R. N.

    2016-12-01

    Floods are devastating natural hazards. To provide accurate, precise and timely flood forecasts there is a need to understand the uncertainties associated with temporal rainfall and model parameters. The estimation of temporal rainfall and model parameter distributions from streamflow observations in complex dynamic catchments adds skill to current areal rainfall estimation methods, allows for the uncertainty of rainfall input to be considered when estimating model parameters and provides the ability to estimate rainfall from poorly gauged catchments. Current methods to estimate temporal rainfall distributions from streamflow are unable to adequately explain and invert complex non-linear hydrologic systems. This study uses the Discrete Wavelet Transform (DWT) to reduce rainfall dimensionality for the catchment of Warwick, Queensland, Australia. The reduction of rainfall to DWT coefficients allows the input rainfall time series to be simultaneously estimated along with model parameters. The estimation process is conducted using multi-chain Markov chain Monte Carlo simulation with the DREAMZS algorithm. The use of a likelihood function that considers both rainfall and streamflow error allows for model parameter and temporal rainfall distributions to be estimated. Estimation of the wavelet approximation coefficients of lower order decomposition structures was able to estimate the most realistic temporal rainfall distributions. These rainfall estimates were all able to simulate streamflow that was superior to the results of a traditional calibration approach. It is shown that the choice of wavelet has a considerable impact on the robustness of the inversion. The results demonstrate that streamflow data contains sufficient information to estimate temporal rainfall and model parameter distributions. The extent and variance of rainfall time series that are able to simulate streamflow that is superior to that simulated by a traditional calibration approach is a
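The dimensionality-reduction step described above, representing the rainfall series by a few low-order DWT approximation coefficients, can be sketched with PyWavelets. The wavelet choice, decomposition depth, and synthetic series below are illustrative, not the study's configuration.

```python
# DWT dimensionality reduction sketch: keep only the level-4 approximation
# coefficients of a synthetic rainfall series, then reconstruct.
import numpy as np
import pywt

rng = np.random.default_rng(3)
rain = np.maximum(0.0, rng.gamma(0.3, 4.0, size=256) - 1.0)  # synthetic storm series

coeffs = pywt.wavedec(rain, "db4", level=4)
approx_only = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
rain_smooth = pywt.waverec(approx_only, "db4")[: rain.size]

print(f"parameters kept: {coeffs[0].size} of {rain.size}; "
      f"mean abs reconstruction error: {np.abs(rain_smooth - rain).mean():.2f} mm")
```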

  14. A Tool for Estimating Variability in Wood Preservative Treatment Retention

    Science.gov (United States)

    Patricia K. Lebow; Adam M. Taylor; Timothy M. Young

    2015-01-01

    Composite sampling is standard practice for evaluation of preservative retention levels in preservative-treated wood. Current protocols provide an average retention value but no estimate of uncertainty. Here we describe a statistical method for calculating uncertainty estimates using the standard sampling regime with minimal additional chemical analysis. This tool can...

  15. Planning Tools For Estimating Radiation Exposure At The National Ignition Facility

    International Nuclear Information System (INIS)

    Verbeke, J.; Young, M.; Brereton, S.; Dauffy, L.; Hall, J.; Hansen, L.; Khater, H.; Kim, S.; Pohl, B.; Sitaraman, S.

    2010-01-01

A set of computational tools was developed to help estimate and minimize potential radiation exposure to workers from material activation in the National Ignition Facility (NIF). AAMI (Automated ALARA-MCNP Interface) provides an efficient, automated mechanism to perform the series of calculations required to create dose rate maps for the entire facility with minimal manual user input. NEET (NIF Exposure Estimation Tool) is a web application that combines the information computed by AAMI with a given shot schedule to compute and display the dose rate maps as a function of time. AAMI and NEET are currently used as work planning tools to determine stay-out times for workers following a given shot or set of shots, and to help in estimating integrated doses associated with performing various maintenance activities inside the target bay. Dose rate maps of the target bay were generated following a low-yield 10^16 D-T shot and will be presented in this paper.

  16. A Web-Based Tool to Estimate Pollutant Loading Using LOADEST

    Directory of Open Access Journals (Sweden)

    Youn Shik Park

    2015-09-01

Collecting and analyzing water quality samples is costly and typically requires significant effort compared to streamflow data; thus, water quality data are typically collected at a low frequency. Regression models, identifying a relationship between streamflow and water quality data, are often used to estimate pollutant loads. A web-based tool using the LOAD ESTimator (LOADEST) as a core engine, with four modules, was developed to provide user-friendly interfaces and input data collection via web access. The first module requests and receives streamflow and water quality data from the U.S. Geological Survey. The second module retrieves the watershed area for computation of pollutant loads per unit area. The third module examines potential errors in the input datasets for LOADEST runs, and the last module computes estimated and allowable annual average pollutant loads and provides tabular and graphical LOADEST outputs. The web-based tool was applied to two watersheds in this study, one agriculturally dominated and one urban dominated. It was found that the annual sediment load at the urban-dominated watershed exceeded the target load; therefore, the web-based tool correctly identified the watershed requiring best management practices to reduce pollutant loads.
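At its simplest, a LOADEST-style rating curve regresses ln(load) on ln(streamflow) for the sampled days and then applies the fit to the continuous flow record. The sketch below shows that simplest model form with synthetic data; the actual LOADEST estimators (e.g., AMLE with bias correction) are more elaborate.

```python
# Simplest LOADEST-style rating curve: ln(load) = a0 + a1*ln(Q), fitted on
# sampled days and applied to a full-year flow record. Data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
q_sampled = rng.lognormal(2.0, 0.7, size=40)       # streamflow on sampled days
load_sampled = np.exp(1.2 + 0.9 * np.log(q_sampled) + 0.2 * rng.normal(size=40))

a1, a0 = np.polyfit(np.log(q_sampled), np.log(load_sampled), 1)
q_daily = rng.lognormal(2.0, 0.7, size=365)        # continuous daily flow record
annual_load = np.exp(a0 + a1 * np.log(q_daily)).sum()
print(f"estimated annual load: {annual_load:,.0f} (arbitrary units)")
```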

  17. Bats: A new tool for AMS data reduction

    Energy Technology Data Exchange (ETDEWEB)

Wacker, L., E-mail: Wacker@phys.ethz.c [Ion Beam Physics, ETH Zurich (Switzerland)]; Christl, M.; Synal, H.-A. [Ion Beam Physics, ETH Zurich (Switzerland)]

    2010-04-15

A data evaluation program was developed at ETH Zurich to meet the requirements of the new compact AMS systems MICADAS and TANDY in addition to the large EN-Tandem accelerator. The program, called 'BATS', is designed to automatically calculate standard and blank corrected results for measured samples. After almost one year of routine operation with the MICADAS C-14 system, BATS has proven to be an easy-to-use data reduction tool that requires minimal user input. Here we present the fundamental principle and the algorithms used in BATS for standard-sized radiocarbon measurements.

  18. Development of pollution reduction strategies for Mexico City: Estimating cost and ozone reduction effectiveness

    International Nuclear Information System (INIS)

    Thayer, G.R.; Hardie, R.W.; Barrera-Roldan, A.

    1993-01-01

This paper reports on the collection and preparation of data (costs and air quality improvement) for the strategic evaluation portion of the Mexico City Air Quality Research Initiative (MARI). Reports written for the Mexico City government by various international organizations were used to identify proposed options along with estimates of cost and emission reductions. Information from appropriate options identified by the SCAQMD for Southern California was also used in the analysis. A linear optimization method was used to select a group of options, or a strategy, to be evaluated by decision analysis. However, the reduction of ozone levels is not a linear function of the reduction of hydrocarbon and NOx emissions. Therefore, a more detailed analysis was required for ozone. An equation for a plane on an isopleth calculated with a trajectory model was obtained using two endpoints that bracket the expected total ozone precursor reductions plus the starting concentrations for hydrocarbons and NOx. The relationship between ozone levels and the hydrocarbon and NOx concentrations was assumed to lie on this plane. This relationship was used in the linear optimization program to select the options comprising a strategy
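The planar approximation described above can be reproduced in a few lines: fit a plane O3 ≈ c0 + c1·HC + c2·NOx through isopleth points bracketing the expected precursor reductions. The three (HC, NOx, O3) points below are hypothetical stand-ins for trajectory-model output.

```python
# Fit a plane O3 ~ c0 + c1*HC + c2*NOx through three isopleth points
# (hypothetical values standing in for trajectory-model output).
import numpy as np

pts = np.array([[1.00, 1.00, 0.180],   # (HC, NOx, O3), normalized precursors
                [0.70, 0.85, 0.150],
                [0.55, 0.70, 0.128]])
A = np.c_[np.ones(3), pts[:, 0], pts[:, 1]]
c0, c1, c2 = np.linalg.solve(A, pts[:, 2])
print(f"O3 ~ {c0:.3f} + {c1:.3f}*HC + {c2:.3f}*NOx")
```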

  19. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  20. Artificial Neural Network-Based Clutter Reduction Systems for Ship Size Estimation in Maritime Radars

    Directory of Open Access Journals (Sweden)

    M. P. Jarabo-Amores

    2010-01-01

The existence of clutter in maritime radars deteriorates the estimation of some physical parameters of the objects detected over the sea surface. For that reason, maritime radars should incorporate efficient clutter reduction techniques. Due to the intrinsic nonlinear dynamics of sea clutter, nonlinear signal processing is needed, which can be achieved by artificial neural networks (ANNs). In this paper, an estimation of the ship size using an ANN-based clutter reduction system followed by a fixed threshold is proposed. High clutter reduction rates are achieved using 1-dimensional (horizontal or vertical) integration modes, although inaccurate ship width estimations are obtained. These estimations are improved using a 2-dimensional (rhombus) integration mode. The proposed system is compared with a CA-CFAR system, showing a great performance improvement and a great robustness against changes in sea clutter conditions and ship parameters, independently of the direction of movement of the ocean waves and ships.
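The size-estimation step that follows the clutter-reduction stage can be illustrated without the ANN: after suppression, a fixed threshold segments the ship return, and its extent in range bins gives the length. Everything below, including the range resolution, is synthetic and illustrative.

```python
# Toy size estimation after clutter reduction: threshold the residual profile
# and convert the segmented extent to meters. Synthetic 1-D range profile.
import numpy as np

rng = np.random.default_rng(6)
profile = np.abs(rng.normal(0, 0.15, 200))       # residual clutter after suppression
profile[90:104] += 1.0                           # ship return spanning 14 bins

bins = np.flatnonzero(profile > 0.6)             # fixed threshold
length_m = (bins.max() - bins.min() + 1) * 7.5   # hypothetical 7.5 m range resolution
print(f"estimated ship length: {length_m:.0f} m")
```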

  1. GARDEC, Estimation of dose-rates reduction by garden decontamination

    International Nuclear Information System (INIS)

    Togawa, Orihiko

    2006-01-01

1 - Description of program or function: GARDEC estimates the reduction of dose rates by garden decontamination. It provides the effects of different decontamination methods, the depth of soil to be considered, the dose rates before and after decontamination, and the reduction factor. 2 - Methods: This code takes into account three methods of decontamination: (i) digging the garden in a special way, (ii) removal of the upper layer of soil, and (iii) covering with a shielding layer of soil. The dose-rate conversion factor is defined as the external dose rate in the air at a given height above the ground from a unit concentration of a specific radionuclide in each soil layer
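For the third method, covering with a shielding soil layer, the dose rate falls roughly exponentially with shield thickness. The sketch below illustrates that generic attenuation behaviour; the coefficient is an illustrative value, not one of GARDEC's dose-rate conversion factors.

```python
# Generic shielding illustration: dose-rate reduction factor for a soil cover of
# thickness t is ~exp(-mu * t). MU_SOIL is illustrative, not a GARDEC factor.
import math

MU_SOIL = 0.12  # hypothetical effective attenuation coefficient, 1/cm

def reduction_factor(thickness_cm: float) -> float:
    return math.exp(-MU_SOIL * thickness_cm)

for t in (5, 10, 20):
    print(f"{t:2d} cm cover: dose rate x {reduction_factor(t):.2f}")
```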

  2. Estimating CO2 Emission Reduction of Non-capture CO2 Utilization (NCCU) Technology

    Energy Technology Data Exchange (ETDEWEB)

Lee, Ji Hyun; Lee, Dong Woog; Gyu, Jang Se; Kwak, No-Sang; Lee, In Young; Jang, Kyung Ryoung; Shim, Jae-Goo [KEPCO Research Institute, Daejon (Korea, Republic of)]; Choi, Jong Shin [Korea East-West Power Co., LTD(ETP), Ulsan (Korea, Republic of)]

    2015-10-15

The potential CO2 emission reduction of non-capture CO2 utilization (NCCU) technology was evaluated. NCCU is a sodium bicarbonate production technology based on the carbonation reaction of the CO2 contained in flue gas. For estimating the CO2 emission reduction, a process simulation using a process simulator (PRO/II), based on a chemical plant which could handle 100 tons of CO2 per day, was performed. Also, for the estimation of the indirect CO2 reduction, the Solvay process, which is a conventional technology for the production of sodium carbonate/sodium bicarbonate, was studied. The results of the analysis showed that in the case of the Solvay process, overall CO2 emission was estimated as 48,862 tons per year based on the energy consumption for the production of NaHCO3 (7.4 GJ/t NaHCO3). For the NCCU technology, the direct CO2 reduction through CO2 carbonation was estimated as 36,500 tons per year and the indirect CO2 reduction through the lower energy consumption was 46,885 tons per year, which leads to 83,385 tons per year in total. From these results, it could be concluded that sodium bicarbonate production through the carbonation reaction of the CO2 contained in flue gas is energy efficient and could be one of the promising technologies for low CO2 emissions.

  3. Estimating mortality risk reduction and economic benefits from controlling ozone air pollution

    National Research Council Canada - National Science Library

    Committee on Estimating Mortality Risk Reduction Benefits from Decreasing Tropospheric Ozone Exposure

    2008-01-01

    ... in life expectancy, and to assess methods for estimating the monetary value of the reduced risk of premature death and increased life expectancy in the context of health-benefits analysis. Estimating Mortality Risk Reduction and Economic Benefits from Controlling Ozone Air Pollution details the committee's findings and posits several recommendations to address these issues.

  4. A tool for the estimation of the distribution of landslide area in R

    Science.gov (United States)

    Rossi, M.; Cardinali, M.; Fiorucci, F.; Marchesini, I.; Mondini, A. C.; Santangelo, M.; Ghosh, S.; Riguer, D. E. L.; Lahousse, T.; Chang, K. T.; Guzzetti, F.

    2012-04-01

We have developed a tool in R (the free software environment for statistical computing, http://www.r-project.org/) to estimate the probability density and the frequency density of landslide area. The tool implements parametric and non-parametric approaches to the estimation of the probability density and the frequency density of landslide area, including: (i) Histogram Density Estimation (HDE), (ii) Kernel Density Estimation (KDE), and (iii) Maximum Likelihood Estimation (MLE). The tool is available as a standard Open Geospatial Consortium (OGC) Web Processing Service (WPS), and is accessible through the web using different GIS software clients. We tested the tool to compare Double Pareto and Inverse Gamma models for the probability density of landslide area in different geological, morphological and climatological settings, and to compare landslides shown in inventory maps prepared using different mapping techniques, including (i) field mapping, (ii) visual interpretation of monoscopic and stereoscopic aerial photographs, (iii) visual interpretation of monoscopic and stereoscopic VHR satellite images and (iv) semi-automatic detection and mapping from VHR satellite images. Results show that both models are applicable in different geomorphological settings. In most cases the two models provided very similar results. Non-parametric estimation methods (i.e., HDE and KDE) provided reasonable results for all the tested landslide datasets. For some of the datasets, MLE failed to provide a result due to convergence problems. The two tested models (Double Pareto and Inverse Gamma) produced very similar results for large and very large datasets (> 150 samples). Differences in the modeling results were observed for small datasets affected by systematic biases. A distinct rollover was observed in all analyzed landslide datasets, except for a few datasets obtained from landslide inventories prepared through field mapping or by semi-automatic mapping from VHR satellite imagery
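The three estimation approaches the tool implements can be exercised on synthetic data with SciPy (the tool itself is in R; Python is used here only for illustration). The sketch below draws synthetic landslide areas, builds the histogram (HDE) and kernel (KDE) estimates, and fits the Inverse Gamma model by maximum likelihood; all values are illustrative.

```python
# HDE, KDE, and MLE on synthetic landslide areas (illustrative values only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
areas = stats.invgamma.rvs(a=1.4, scale=1e3, size=500, random_state=rng)  # m^2

hist, edges = np.histogram(areas, bins="auto", density=True)   # HDE
kde = stats.gaussian_kde(np.log(areas))                        # KDE on log-areas
a_hat, loc_hat, scale_hat = stats.invgamma.fit(areas, floc=0)  # MLE, Inverse Gamma

print(f"HDE bins: {hist.size}; KDE density at median log-area: "
      f"{kde(np.log(np.median(areas)))[0]:.3f}")
print(f"MLE Inverse Gamma: a = {a_hat:.2f}, scale = {scale_hat:.0f}")
```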

  5. Parameter Estimation of the Thermal Network Model of a Machine Tool Spindle by Self-made Bluetooth Temperature Sensor Module

    Directory of Open Access Journals (Sweden)

    Yuan-Chieh Lo

    2018-02-01

Thermal characteristic analysis is essential for machine tool spindles because sudden failures may occur due to unexpected thermal issues. This article presents a lumped-parameter Thermal Network Model (TNM) and its parameter estimation scheme, including hardware and software, in order to characterize both the steady-state and transient thermal behavior of machine tool spindles. For the hardware, the authors developed a Bluetooth Temperature Sensor Module (BTSM) accompanying three types of temperature-sensing probes (magnetic, screw, and probe). Its specification, through experimental test, achieves a precision of ±(0.1 + 0.0029|t|) °C, a resolution of 0.00489 °C, a power consumption of 7 mW, and a size of Ø40 mm × 27 mm. For the software, the heat transfer characteristics of the machine tool spindle as they correlate with rotating speed are derived based on the theory of heat transfer and empirical formulae. The predictive TNM of spindles was developed from grey-box estimation and experimental results. Even under such complicated operating conditions as various speeds and different initial conditions, the experiments validate that the present modeling methodology provides a robust and reliable tool for temperature prediction, with 99.5% agreement in terms of normalized mean square error, and the present approach is transferable to other spindles with a similar structure. For realizing edge computing in smart manufacturing, a reduced-order TNM is constructed by a Model Order Reduction (MOR) technique and implemented in the real-time embedded system.
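A lumped-parameter thermal network of the kind the abstract describes reduces to a small ODE system. The two-node sketch below (bearing and housing nodes, with resistances to each other and to ambient) integrates with explicit Euler; every parameter value is a hypothetical placeholder, not one of the paper's estimates.

```python
# Two-node lumped thermal network: bearing and housing nodes with thermal
# resistances/capacitances, integrated by explicit Euler. Values hypothetical.
import numpy as np

C = np.array([350.0, 900.0])   # J/K capacitances: [bearing, housing]
R_BH, R_HA = 0.8, 1.5          # K/W resistances: bearing-housing, housing-ambient
Q_BEARING, T_AMB = 20.0, 25.0  # W heat generation (speed-dependent), ambient degC

T = np.array([25.0, 25.0])     # initial temperatures, degC
dt = 1.0                       # s
for _ in range(7200):          # two hours of spindle operation
    q_bh = (T[0] - T[1]) / R_BH           # heat flow bearing -> housing
    q_ha = (T[1] - T_AMB) / R_HA          # heat flow housing -> ambient
    T = T + dt * np.array([(Q_BEARING - q_bh) / C[0], (q_bh - q_ha) / C[1]])

print(f"after 2 h: bearing {T[0]:.1f} degC, housing {T[1]:.1f} degC")
```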

  6. The Influence of Tool Texture on Friction and Lubrication in Strip Reduction Testing

    DEFF Research Database (Denmark)

    Sulaiman, Mohd Hafis Bin; Christiansen, Peter; Bay, Niels Oluf

    2017-01-01

    While texturing of workpiece surfaces to promote lubrication in metal forming has been applied for several decades, tool surface texturing is rather new. In the present paper, tool texturing is studied as a method to prevent galling. A strip reduction test was conducted with tools provided with shallow, longitudinal pockets oriented perpendicular to the sliding direction. The pockets had small angles to the workpiece surface, and the distance between them was varied. The experiments reveal that the distance between pockets should be larger than the pocket width, thereby creating a topography similar to flat table mountains, to avoid mechanical interlocking in the valleys; otherwise, an increase in drawing load and pick-up on the tools is observed. The textured tool surface lowers friction and improves lubrication performance, provided that the distance between pockets is 2–4 times larger than…

  7. Vapor Intrusion Estimation Tool for Unsaturated Zone Contaminant Sources. User’s Guide

    Science.gov (United States)

    2016-08-30

    estimation process when applying the tool. The tool described here is focused on vapor-phase diffusion from the current vadose zone source, and is not … from the current defined vadose zone source). The estimated soil gas contaminant concentration obtained from the pre-modeled scenarios for a building … need a full site-specific numerical model to assess the impacts beyond the current vadose zone source.

  8. Reduction of inequalities in health: assessing evidence-based tools

    Directory of Open Access Journals (Sweden)

    Shea Beverley

    2006-09-01

    Full Text Available Abstract Background The reduction of health inequalities is a focus of many national and international health organisations. The need for pragmatic evidence-based approaches has led to the development of a number of evidence-based equity initiatives. This paper describes a new program that focuses upon evidence-based tools, which are useful for policy initiatives that reduce inequities. Methods This paper is based on a presentation that was given at the "Regional Consultation on Policy Tools: Equity in Population Health Reports," held in Toronto, Canada in June 2002. Results Five assessment tools were presented. 1. A database of systematic reviews on the effects of educational, legal, social, and health interventions to reduce unfair inequalities is being established through the Cochrane and Campbell Collaborations. 2. Decision aids and shared decision making can be facilitated in disadvantaged groups by 'health coaches' who help people become better decision makers, negotiators, and navigators of the health system; a pilot study in Chile has provided proof of this concept. 3. The CIET Cycle: combining adapted cluster survey techniques with qualitative methods, CIET's population-based applications support evidence-based decision making at local and national levels. The CIET map generates maps directly from survey or routine institutional data, to be used as evidence-based decision aids. Complex data can be displayed attractively, providing an important tool for studying and comparing health indicators among and between different populations. 4. The Ottawa Equity Gauge is applying the Global Equity Gauge Alliance framework to an industrialised country setting. 5. The Needs-Based Health Assessment Toolkit, established to assemble information on which clinical and health policy decisions can be based, is being expanded to ensure a focus on distribution as well as average health indicators. Conclusion Evidence-based planning tools have much to offer the

  9. Multidimensional Rank Reduction Estimator for Parametric MIMO Channel Models

    Directory of Open Access Journals (Sweden)

    Marius Pesavento

    2004-08-01

    Full Text Available A novel algebraic method for the simultaneous estimation of MIMO channel parameters from channel sounder measurements is developed. We consider a parametric multipath propagation model with P discrete paths where each path is characterized by its complex path gain, its directions of arrival and departure, time delay, and Doppler shift. This problem is treated as a special case of the multidimensional harmonic retrieval problem. While the well-known ESPRIT-type algorithms exploit shift-invariance between specific partitions of the signal matrix, the rank reduction estimator (RARE algorithm exploits their internal Vandermonde structure. A multidimensional extension of the RARE algorithm is developed, analyzed, and applied to measurement data recorded with the RUSK vector channel sounder in the 2 GHz band.
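    The full multidimensional RARE estimator is too long for a snippet, but the underlying subspace idea, a Vandermonde steering vector whose projection onto the noise subspace collapses at the true parameters, can be illustrated in one dimension. The sketch below is the classic MUSIC pseudospectrum on simulated data, a simplified 1D analogue rather than the RARE algorithm itself:

```python
# 1D subspace illustration of the idea behind RARE-type estimators: frequencies
# are found where a Vandermonde steering vector is orthogonal to the noise
# subspace. This is standard MUSIC, not the multidimensional RARE algorithm.
import numpy as np

rng = np.random.default_rng(0)
M, N, true_freqs = 16, 200, [0.12, 0.31]   # sensors, snapshots, normalized freqs

def steering(f, M):
    return np.exp(2j * np.pi * f * np.arange(M))   # Vandermonde structure

# Simulate two harmonics in noise and form the sample covariance matrix.
A = np.column_stack([steering(f, M) for f in true_freqs])
S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
X = A @ S + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
R = X @ X.conj().T / N

# Noise subspace = eigenvectors of the M-2 smallest eigenvalues (eigh: ascending).
eigvals, eigvecs = np.linalg.eigh(R)
En = eigvecs[:, : M - 2]

# Pseudospectrum peaks where the steering vector projection onto En collapses.
grid = np.linspace(0.0, 0.5, 2000)
p = np.array([1.0 / np.real(steering(f, M).conj() @ En @ En.conj().T @ steering(f, M))
              for f in grid])
peaks = np.where((p[1:-1] > p[:-2]) & (p[1:-1] > p[2:]))[0] + 1
best = peaks[np.argsort(p[peaks])[-2:]]
print("Estimated frequencies:", sorted(np.round(grid[best], 3)))
```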

  10. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Science.gov (United States)

    Ralston, Mark E; Myatt, Mark A

    2016-01-01

    A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings, where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. The objective was to determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low-to-middle-income countries. This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design, in locations with a high prevalence of acute and chronic malnutrition. A total of 453,990 children met inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) and exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Weight was estimated using the Broselow Tape, the Hong Kong formula, and database models based on MUAC alone, height alone, and height and MUAC combined. Mean percentage difference between true and estimated weight, proportion of estimates accurate to within ±25% and ±10% of true weight, weighted Kappa statistic, and Bland-Altman bias were reported as measures of tool accuracy. Standard deviation of the mean percentage difference and Bland-Altman 95% limits of agreement were reported as measures of tool precision. Database height was a more accurate and precise predictor of weight compared to Broselow Tape 2007 [B], Broselow Tape 2011 [A], and MUAC. Mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); proportion of estimates accurate to within ±25% of true weight was 97.36% (95% CI 97.40%, 97.46%); and
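    The accuracy and precision metrics the record reports (mean percentage difference and the share of estimates within ±10% and ±25% of true weight) are easy to reproduce; a minimal sketch on hypothetical weights:

```python
# Accuracy/precision metrics used in the record, computed on hypothetical data:
# mean percentage difference (and its SD) and the share of estimates within
# +/-10% and +/-25% of true weight.
import numpy as np

true_wt = np.array([9.5, 12.0, 14.2, 10.8, 16.5])   # kg, hypothetical
est_wt = np.array([9.9, 11.4, 14.9, 10.1, 17.8])    # kg, from some tool

pct_diff = 100.0 * (est_wt - true_wt) / true_wt
print(f"Mean % difference: {pct_diff.mean():+.2f}% (SD = {pct_diff.std(ddof=1):.2f}%)")
for tol in (10, 25):
    share = np.mean(np.abs(pct_diff) <= tol)
    print(f"Within +/-{tol}%: {100 * share:.1f}%")
```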

  11. Revised estimates for ozone reduction by shuttle operation

    Science.gov (United States)

    Potter, A. E.

    1978-01-01

    Previous calculations by five different modeling groups of the effect of space shuttle operations on the ozone layer yielded an estimate of 0.2 percent ozone reduction for the Northern Hemisphere at 60 launches per year. Since these calculations were made, the accepted rate constant for the reaction between hydroperoxyl and nitric oxide to yield hydroxyl and nitrogen dioxide, HO2 + NO yields OH + NO2, was revised upward by more than an order of magnitude, with a resultant increase in the predicted ozone reduction for chlorofluoromethanes by a factor of approximately 2. New calculations of the shuttle effect were made with use of the new rate constant data, again by five different modeling groups. The new value of the shuttle effect on the ozone layer was found to be 0.25 percent. The increase resulting from the revised rate constant is considerably less for space shuttle operations than for chlorofluoromethane production, because the new rate constant also increases the calculated rate of downward transport of shuttle exhaust products out of the stratosphere.

  12. Estimating Longitudinal Risks and Benefits From Cardiovascular Preventive Therapies Among Medicare Patients: The Million Hearts Longitudinal ASCVD Risk Assessment Tool: A Special Report From the American Heart Association and American College of Cardiology.

    Science.gov (United States)

    Lloyd-Jones, Donald M; Huffman, Mark D; Karmali, Kunal N; Sanghavi, Darshak M; Wright, Janet S; Pelser, Colleen; Gulati, Martha; Masoudi, Frederick A; Goff, David C

    2017-03-28

    The Million Hearts Initiative has a goal of preventing 1 million heart attacks and strokes (the leading causes of mortality) through several public health and healthcare strategies by 2017. The American Heart Association and American College of Cardiology support the program. The Cardiovascular Risk Reduction Model was developed by Million Hearts and the Center for Medicare & Medicaid Services as a strategy to assess a value-based payment approach toward reduction in 10-year predicted risk of atherosclerotic cardiovascular disease (ASCVD) by implementing cardiovascular preventive strategies to manage the "ABCS" (aspirin therapy in appropriate patients, blood pressure control, cholesterol management, and smoking cessation). The purpose of this special report is to describe the development and intended use of the Million Hearts Longitudinal ASCVD Risk Assessment Tool. The Million Hearts Tool reinforces and builds on the "2013 ACC/AHA Guideline on the Assessment of Cardiovascular Risk" by allowing clinicians to estimate baseline and updated 10-year ASCVD risk estimates for primary prevention patients adhering to the appropriate ABCS over time, alone or in combination. The tool provides updated risk estimates based on evidence from high-quality systematic reviews and meta-analyses of the ABCS therapies. This novel approach to personalized estimation of benefits from risk-reducing therapies in primary prevention may help target therapies to those in whom they will provide the greatest benefit, and serves as the basis for a Center for Medicare & Medicaid Services program designed to evaluate the Million Hearts Cardiovascular Risk Reduction Model. Copyright © 2017 American Heart Association, Inc., and the American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
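    Conceptually, the tool updates a baseline 10-year risk by applying relative-risk factors for each ABCS therapy the patient adheres to. The sketch below illustrates that multiplication with hypothetical relative risks; the tool's actual values come from the systematic reviews and meta-analyses cited in the report:

```python
# Conceptual sketch of updating a baseline 10-year ASCVD risk for adherence to
# the "ABCS" therapies. The relative risks below are hypothetical placeholders,
# not the meta-analysis values used by the Million Hearts tool.
HYPOTHETICAL_RR = {
    "aspirin": 0.90,
    "blood_pressure_control": 0.80,
    "cholesterol_management": 0.75,
    "smoking_cessation": 0.60,
}

def updated_risk(baseline_risk, therapies):
    """Multiply baseline risk by the relative risk of each adhered-to therapy."""
    risk = baseline_risk
    for t in therapies:
        risk *= HYPOTHETICAL_RR[t]
    return risk

base = 0.15  # 15% baseline 10-year predicted ASCVD risk
new = updated_risk(base, ["blood_pressure_control", "cholesterol_management"])
print(f"Updated risk: {new:.1%} (absolute risk reduction {base - new:.1%})")
```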

  13. Using the soil and water assessment tool to estimate achievable water quality targets through implementation of beneficial management practices in an agricultural watershed.

    Science.gov (United States)

    Yang, Qi; Benoy, Glenn A; Chow, Thien Lien; Daigle, Jean-Louis; Bourque, Charles P-A; Meng, Fan-Rui

    2012-01-01

    Runoff from crop production in agricultural watersheds can cause widespread soil loss and degradation of surface water quality. Beneficial management practices (BMPs) for soil conservation are often implemented as remedial measures because BMPs can reduce soil erosion and improve water quality. However, the efficacy of BMPs may be unknown because it can be affected by many factors, such as farming practices, land use, soil type, topography, and climatic conditions. As such, it is difficult to estimate the impacts of BMPs on water quality through field experiments alone. In this research, the Soil and Water Assessment Tool was used to estimate achievable performance targets of water quality indicators (sediment and soluble P loadings) after implementation of combinations of selected BMPs in the Black Brook Watershed in northwestern New Brunswick, Canada. Four commonly used BMPs (flow diversion terraces [FDTs], fertilizer reductions, tillage methods, and crop rotations) were considered individually and in different combinations. At the watershed level, the best achievable sediment loading was 1.9 t ha⁻¹ yr⁻¹ (an 89% reduction compared with the default scenario), with a BMP combination of crop rotation, FDT, and no-till. The best achievable soluble P loading was 0.5 kg ha⁻¹ yr⁻¹ (a 62% reduction), with a BMP combination of crop rotation, FDT, and fertilizer reduction. Targets estimated through nonpoint-source water quality modeling can be used to evaluate BMP implementation initiatives and provide milestones for the rehabilitation of streams and rivers in agricultural regions. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
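    The reported percentages follow from comparing each BMP scenario's modeled loading against the default scenario; a minimal sketch, with the default loading back-calculated from the record's own figures:

```python
# Percent reduction of a BMP scenario relative to the default scenario, the
# calculation behind the record's reported figures. The default loading is
# back-calculated from the record's "1.9 t/ha/yr = 89% reduction".
default_loading = 1.9 / (1 - 0.89)   # ~17.3 t/ha/yr sediment, default scenario
scenario_loading = 1.9               # best BMP combination (rotation + FDT + no-till)
reduction = 100 * (default_loading - scenario_loading) / default_loading
print(f"Sediment reduction: {reduction:.0f}%")
```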

  14. Competitive kinetics as a tool to determine rate constants for reduction of ferrylmyoglobin by food components

    DEFF Research Database (Denmark)

    Jongberg, Sisse; Lund, Marianne Nissen; Pattison, David I.

    2016-01-01

    Competitive kinetics were applied as a tool to determine apparent rate constants for the reduction of hypervalent haem pigment ferrylmyoglobin (MbFe(IV)=O) by proteins and phenols in aqueous solution of pH 7.4 and I = 1.0 at 25 °C. Reduction of MbFe(IV)=O by a myofibrillar protein isolate (MPI) f...

  15. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Directory of Open Access Journals (Sweden)

    Christoph Heesen

    Full Text Available BACKGROUND: Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. OBJECTIVE: The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-term prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. RESULTS: Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. CONCLUSION: While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic

  16. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Science.gov (United States)

    Heesen, Christoph; Gaissmaier, Wolfgang; Nguyen, Franziska; Stellmann, Jan-Patrick; Kasper, Jürgen; Köpke, Sascha; Lederer, Christian; Neuhaus, Anneke; Daumer, Martin

    2013-01-01

    Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-term prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic estimates and clarify its usefulness for patients and physicians

  17. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    Science.gov (United States)

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main

  18. An adaptive observer for on-line tool wear estimation in turning, Part I: Theory

    Science.gov (United States)

    Danai, Kourosh; Ulsoy, A. Galip

    1987-04-01

    On-line sensing of tool wear has been a long-standing goal of the manufacturing engineering community. In the absence of any reliable on-line tool wear sensors, a new model-based approach for tool wear estimation has been proposed. This approach is an adaptive observer, based on force measurement, which uses both parameter and state estimation techniques. The design of the adaptive observer is based upon a dynamic state model of tool wear in turning. This paper (Part I) presents the model and explains its use as the basis for the adaptive observer design. The model uses flank wear and crater wear as state variables, feed as the input, and the cutting force as the output. The suitability of the model as the basis for adaptive observation is also verified. The implementation of the adaptive observer requires the design of a state observer and a parameter estimator. To obtain the model parameters for tuning the adaptive observer, procedures for linearisation of the non-linear model are specified. The implementation of the adaptive observer in turning and experimental results are presented in a companion paper (Part II).
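    A minimal sketch of the state-observer half of such a scheme, using a hypothetical linearised two-state wear model (states: flank and crater wear; input: feed; output: cutting force). The matrices are illustrative placeholders, not the paper's identified parameters, and a full adaptive observer would also estimate them online:

```python
# Minimal discrete-time state observer for a hypothetical linearised tool-wear
# model: x = [flank wear, crater wear], u = feed, y = cutting force. Matrices
# are illustrative; a full adaptive observer would update them by parameter
# estimation as well.
import numpy as np

A = np.array([[0.995, 0.002],      # slow wear-growth dynamics (hypothetical)
              [0.001, 0.997]])
B = np.array([[0.004], [0.002]])   # effect of feed on wear rates
C = np.array([[120.0, 80.0]])      # cutting force as a function of wear
L = np.array([[0.004], [0.003]])   # observer gain (stable error decay)

x_true = np.array([[0.05], [0.02]])   # true initial wear [mm]
x_hat = np.zeros((2, 1))              # observer starts with no wear estimate

for k in range(500):
    u = np.array([[0.2]])                       # constant feed [mm/rev]
    y = C @ x_true                              # measured cutting force
    # Observer update: predict, then correct with the output error.
    x_hat = A @ x_hat + B @ u + L @ (y - C @ x_hat)
    x_true = A @ x_true + B @ u                 # plant evolves (noise-free here)

print("True wear:", x_true.ravel(), " Estimated wear:", x_hat.ravel())
```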

  19. Reduction of radiation exposure and image quality using dose reduction tool on computed tomography fluoroscopy

    International Nuclear Information System (INIS)

    Sakabe, Daisuke; Tochihara, Syuichi; Ono, Michiaki; Tokuda, Masaki; Kai, Noriyuki; Nakato, Kengo; Hashida, Masahiro; Funama, Yoshinori; Murazaki, Hiroo

    2012-01-01

    The purpose of our study was to measure the reduction in radiation dose and the variability of image noise when using angular beam modulation (ABM) in computed tomography (CT) fluoroscopy. The Alderson-Rando phantom and a homemade phantom were used in our study. These phantoms were scanned at the on-center position and at an off-center position (-12 cm along the y-axis), with and without the ABM technique. With this technique, the x-ray tube is turned off over a 100-degree sector centered at the 12 o'clock, 10 o'clock, or 2 o'clock position during CT fluoroscopy. CT fluoroscopic images were obtained with a tube voltage of 120 kV; a tube current-time product per reconstructed image of 30 mAs; a rotation time of 0.5 s/rot; a slice thickness of 4.8 mm; and reconstruction kernel B30s in each scan. After CT scanning, radiation exposure and image noise were measured, and image artifacts were evaluated with and without the technique. The reduction in radiation exposure with the technique was 75-80% at the on-center position, regardless of angle position. At the off-center position at -12 cm, the reduction was 50%. In contrast, image noise remained constant with and without the technique. Visual scores for image artifacts were almost the same with and without the technique, with no statistically significant difference (p>0.05). ABM is an appropriate tool for reducing radiation exposure while maintaining image noise and artifacts during CT fluoroscopy. (author)

  20. SBML-PET-MPI: a parallel parameter estimation tool for Systems Biology Markup Language based models.

    Science.gov (United States)

    Zi, Zhike

    2011-04-01

    Parameter estimation is crucial for the modeling and dynamic analysis of biological systems. However, implementing parameter estimation is time consuming and computationally demanding. Here, we introduced a parallel parameter estimation tool for Systems Biology Markup Language (SBML)-based models (SBML-PET-MPI). SBML-PET-MPI allows the user to perform parameter estimation and parameter uncertainty analysis by collectively fitting multiple experimental datasets. The tool is developed and parallelized using the message passing interface (MPI) protocol, which provides good scalability with the number of processors. SBML-PET-MPI is freely available for non-commercial use at http://www.bioss.uni-freiburg.de/cms/sbml-pet-mpi.html or http://sites.google.com/site/sbmlpetmpi/.

  1. Emerging Tools to Estimate and to Predict Exposures to ...

    Science.gov (United States)

    The timely assessment of the human and ecological risk posed by thousands of existing and emerging commercial chemicals is a critical challenge facing EPA in its mission to protect public health and the environment. The US EPA has been conducting research to enhance methods used to estimate and forecast exposures for tens of thousands of chemicals. This research is aimed at both assessing risks and supporting life cycle analysis, by developing new models and tools for high-throughput exposure screening and prioritization, as well as databases that support these and other tools, especially regarding consumer products. The models and data address usage, and take advantage of quantitative structure-activity relationships (QSARs) for both inherent chemical properties and function (why the chemical is a product ingredient). To make them more useful and widely available, the new tools, data, and models are designed to be flexible; interoperable; modular (useful to more than one stand-alone application); and open (publicly available software). Presented at the Society for Risk Analysis Forum: Risk Governance for Key Enabling Technologies, Venice, Italy, March 1-3, 2017.

  2. Noise reduction and estimation in multiple micro-electro-mechanical inertial systems

    International Nuclear Information System (INIS)

    Waegli, Adrian; Skaloud, Jan; Guerrier, Stéphane; Parés, Maria Eulàlia; Colomina, Ismael

    2010-01-01

    This research studies the reduction and the estimation of the noise level within a redundant configuration of low-cost (MEMS-type) inertial measurement units (IMUs). Firstly, independent observations between units and sensors are assumed, and the theoretical decrease in the system noise level is analyzed in an experiment with four MEMS-IMU triads. Then, more complex scenarios are presented in which the noise level can vary in time and for each sensor. A statistical method employed for studying the volatility of financial markets (GARCH) is adapted and tested for use with inertial data. This paper demonstrates experimentally and through simulations the benefit of direct noise estimation in redundant IMU setups
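    For N independent sensors with identical white noise, averaging reduces the noise standard deviation by a factor of √N, which is the theoretical decrease analyzed for the four-IMU experiment. A quick Monte Carlo check of that law:

```python
# For N independent sensors with identical white noise, averaging reduces the
# noise standard deviation by sqrt(N) -- the theoretical decrease the record
# analyzes for a four-MEMS-IMU setup. Quick Monte Carlo check:
import numpy as np

rng = np.random.default_rng(1)
n_imus, n_samples, sigma = 4, 200_000, 0.05   # hypothetical 0.05 rad/s gyro noise

noise = sigma * rng.standard_normal((n_imus, n_samples))
single = noise[0]
averaged = noise.mean(axis=0)

print(f"single IMU std:    {single.std():.5f}")
print(f"4-IMU average std: {averaged.std():.5f} "
      f"(theory: {sigma / np.sqrt(n_imus):.5f})")
```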

  3. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    Science.gov (United States)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  4. Randomized Comparison of Mobile and Web-Tools to Provide Dementia Risk Reduction Education: Use, Engagement and Participant Satisfaction.

    Science.gov (United States)

    O'Connor, Elodie; Farrow, Maree; Hatherly, Chris

    2014-01-01

    Encouraging middle-aged adults to maintain their physical and cognitive health may have a significant impact on reducing the prevalence of dementia in the future. Mobile phone apps and interactive websites may be one effective way to target this age group. However, to date there has been little research investigating the user experience of dementia risk reduction tools delivered in this way. The aim of this study was to explore participant engagement and evaluations of three different targeted smartphone and Web-based dementia risk reduction tools following a four-week intervention. Participants completed a Web-based screening questionnaire to collect eligibility information. Eligible participants were asked to complete a Web-based baseline questionnaire and were then randomly assigned to use one of the three dementia risk reduction tools for a period of four weeks: (1) a mobile phone application; (2) an information-based website; and (3) an interactive website. User evaluations were obtained via a Web-based follow-up questionnaire after completion of the intervention. Of 415 eligible participants, 370 (89.16%) completed the baseline questionnaire and were assigned to an intervention group; 200 (54.05%) completed the post-intervention questionnaire. The average age of participants was 52 years, and 149 (75%) were female. Findings indicated that participants from all three intervention groups reported a generally positive impression of the tools across a range of domains. Participants using the information-based website reported higher ratings of their overall impression of the tool, F(2,191) = 4.12, P = .02; how interesting the information was, F(2,189) = 3.53, P = .03; how helpful the information was, F(2,192) = 4.15, P = .02; and how much they learned, F(2,188) = 3.86, P = .02. Group differences were significant between the mobile phone app and information-based website users, but not between the interactive website users and the other two groups. Additionally, participants using the

  5. Randomized Comparison of Mobile and Web-Tools to Provide Dementia Risk Reduction Education: Use, Engagement and Participant Satisfaction

    Science.gov (United States)

    O'Connor, Elodie; Hatherly, Chris

    2014-01-01

    Background Encouraging middle-aged adults to maintain their physical and cognitive health may have a significant impact on reducing the prevalence of dementia in the future. Mobile phone apps and interactive websites may be one effective way to target this age group. However, to date there has been little research investigating the user experience of dementia risk reduction tools delivered in this way. Objective The aim of this study was to explore participant engagement and evaluations of three different targeted smartphone and Web-based dementia risk reduction tools following a four-week intervention. Methods Participants completed a Web-based screening questionnaire to collect eligibility information. Eligible participants were asked to complete a Web-based baseline questionnaire and were then randomly assigned to use one of the three dementia risk reduction tools for a period of four weeks: (1) a mobile phone application; (2) an information-based website; and (3) an interactive website. User evaluations were obtained via a Web-based follow-up questionnaire after completion of the intervention. Results Of 415 eligible participants, 370 (89.16%) completed the baseline questionnaire and were assigned to an intervention group; 200 (54.05%) completed the post-intervention questionnaire. The average age of participants was 52 years, and 149 (75%) were female. Findings indicated that participants from all three intervention groups reported a generally positive impression of the tools across a range of domains. Participants using the information-based website reported higher ratings of their overall impression of the tool, F(2,191) = 4.12, P = .02; how interesting the information was, F(2,189) = 3.53, P = .03; how helpful the information was, F(2,192) = 4.15, P = .02; and how much they learned, F(2,188) = 3.86, P = .02. Group differences were significant between the mobile phone app and information-based website users, but not between the interactive website users and the other two groups

  6. Quantitative tools for addressing hospital readmissions

    Directory of Open Access Journals (Sweden)

    Lagoe Ronald J

    2012-11-01

    Full Text Available Abstract Background Increased interest in health care cost containment is focusing attention on the reduction of hospital readmissions. Major payors have already developed financial penalties for providers that generate excess readmissions. This subject has benefitted from the development of resources such as the Potentially Preventable Readmissions software, which has encouraged hospitals to renew efforts to improve these outcomes. The aim of this study was to describe quantitative tools, such as definitions, risk estimation, and tracking of patients, for reducing hospital readmissions. Findings This study employed the Potentially Preventable Readmissions software to develop quantitative tools for addressing hospital readmissions. These tools included two definitions of readmissions that support identification and management of patients. They also included analytical approaches for estimating the risk of readmission for individual patients by age, discharge status of the initial admission, and severity of illness, as well as patient-specific spreadsheets for tracking target populations and for evaluating the impact of interventions. Conclusions The study demonstrated that quantitative tools, including definitions of readmissions, estimation of the risk of readmission, and patient-specific spreadsheets, could contribute to the improvement of patient outcomes in hospitals.

  7. Consequent use of IT tools as a driver for cost reduction and quality improvements

    Science.gov (United States)

    Hein, Stefan; Rapp, Roberto; Feustel, Andreas

    2013-10-01

    The semiconductor industry drives many efforts in the fields of cost reduction and quality improvement. The consequent use of IT tools is one way to support these goals. With the extension of its 150 mm fab to 200 mm, Robert Bosch increased the systematic use of data analysis and Advanced Process Control (APC).

  8. Children's estimates of food portion size: the development and evaluation of three portion size assessment tools for use with children.

    Science.gov (United States)

    Foster, E; Matthews, J N S; Lloyd, J; Marshall, L; Mathers, J C; Nelson, M; Barton, K L; Wrieden, W L; Cornelissen, P; Harris, J; Adamson, A J

    2008-01-01

    A number of methods have been developed to assist subjects in providing an estimate of portion size, but their application in improving portion size estimation by children has not been investigated systematically. The aim was to develop portion size assessment tools for use with children and to assess the accuracy of children's estimates of portion size using the tools. The tools were food photographs, food models, and an interactive portion size assessment system (IPSAS). Children (n = 201), aged 4-16 years, were supplied with known quantities of food to eat in school. Food leftovers were weighed. Children estimated the amount of each food using each tool, 24 h after consuming the food. The age-specific portion sizes represented were based on portion sizes consumed by children in a national survey. Significant differences were found between the accuracy of estimates using the three tools. Children of all ages performed well using the IPSAS and food photographs. The accuracy and precision of estimates made using the food models were poor. For all tools, estimates of the amount of food served were more accurate than estimates of the amount consumed. Issues relating to the reporting of leftover food, which affect estimates of the amounts of food actually consumed, require further study. The IPSAS has shown potential for the assessment of dietary intake with children. Before practical application in assessing the dietary intake of children, the tool would need to be expanded to cover a wider range of foods and to be validated in a 'real-life' situation.

  9. Binaural noise reduction via cue-preserving MMSE filter and adaptive-blocking-based noise PSD estimation

    Science.gov (United States)

    Azarpour, Masoumeh; Enzner, Gerald

    2017-12-01

    Binaural noise reduction, with applications for instance in hearing aids, has been a very significant challenge. This task relates to the optimal utilization of the available microphone signals for the estimation of the ambient noise characteristics and for the optimal filtering algorithm to separate the desired speech from the noise. The additional requirements of low computational complexity and low latency further complicate the design. A particular challenge results from the desired reconstruction of binaural speech input with spatial cue preservation. The latter essentially diminishes the utility of multiple-input/single-output filter-and-sum techniques such as beamforming. In this paper, we propose a comprehensive and effective signal processing configuration with which most of the aforementioned criteria can be met suitably. This relates especially to the requirement of efficient online adaptive processing for noise estimation and optimal filtering while preserving the binaural cues. Regarding noise estimation, we consider three different architectures: interaural (ITF), cross-relation (CR), and principal-component (PCA) target blocking. An objective comparison with two other noise PSD estimation algorithms demonstrates the superiority of the blocking-based noise estimators, especially the CR-based and ITF-based blocking architectures. Moreover, we present a new noise reduction filter based on minimum mean-square error (MMSE), which belongs to the class of common gain filters, hence being rigorous in terms of spatial cue preservation but also efficient and competitive for the acoustic noise reduction task. A formal real-time subjective listening test procedure is also developed in this paper. The proposed listening test enables a real-time assessment of the proposed computationally efficient noise reduction algorithms in a realistic acoustic environment, e.g., considering time-varying room impulse responses and the Lombard effect. The listening test outcome
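    A common-gain filter applies one spectral gain identically to the left and right channels, which preserves interaural cues by construction. The sketch below is a minimal STFT-domain version on toy signals, with the noise PSD taken from a noise-only lead-in segment instead of the adaptive blocking-based estimators the paper proposes:

```python
# Minimal binaural common-gain noise reduction: one Wiener-style gain per
# time-frequency bin, applied identically to both channels so interaural
# level/time cues are preserved. The noise PSD comes from a noise-only lead-in
# segment here; the paper instead estimates it adaptively (ITF/CR/PCA blocking).
import numpy as np
from scipy.signal import stft, istft

fs = 16000
rng = np.random.default_rng(7)
t = np.arange(2 * fs) / fs
speech = np.sin(2 * np.pi * 440 * t) * (t > 0.5)          # toy "speech", silent first 0.5 s
left = speech + 0.3 * rng.standard_normal(t.size)
right = 0.8 * speech + 0.3 * rng.standard_normal(t.size)  # toy interaural level difference

f, tt, L = stft(left, fs, nperseg=512)
_, _, R = stft(right, fs, nperseg=512)

# Noise PSD from the noise-only first 0.5 s, averaged over both channels.
n_noise = np.sum(tt < 0.5)
noise_psd = 0.5 * (np.mean(np.abs(L[:, :n_noise])**2, axis=1)
                   + np.mean(np.abs(R[:, :n_noise])**2, axis=1))

# Wiener-style common gain from the binaural average periodogram, floored at 0.05.
mix_psd = 0.5 * (np.abs(L)**2 + np.abs(R)**2)
gain = np.clip(1.0 - noise_psd[:, None] / np.maximum(mix_psd, 1e-12), 0.05, 1.0)

_, left_hat = istft(gain * L, fs, nperseg=512)
_, right_hat = istft(gain * R, fs, nperseg=512)
print("Output RMS (L, R):", left_hat.std(), right_hat.std())
```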

  10. Estimating the Fiscal Effects of Public Pharmaceutical Expenditure Reduction in Greece.

    Science.gov (United States)

    Souliotis, Kyriakos; Papageorgiou, Manto; Politi, Anastasia; Frangos, Nikolaos; Tountas, Yiannis

    2015-01-01

    The purpose of the present study is to estimate the impact of pharmaceutical spending reduction on public revenue, based on data from the national health accounts as well as on reports of Greece's organizations. The methodology of the analysis is structured in two basic parts. The first part presents the urgency for rapid cutbacks on public pharmaceutical costs due to the financial crisis and provides a conceptual framework for the contribution of the Greek pharmaceutical branch to the country's economy. In the second part, we perform a quantitative analysis for the estimation of multiplier effects of public pharmaceutical expenditure reduction on main revenue sources, such as taxes and social contributions. We also fit projection models with multipliers as regressands for the evaluation of the efficiency of the particular fiscal measure in the short run. According to the results, nearly half of the gains from the measure's application is offset by financially equivalent decreases in the government's revenue, i.e., losses in tax revenues and social security contributions alone, not considering any other direct or indirect costs. The findings of multipliers' high value and increasing short-term trend imply the measure's inefficiency henceforward and signal the risk of vicious circles that will provoke the economy's deprivation of useful resources.

  11. Comparing the Advanced REACH Tool's (ART) Estimates With Switzerland's Occupational Exposure Data.

    Science.gov (United States)

    Savic, Nenad; Gasic, Bojan; Schinkel, Jody; Vernez, David

    2017-10-01

    The Advanced REACH Tool (ART) is the most sophisticated tool used for evaluating exposure levels under the European Union's Registration, Evaluation, Authorisation and restriction of CHemicals (REACH) regulations. ART provides estimates at different percentiles of exposure and within different confidence intervals (CIs). However, its performance has only been tested on a limited amount of exposure data. The present study compares ART's estimates with exposure measurements collected over many years in Switzerland. Measurements from 584 cases of exposure to vapours, mists, powders, and abrasive dusts (wood/stone and metal) were extracted from a Swiss database. The corresponding exposures at the 50th and 90th percentiles were calculated in ART. To characterize the model's performance, the 90% CI of the estimates was considered. ART's estimates at the 50th percentile were found to be insufficiently conservative only for exposure to wood/stone dusts, whereas the 90th percentile showed sufficient conservatism for all the types of exposure processed. However, a trend was observed in the residuals: ART overestimated lower exposures and underestimated higher ones. The median was more precise, however, and the majority (≥60%) of real-world measurements were within a factor of 10 of ART's estimates. We provide recommendations based on the results and suggest further, more comprehensive investigations. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  12. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    OpenAIRE

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  13. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    Directory of Open Access Journals (Sweden)

    Demeter Lisa

    2010-05-01

    Full Text Available Abstract Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments on viral fitness. Results Based on a mathematical model and several statistical methods (a least-squares approach and measurement error models), a Web-based computing tool has been developed for improving the estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced to make the computational tool more flexible in accommodating various experimental conditions. This Web-based tool is implemented in C# with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
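    The regression idea is that the slope of the log ratio of the two variants over time estimates their net growth-rate (fitness) difference. A minimal sketch on synthetic proportions, using ordinary least squares rather than the measurement-error models the tool also supports:

```python
# Relative fitness from a growth competition assay: the slope of
# log(variant1/variant2) versus time estimates the net fitness difference.
# Synthetic data and ordinary least squares -- an illustration of the
# regression idea, not the vFitness implementation.
import numpy as np

days = np.array([0, 2, 4, 6, 8])
# Hypothetical proportions of variant 1 at each sampling. Dilution at each
# passage scales total counts but not the ratio, which is why the tool's
# dilution factor fits naturally into a ratio-based regression.
p1 = np.array([0.50, 0.62, 0.72, 0.81, 0.88])

log_ratio = np.log(p1 / (1 - p1))
slope, intercept = np.polyfit(days, log_ratio, 1)
print(f"Net growth-rate difference of variant 1 vs variant 2: {slope:.3f} per day")
```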

  14. An Evaluation of Growth Models as Predictive Tools for Estimates at Completion (EAC)

    National Research Council Canada - National Science Library

    Trahan, Elizabeth N

    2009-01-01

    ...) as the Estimates at Completion (EAC). Our research evaluates the prospect of nonlinear growth modeling as an alternative to the current predictive tools used for calculating EAC, such as the Cost Performance Index (CPI...

  15. Estimating the fiscal effects of public pharmaceutical expenditure reduction in Greece

    Directory of Open Access Journals (Sweden)

    Kyriakos eSouliotis

    2015-08-01

    Full Text Available The purpose of the present study is to estimate the impact of pharmaceutical spending reduction on public revenue, based on data from the national health accounts as well as on reports of Greece's organizations. The methodology of the analysis is structured in two basic parts. The first part presents the urgency for rapid cutbacks on public pharmaceutical costs due to the financial crisis and provides a conceptual framework for the contribution of the Greek pharmaceutical branch to the country's economy. In the second part, we perform a quantitative analysis for the estimation of multiplier effects of public pharmaceutical expenditure reduction on main revenue sources such as taxes and social contributions. We also fit projection models with multipliers as regressands for the evaluation of the efficiency of the particular fiscal measure in the short run. According to the results, nearly half of the gains from the measure's application is offset by financially equivalent decreases in the government's revenue, i.e., losses in tax revenues and social security contributions alone, not considering any other direct or indirect costs. The findings of the multipliers' high value and increasing short-term trend imply the measure's inefficiency henceforward and signal the risk of vicious circles that will provoke the economy's deprivation of useful resources.

  16. Tool-specific performance of vibration-reducing gloves for attenuating fingers-transmitted vibration

    Science.gov (United States)

    Welcome, Daniel E.; Dong, Ren G.; Xu, Xueyan S.; Warren, Christopher; McDowell, Thomas W.

    2016-01-01

    BACKGROUND Fingers-transmitted vibration can cause vibration-induced white finger. The effectiveness of vibration-reducing (VR) gloves for reducing hand-transmitted vibration to the fingers has not been sufficiently examined. OBJECTIVE The objective of this study is to examine the tool-specific performance of VR gloves for reducing finger-transmitted vibrations in three orthogonal directions (3D) from powered hand tools. METHODS A transfer function method was used to estimate the tool-specific effectiveness of four typical VR gloves. The transfer functions of the VR glove fingers in three directions were either measured in this study or during a previous study using a 3D laser vibrometer. More than seventy vibration spectra of various tools or machines were used in the estimations. RESULTS When assessed based on frequency-weighted acceleration, the gloves provided little vibration reduction. In some cases, the gloves amplified the vibration by more than 10%, especially the neoprene glove. However, the neoprene glove did the best when the assessment was based on unweighted acceleration, reducing the unweighted vibration by 10% or more for 27 out of the 79 tools. If the dominant vibration of a tool handle or workpiece was in the shear direction relative to the fingers, as observed in the operation of needle scalers, hammer chisels, and bucking bars, the gloves did not reduce the vibration but increased it. CONCLUSIONS This study confirmed that the effectiveness for reducing vibration varied among the gloves, and that the vibration reduction of each glove depended on the tool, the vibration direction relative to the fingers, and the finger location. VR gloves, including certified anti-vibration gloves, do not provide much vibration reduction when judged based on frequency-weighted acceleration. However, some of the VR gloves can provide more than 10% reduction of the unweighted vibration for some tools or workpieces. Tools and gloves can be matched for
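    The transfer-function assessment amounts to multiplying a tool's band accelerations by the glove's transmissibility and comparing overall root-sum-of-squares values, weighted and unweighted. A sketch with hypothetical band data and a deliberately crude low-frequency weighting (not the ISO frequency weighting):

```python
# Tool-specific glove effectiveness via the transfer-function method: multiply
# the tool's band accelerations by the glove's transmissibility and compare
# overall root-sum-of-squares values. Band data and the simplified weighting
# are hypothetical, not ISO 5349 weights or the study's measured spectra.
import numpy as np

freqs = np.array([31.5, 63.0, 125.0, 250.0, 500.0, 1000.0])  # octave bands [Hz]
tool_acc = np.array([4.0, 6.0, 8.0, 7.0, 5.0, 3.0])          # m/s^2 per band
glove_T = np.array([1.05, 1.0, 0.9, 0.7, 0.5, 0.35])         # transmissibility
weight = np.clip(16.0 / freqs, None, 1.0)                    # crude low-pass weighting

def overall(acc):
    return np.sqrt(np.sum(acc**2))   # root-sum-of-squares over bands

for label, w in [("unweighted", np.ones_like(freqs)), ("weighted", weight)]:
    bare = overall(w * tool_acc)
    gloved = overall(w * glove_T * tool_acc)
    print(f"{label}: reduction = {100 * (1 - gloved / bare):.1f}%")
```

With these illustrative numbers, the unweighted reduction is sizable while the weighted reduction nearly vanishes, mirroring the record's finding that gloves look far less effective under frequency weighting.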

  17. Estimating the Condition of the Heat Resistant Lining in an Electrical Reduction Furnace

    Directory of Open Access Journals (Sweden)

    Jan G. Waalmann

    1988-01-01

    Full Text Available This paper presents a system for estimating the condition of the heat resistant lining in an electrical reduction furnace for ferrosilicon. The system uses temperatures measured with thermocouples placed on the outside of the furnace pot. These measurements are used, together with a mathematical model of the temperature distribution in the lining, in a recursive least squares algorithm to estimate the position of 'the transformation front'. The system is part of a monitoring system being developed in the AIP project 'Condition monitoring of strongly exposed process equipment in the ferroalloy industry'. The estimator runs on-line, and results are presented in colour graphics on a display unit. The goal is to locate the transformation front with an accuracy of ±5 cm.
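    A generic recursive least squares update of the kind the record relies on; here it fits a hypothetical linear-in-parameters temperature model from streaming measurements, while the physical front-position model is the paper's own:

```python
# Generic recursive least squares (RLS) update, the estimation engine the
# record uses to locate the lining's transformation front. Here it fits a
# hypothetical model y = phi' theta from streaming thermocouple readings.
import numpy as np

theta = np.zeros(2)            # parameter estimate
P = 1e3 * np.eye(2)            # estimate covariance (large = uninformative)
lam = 0.99                     # forgetting factor for slowly varying linings

def rls_update(theta, P, phi, y):
    """One RLS step with exponential forgetting."""
    denom = lam + phi @ P @ phi
    K = P @ phi / denom                     # gain vector
    theta = theta + K * (y - phi @ theta)   # correct with prediction error
    P = (P - np.outer(K, phi @ P)) / lam    # covariance update
    return theta, P

rng = np.random.default_rng(3)
true_theta = np.array([2.0, -0.5])
for k in range(200):
    phi = np.array([1.0, rng.uniform(0, 10)])   # regressor: [1, sensor position]
    y = phi @ true_theta + 0.05 * rng.standard_normal()
    theta, P = rls_update(theta, P, phi, y)

print("Estimated parameters:", np.round(theta, 3), " (true:", true_theta, ")")
```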

  18. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    Science.gov (United States)

    Sirirojvisuth, Apinut

    In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet, there is a lack of structured methodology that quantifies how changes in the design decisions impact these metrics. As a result, a new set of integrated cost analysis tools are proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into the existing design methodologies without sacrificing agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop software architecture that represents Integrated Product and Process Development (IPPD) methodology used in several aerospace systems designs. The environment seamlessly integrates product and process analysis tools and makes effective transition from one design phase to the other while retaining knowledge gained a priori. Then, an advanced cost estimating tool called Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high fidelity cost tools like process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. Therefore, with this
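    Weight-based cost estimating relationships are commonly power laws of the form cost = a·W^b calibrated to historical programs; a toy sketch under that assumption, with hypothetical coefficients that are not the TCM equations:

```python
# Toy weight-based cost estimating relationship (CER) of the power-law form
# commonly used in parametric lifecycle costing: cost = a * W**b per component.
# Coefficients are hypothetical and are NOT the Tailored Cost Model equations.
components = {          # component: (weight [lb], a, b)
    "wing":     (5200.0, 2400.0, 0.93),
    "fuselage": (7400.0, 1900.0, 0.95),
    "avionics": ( 900.0, 8800.0, 0.80),
}

total = 0.0
for name, (w, a, b) in components.items():
    cost = a * w**b          # component cost, in notional FY$ thousands
    total += cost
    print(f"{name:9s}: {cost:14,.0f}")
print(f"{'total':9s}: {total:14,.0f}")
```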

  19. Transit Boardings Estimation and Simulation Tool (TBEST) calibration for guideway and BRT modes.

    Science.gov (United States)

    2013-06-01

    This research initiative was motivated by a desire of the Florida Department of Transportation and the Transit Boardings Estimation and Simulation Tool (TBEST) project team to enhance the value of TBEST to the planning community by improving its …

  20. The ISO 50001 Impact Estimator Tool (IET 50001 V1.1.4) - User Guide and Introduction to the ISO 50001 Impacts Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Therkelsen, Peter L. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Rao, Prakash [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee T. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-08-01

    ISO 50001, Energy management systems – Requirements with guidance for use, is an internationally developed standard that provides organizations with a flexible framework for implementing an energy management system (EnMS) with the goal of continual energy performance improvement. The ISO 50001 standard was first published in 2011 and has since seen growth in the number of certificates issued around the world, primarily in the industrial (agriculture, manufacturing, and mining) and service (commercial) sectors. Policy makers in many regions and countries are looking to, or are already using, ISO 50001 as a basis for energy efficiency, carbon reduction, and other energy performance improvement schemes. The Impact Estimator Tool 50001 (IET 50001 Tool) is a computational model developed to assist researchers and policy makers in determining the potential impact of ISO 50001 implementation in the industrial and service (commercial) sectors for a given region or country. The IET 50001 Tool is based upon a methodology initially developed by the Lawrence Berkeley National Laboratory that has been improved upon and vetted by a group of international researchers. By using a commonly accepted and transparent methodology, users of the IET 50001 Tool can easily and clearly communicate the potential impact of ISO 50001 for a region or country.

  1. A "Carbon Reduction Challenge" as tool for undergraduate engagement on climate change

    Science.gov (United States)

    Cobb, K. M.; Toktay, B.

    2017-12-01

    The Carbon Reduction Challenge represents a solutions-oriented, hands-on, project-based learning tool that has achieved significant pedagogical benefits while delivering real-world carbon reductions and cost savings to community stakeholders.

  2. Establishing credible emission reduction estimates: GERT's experience

    International Nuclear Information System (INIS)

    Loseth, H.

    2001-01-01

    To address the challenge of reducing greenhouse gas emissions in Canada, the federal and provincial governments are developing strategies and policies to reach that goal. One of the proposed solutions is the establishment of an emission trading system, which it is believed would encourage investment in lower-cost reductions. The Greenhouse Gas Emission Reduction Trading (GERT) pilot was established in 1998 to examine emission trading. It represents the collaborative efforts of government, industry, and non-governmental organizations, and demonstrates that it is possible to establish emission reduction trading outside of a regulated environment. An emission reduction is defined as an action that reduces emissions compared to what they would have been otherwise. The functioning of GERT was described from the initial application by a buyer/seller through to the review process. The assessment of projects is based on mandatory criteria: reductions of emissions must be real, measurable, verifiable, and surplus. A section of the presentation was devoted to landfill gas recovery project issues, while another dealt with fuel substitution project issues; a further section discussed issues around emission reductions from off-site source electricity projects.

  3. Estimated reductions in hospitalizations and deaths from childhood diarrhea following implementation of rotavirus vaccination in Africa.

    Science.gov (United States)

    Shah, Minesh P; Tate, Jacqueline E; Mwenda, Jason M; Steele, A Duncan; Parashar, Umesh D

    2017-10-01

    Rotavirus is the leading cause of hospitalizations and deaths from diarrhea. Thirty-three African countries had introduced rotavirus vaccines by 2016. We estimate reductions in rotavirus hospitalizations and deaths for countries using rotavirus vaccination in national immunization programs and the potential of vaccine introduction across the continent. Areas covered: Regional rotavirus burden data were reviewed to calculate hospitalization rates, which were applied to the under-5 population to estimate baseline hospitalizations. Rotavirus mortality was based on 2013 WHO estimates. Regional pre-licensure vaccine efficacy and post-introduction vaccine effectiveness studies were used to estimate summary effectiveness, and vaccine coverage was applied to calculate prevented hospitalizations and deaths. Uncertainties around input parameters were propagated using bootstrapping simulations. In the 29 African countries that introduced rotavirus vaccination prior to the end of 2014, 134,714 (IQR 112,321-154,654) hospitalizations and 20,986 (IQR 18,924-22,822) deaths were prevented in 2016. If all African countries had introduced rotavirus vaccines at benchmark immunization coverage, 273,619 (47%) (IQR 227,260-318,102) hospitalizations and 47,741 (39%) (IQR 42,822-52,462) deaths would have been prevented. Expert commentary: Rotavirus vaccination has substantially reduced hospitalizations and deaths in Africa; further reductions are anticipated as additional countries implement vaccination. These estimates bolster wider introduction and continued support of rotavirus vaccination programs.
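    The core arithmetic is baseline burden × coverage × effectiveness, with uncertainty propagated by resampling the inputs, as the record describes. A minimal sketch with hypothetical placeholder inputs:

```python
# Prevented hospitalizations = baseline hospitalizations x vaccine coverage x
# effectiveness, with uncertainty propagated by resampling input parameters.
# All numbers are hypothetical placeholders, not the study's inputs.
import numpy as np

rng = np.random.default_rng(5)
n_sims = 10_000

baseline_hosp = rng.normal(290_000, 25_000, n_sims)   # annual rotavirus hospitalizations
coverage = rng.uniform(0.70, 0.85, n_sims)            # vaccination coverage
effectiveness = rng.uniform(0.45, 0.65, n_sims)       # regional effectiveness

prevented = baseline_hosp * coverage * effectiveness
q25, q50, q75 = np.percentile(prevented, [25, 50, 75])
print(f"Prevented hospitalizations: {q50:,.0f} (IQR {q25:,.0f}-{q75:,.0f})")
```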

  4. RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

    Energy Technology Data Exchange (ETDEWEB)

    William W. Weiss

    2000-06-30

    Incomplete or sparse information on geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. Expert systems have been developed and used in several disciplines and industries, including medical diagnostics, with favorable results. A state-of-the-art exploration "expert" tool, relying on a computerized database and computer maps generated by neural networks, is proposed through the use of "fuzzy" logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. This project will develop an artificial intelligence system that will draw upon a wide variety of information to provide realistic estimates of risk. Fuzzy logic, a system for integrating large amounts of inexact, incomplete information with modern computational methods to derive usable conclusions, has been demonstrated as a cost-effective computational technology in many industrial applications. During project year 1, 90% of the geologic, geophysical, production and price data were assimilated for installation into the database. Logs provided geologic data consisting of formation tops of the Brushy Canyon, Lower Brushy Canyon, and Bone Springs zones of 700 wells used to construct regional cross sections. Regional structure and isopach maps were constructed using kriging to interpolate between the measured points. One of the structure derivative maps (azimuth of curvature) visually correlates with Brushy Canyon fields on the maximum-change contours. Derivatives of the regional geophysical data also visually correlate with the locations of the fields: the azimuth of maximum dip approximately locates fields on the maximum-change contours and, in a similar manner, the second derivative in the x-direction of the gravity map visually correlates with the alignment of the known fields. The visual correlations strongly suggest that neural network architectures will be

  5. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Directory of Open Access Journals (Sweden)

    S. A. Archfield

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  6. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Science.gov (United States)

    Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G.

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  7. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    This paper discusses topics related to automating parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, the system should be analyzed for structural observability before implementing any state estimator, and the structural observability analysis can be automated using Modelica and Python. As a result of structural observability analysis, the system may be decomposed into subsystems, some of which may be observable with respect to parameters, disturbances, and states, while others may not. The state estimation process is carried out for the observable subsystems, and the optimum number of additional measurements is prescribed for unobservable subsystems to make them observable. In this paper, an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway. The copper production process is a large-scale complex system. It is shown how to implement various state estimators, in Python, to estimate parameters and disturbances, in addition to states, based on available measurements.
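
    As a rough illustration of the structural observability analysis the abstract describes, the sketch below tests the output-connectivity condition (a necessary, not sufficient, condition for structural observability) on a hypothetical influence graph using Python's networkx package; the system, its edges and the measured state are all invented for the example.

      import networkx as nx

      # Influence graph of a hypothetical 5-state system: an edge u -> v means
      # that state u appears in the right-hand side of state v's ODE.
      G = nx.DiGraph()
      G.add_edges_from([("x1", "x2"), ("x2", "x3"), ("x3", "x1"), ("x4", "x5")])

      measured = {"x3"}           # states directly measured by sensors (assumed)
      states = set(G.nodes)

      # Necessary condition for structural observability: every state must have
      # a directed path to at least one measured state (output connectivity).
      unobservable = sorted(s for s in states
                            if not any(nx.has_path(G, s, m) for m in measured))
      print("States violating output connectivity:", unobservable)  # ['x4', 'x5']

    States flagged here would belong to an unobservable subsystem, for which additional measurements must be prescribed, as in the paper's workflow.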

  8. Missing texture reconstruction method based on error reduction algorithm using Fourier transform magnitude estimation scheme.

    Science.gov (United States)

    Ogawa, Takahiro; Haseyama, Miki

    2013-03-01

    A missing texture reconstruction method based on an error reduction (ER) algorithm, including a novel scheme for estimating Fourier transform magnitudes, is presented in this brief. In the method, the Fourier transform magnitude is estimated for a target patch that includes missing areas, and the missing intensities are recovered by retrieving the patch's phase with the ER algorithm. Specifically, by monitoring errors converged in the ER algorithm, known patches whose Fourier transform magnitudes are similar to that of the target patch are selected from the target image. The Fourier transform magnitude of the target patch is then estimated from those of the selected known patches and their corresponding errors. Consequently, by using the ER algorithm, both the Fourier transform magnitudes and phases can be estimated to reconstruct the missing areas.
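
    The ER algorithm alternates between a Fourier-magnitude constraint and a spatial-domain constraint. The sketch below is a minimal Gerchberg-Saxton-style ER loop in Python; for simplicity it uses the true patch magnitude rather than the paper's similar-patch estimation scheme, and the patch, mask and iteration count are illustrative assumptions.

      import numpy as np

      def error_reduction(magnitude, known_mask, known_values, n_iter=200, seed=0):
          """ER loop: alternate the Fourier-magnitude constraint with the
          known-pixel (spatial-domain) constraint."""
          img = np.random.default_rng(seed).random(magnitude.shape)
          img[known_mask] = known_values[known_mask]
          for _ in range(n_iter):
              spectrum = np.fft.fft2(img)
              # keep the current phase, impose the estimated magnitude
              spectrum = magnitude * np.exp(1j * np.angle(spectrum))
              img = np.real(np.fft.ifft2(spectrum))
              # re-impose the known (non-missing) intensities
              img[known_mask] = known_values[known_mask]
          return img

      # toy usage: hide a block of a random "texture" patch and reconstruct it
      patch = np.random.default_rng(1).random((32, 32))
      mask = np.ones_like(patch, dtype=bool)
      mask[10:20, 10:20] = False                # missing region
      target_mag = np.abs(np.fft.fft2(patch))   # magnitude assumed already estimated
      recon = error_reduction(target_mag, mask, patch)
      print("RMSE in missing area:",
            np.sqrt(np.mean((recon[~mask] - patch[~mask]) ** 2)))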

  9. Estimate of the benefits of a population-based reduction in dietary sodium additives on hypertension and its related health care costs in Canada.

    Science.gov (United States)

    Joffres, Michel R; Campbell, Norm R C; Manns, Braden; Tu, Karen

    2007-05-01

    Hypertension is the leading risk factor for mortality worldwide. One-quarter of the adult Canadian population has hypertension, and more than 90% of the population is estimated to develop hypertension if they live an average lifespan. Reductions in dietary sodium additives significantly lower systolic and diastolic blood pressure, and population reductions in dietary sodium are recommended by major scientific and public health organizations. The objective was to estimate the reduction in hypertension prevalence and the specific hypertension management cost savings associated with a population-wide reduction in dietary sodium additives. Based on data from clinical trials, reducing dietary sodium additives by 1840 mg/day would result in decreases of 5.06 mmHg in systolic and 2.7 mmHg in diastolic blood pressure. Using Canadian Heart Health Survey data, the resulting reduction in hypertension was estimated. Costs of laboratory testing and physician visits were based on 2001 to 2003 Ontario Health Insurance Plan data, and the number of physician visits and costs of medications for patients with hypertension were taken from 2003 IMS Canada. To estimate the reduction in total physician visits and laboratory costs, current estimates of aware hypertensive patients in Canada were used from the Canadian Community Health Survey. Reducing dietary sodium additives may decrease hypertension prevalence by 30%, resulting in one million fewer hypertensive patients in Canada, and almost double the treatment and control rate. Direct cost savings related to fewer physician visits, laboratory tests and lower medication use are estimated to be approximately $430 million per year. Physician visits and laboratory costs would decrease by 6.5%, and 23% fewer treated hypertensive patients would require medications for control of blood pressure. Based on these estimates, lowering dietary sodium additives would lead to a large reduction in hypertension prevalence and result in health care cost savings in Canada.

  10. Exploring the effects of dimensionality reduction in deep networks for force estimation in robotic-assisted surgery

    Science.gov (United States)

    Aviles, Angelica I.; Alsaleh, Samar; Sobrevilla, Pilar; Casals, Alicia

    2016-03-01

    The Robotic-Assisted Surgery approach overcomes the limitations of traditional laparoscopic and open surgeries. However, one of its major limitations is the lack of force feedback. Since there is no direct interaction between the surgeon and the tissue, there is no way of knowing how much force the surgeon is applying, which can result in irreversible injuries. The use of force sensors is not practical since they impose different constraints. Thus, we make use of a neuro-visual approach to estimate the applied forces, in which 3D shape recovery together with the geometry of motion are used as input to a deep network based on an LSTM-RNN architecture. When deep networks are used in real time, pre-processing of data is a key factor in reducing complexity and improving network performance. A common pre-processing step is dimensionality reduction, which attempts to eliminate redundant and insignificant information by selecting a subset of relevant features to use in model construction. In this work, we show the effects of dimensionality reduction in a real-time application: estimating the applied force in robotic-assisted surgeries. According to the results, we demonstrated positive effects of dimensionality reduction on deep networks, including faster training, improved network performance, and overfitting prevention. We also show a significant accuracy improvement, ranging from about 33% to 86%, over existing approaches to force estimation.
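
    A minimal sketch of the pre-processing step discussed above, using PCA from scikit-learn as the dimensionality reduction technique (the abstract does not commit to PCA specifically, and the feature matrix here is synthetic):

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      # hypothetical stand-in for the visual features: 500 frames x 120 features
      # (3D shape + motion descriptors), built with deliberate redundancy
      latent = rng.normal(size=(500, 10))
      features = latent @ rng.normal(size=(10, 120)) + 0.01 * rng.normal(size=(500, 120))

      # keep enough components to explain 99% of the variance
      pca = PCA(n_components=0.99)
      reduced = pca.fit_transform(features)
      print(features.shape, "->", reduced.shape)   # e.g. (500, 120) -> (500, 10)

    The reduced feature matrix would then be fed to the LSTM-RNN; retaining a fixed fraction of explained variance is one common way to choose the number of components.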

  11. Experimental evaluation of tool run-out in micro milling

    Science.gov (United States)

    Attanasio, Aldo; Ceretti, Elisabetta

    2018-05-01

    This paper deals with the micro milling cutting process, focusing on tool run-out measurement. Among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting-edge phase measurement is based on force signal analysis. The developed procedure was tested on data from micro milling experiments performed on a Ti6Al4V sample. The results showed that the procedure can be successfully used for tool run-out estimation.
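
    As a hedged illustration of how run-out parameters might be backed out of the three measured quantities, the sketch below applies first-order geometric relations; these exact relations are an assumption for illustration, not the paper's analytical model, and all values are invented.

      # measured quantities (hypothetical values, in mm and degrees)
      tool_diameter = 0.200    # nominal tool diameter from pre-process measurement
      channel_width = 0.212    # measured width of the milled slot
      phase_angle   = 168.0    # measured phase angle between the two cutting edges

      # first-order estimates (assumed relations, not the paper's exact model):
      # - run-out widens the slot beyond the nominal tool diameter
      runout_offset = (channel_width - tool_diameter) / 2.0
      # - for a two-flute tool, deviation of the edge phase from 180 degrees
      #   indicates the angular location of the run-out axis
      runout_angle = (180.0 - phase_angle) / 2.0

      print(f"run-out offset ~ {runout_offset * 1000:.1f} um, "
            f"axis location ~ {runout_angle:.1f} deg")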

  12. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    Science.gov (United States)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation at the system level, particularly for early mission phases during which specifications and requirements are not yet crystallised and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step in any program budget is a representative cost estimate, which usually hinges on a particular estimation approach or methodology. Appropriate selection of specific cost models, methods and tools is therefore paramount, a difficult task given the highly variable nature, scope, and scientific and technical requirements applicable to each program. Numerous methods, models and tools exist. However, new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. For highly specific vehicles such as reusable launchers with a manned capability, the lack of historical data implies that both the classic heuristic approach, such as parametric cost estimation based on underlying CERs, and the analogy approach are, by definition, limited. This review identifies prominent cost estimation models applied to the space sector, together with their underlying cost-driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. Current approaches which strategically amalgamate various cost estimation strategies both for formulation and validation

  13. PREMIM and EMIM: tools for estimation of maternal, imprinting and interaction effects using multinomial modelling

    Directory of Open Access Journals (Sweden)

    Howey Richard

    2012-06-01

    Background: Here we present two new computer tools, PREMIM and EMIM, for the estimation of parental and child genetic effects, based on genotype data from a variety of different child-parent configurations. PREMIM allows the extraction of child-parent genotype data from standard-format pedigree data files, while EMIM uses the extracted genotype data to perform subsequent statistical analysis. The use of genotype data from the parents as well as from the child in question allows the estimation of complex genetic effects such as maternal genotype effects, maternal-foetal interactions and parent-of-origin (imprinting) effects. These effects are estimated by EMIM, incorporating chosen assumptions such as Hardy-Weinberg equilibrium or exchangeability of parental matings as required. Results: In application to simulated data, we show that the inference provided by EMIM is essentially equivalent to that provided by alternative (competing) software packages such as MENDEL and LEM. However, PREMIM and EMIM (used in combination) considerably outperform MENDEL and LEM in terms of speed and ease of execution. Conclusions: Together, EMIM and PREMIM provide easy-to-use command-line tools for the analysis of pedigree data, giving unbiased estimates of parental and child genotype relative risks.

  14. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    International Nuclear Information System (INIS)

    Boe, Timothy; Lemieux, Paul; Schultheisz, Daniel; Peake, Tom; Hayes, Colin

    2013-01-01

    Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri ArcGIS scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shape-files defining contaminated areas and extent of contamination. Building stock information, including square footage, building counts, and building composition estimates are then generated using the Federal Emergency Management Agency's (FEMA's) Hazus-MH software. WEST then identifies outdoor surfaces based on the application of pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)

  15. Carbon Footprint Estimation Tool for Residential Buildings for Non-Specialized Users: OERCO2 Project

    Directory of Open Access Journals (Sweden)

    Jaime Solís-Guzmán

    2018-04-01

    Existing tools for environmental certification of buildings are failing in their ability to reach the general public and to create social awareness, since they require not only specialized knowledge regarding construction and energy sources, but also environmental knowledge. In this paper, an open-source online tool for the estimation of the carbon footprint of residential buildings by non-specialized users is presented as a product from the OERCO2 Erasmus+ project. The internal calculations, data management and operation of this tool are extensively explained. The ten most common building typologies built in the last decade in Spain are analysed by using the OERCO2 tool, and the order of magnitude of the results is analysed by comparing them to the ranges determined by other authors. The OERCO2 tool proves itself to be reliable, with its results falling within the defined logical value ranges. Moreover, the major simplification of the interface allows non-specialized users to evaluate the sustainability of buildings. Further research is oriented towards its inclusion in other environmental certification tools and in Building Information Modeling (BIM) environments.

  16. Environmental isotope balance of Lake Kinneret as a tool in evaporation rate estimation

    International Nuclear Information System (INIS)

    Lewis, S.

    1979-01-01

    The balance of environmental isotopes in Lake Kinneret has been used to obtain an independent estimate of the mean monthly evaporation rate. Direct calculation was precluded by the inadequacy of the isotope data in uniquely representing the system behaviour throughout the annual cycle. The approach adopted uses an automatic algorithm to seek an objective best fit of the isotope balance model to measured oxygen-18 data by optimizing the evaporation rate as a parameter. To this end, evaporation is described as a periodic function with two parameters. The sensitivity of the evaporation rate estimates to parameter uncertainty and data errors is stressed. Error analysis puts confidence limits on the estimates obtained. Projected improvements in data collection and analysis show that a significant reduction in uncertainty can be realized. Relative to energy balance estimates, currently obtainable data result in about 30% uncertainty. The most optimistic scenario would yield about 15% relative uncertainty. (author)
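
    A toy version of the fitting idea, in Python with scipy: a two-parameter periodic evaporation function is optimized so that a deliberately simplistic, placeholder forward model of lake-water oxygen-18 matches the measurements. The forward model, the enrichment factor and all numbers are assumptions for illustration; the real isotope balance model is far more detailed.

      import numpy as np
      from scipy.optimize import least_squares

      def evaporation(t, e_mean, e_amp):
          # two-parameter periodic evaporation rate (mm/month), 12-month cycle,
          # with the phase fixed (peak assumed in mid-summer)
          return e_mean + e_amp * np.cos(2.0 * np.pi * (t - 7.0) / 12.0)

      def modelled_d18o(params, t):
          # placeholder forward model: isotopic enrichment of the lake grows
          # with cumulative evaporation (hypothetical enrichment factor)
          return -2.0 + 0.002 * np.cumsum(evaporation(t, *params))

      t = np.arange(1, 25)          # two annual cycles, monthly time steps
      rng = np.random.default_rng(0)
      measured = modelled_d18o([150.0, 60.0], t) + rng.normal(0.0, 0.05, t.size)

      fit = least_squares(lambda p: modelled_d18o(p, t) - measured, x0=[100.0, 30.0])
      print("estimated (mean, amplitude):", fit.x)   # close to the true (150, 60)

    Repeating the fit with perturbed data, as in the paper's error analysis, would put confidence limits on the recovered evaporation parameters.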

  17. Development and application of a decision support tool for reduction of product losses in the food-processing industry

    NARCIS (Netherlands)

    Akkerman, Renzo; van Donk, Dirk Pieter

    2008-01-01

    In food-processing industries, reduction of product losses is important for improving profitability and sustainability. This paper presents a decision support tool for analyzing the effects of planning decisions on the amount of product losses in the food-processing industry. We created a research

  18. An estimate of the cost of burnout on early retirement and reduction in clinical hours of practicing physicians in Canada

    Science.gov (United States)

    2014-01-01

    Background: Interest in the impact of burnout on physicians has been growing because of the possible burden this may have on health care systems. The objective of this study is to estimate the cost of burnout on early retirement and reduction in clinical hours of practicing physicians in Canada. Methods: Using an economic model, the costs related to early retirement and reduction in clinical hours of physicians were compared for those who were experiencing burnout against a scenario in which they did not experience burnout. The January 2012 Canadian Medical Association Masterfile was used to determine the number of practicing physicians. Transition probabilities were estimated using 2007–2008 Canadian Physician Health Survey and 2007 National Physician Survey data. Adjustments were also applied to outcome estimates based on the ratio of actual to planned retirement and reduction in clinical hours. Results: The total cost of burnout for all physicians practicing in Canada is estimated to be $213.1 million ($185.2 million due to early retirement and $27.9 million due to reduced clinical hours). Family physicians accounted for 58.8% of the burnout costs, followed by surgeons at 24.6% and other specialists at 16.6%. Conclusion: The cost of burnout associated with early retirement and reduction in clinical hours is substantial, and a significant proportion of practicing physicians experience symptoms of burnout. As health systems struggle with human resource shortages and expanding waiting times, this estimate sheds light on the extent to which the burden could be decreased through prevention and promotion activities to address burnout among physicians. PMID:24927847

  19. Qualitative: Python Tool for MT Quality Estimation Supporting Server Mode and Hybrid MT

    Directory of Open Access Journals (Sweden)

    Avramidis Eleftherios

    2016-10-01

    We present the development contributions of the last two years to our Python open-source Quality Estimation tool, a tool that can function in both experiment mode and online web-service mode. The latest version provides a new MT interface, which communicates with SMT and rule-based translation engines and supports on-the-fly sentence selection. Additionally, we present an improved machine learning interface allowing more efficient communication with several state-of-the-art toolkits. Additions also include a more informative training process, a Python re-implementation of the QuEst baseline features, a new LM toolkit integration, an additional PCFG parser and alignments of syntactic nodes.

  20. Estimating rare events in biochemical systems using conditional sampling

    Science.gov (United States)

    Sundar, V. S.

    2017-01-01

    The paper focuses on the development of variance reduction strategies to estimate rare-event probabilities in biochemical systems. Obtaining such probabilities using brute-force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
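
    A compact sketch of subset simulation in Python, for a standard-normal input vector and a simple response function standing in for a Gillespie simulation output. The threshold, sample sizes and proposal width are illustrative choices; the component-wise accept/reject step is the modified Metropolis move mentioned in the abstract.

      import numpy as np

      rng = np.random.default_rng(0)

      def g(x):
          # stand-in "response" whose high values define the rare event; in the
          # paper's setting this would come from stochastic simulation runs
          return x.sum(axis=-1)

      def subset_simulation(dim=10, b=12.0, n=2000, p0=0.1, sigma=1.0):
          """Estimate P[g(X) > b], X ~ N(0, I), as a product of conditional
          probabilities, using component-wise (modified) Metropolis moves."""
          x = rng.normal(size=(n, dim))
          prob = 1.0
          for _ in range(50):                       # at most 50 levels
              y = g(x)
              order = np.argsort(y)[::-1]
              n_seed = int(p0 * n)
              level = y[order[n_seed - 1]]          # intermediate threshold
              if level >= b:
                  return prob * np.mean(y > b)
              prob *= p0
              cur = x[order[:n_seed]].copy()        # seeds in the next level
              chains = [cur.copy()]
              for _ in range(n // n_seed - 1):
                  cand = cur + rng.uniform(-sigma, sigma, cur.shape)
                  # component-wise accept/reject against the N(0,1) density
                  ratio = np.exp(0.5 * (cur**2 - cand**2))
                  reject = rng.random(cur.shape) > ratio
                  cand[reject] = cur[reject]
                  # accept the vector only if it stays above the level
                  ok = g(cand) > level
                  cur = np.where(ok[:, None], cand, cur)
                  chains.append(cur.copy())
              x = np.vstack(chains)
          return prob

      est = subset_simulation()
      # exact value: P[sum of 10 iid N(0,1) > 12] = 1 - Phi(12/sqrt(10)) ~ 7e-5
      print(f"subset-simulation estimate: {est:.2e}")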

  1. Modeling of the effect of tool wear per discharge estimation error on the depth of machined cavities in micro-EDM milling

    DEFF Research Database (Denmark)

    Puthumana, Govindan; Bissacco, Giuliano; Hansen, Hans Nørgaard

    2017-01-01

    In micro-EDM milling, real-time electrode wear compensation based on tool wear per discharge (TWD) estimation permits direct control of the position of the tool electrode's frontal surface. However, TWD estimation errors cause errors in the tool electrode's axial depth. A simulation tool is developed to determine the effects of errors in the initial estimation of TWD and their propagation with respect to the error on the depth of the cavity generated. Simulations were applied to micro-EDM milling of a slot of 5000 μm length and 50 μm depth and validated through slot milling experiments performed on a micro-EDM machine. Simulations and experimental results were found to be in good agreement, showing the effect of error amplification through the cavity depth.
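
    The error propagation described here can be caricatured in a few lines: if the compensation feeds the electrode down by the estimated TWD per discharge while the frontal surface actually wears by the true TWD, the mismatch accumulates linearly with the number of discharges. All numbers below are invented for illustration; the paper's simulation tool models the process in far more detail.

      # illustrative values (assumptions, not the paper's data)
      target_depth = 50.0        # um, nominal cavity depth
      true_twd     = 1.0e-4      # um of axial electrode wear per discharge
      n_discharges = 1_000_000   # discharges needed to machine the slot

      for rel_err in (-0.10, -0.05, 0.0, 0.05, 0.10):
          est_twd = true_twd * (1.0 + rel_err)
          # compensation feeds the electrode by est_twd per discharge while
          # the frontal surface recedes by true_twd: the mismatch accumulates
          depth_error = n_discharges * (est_twd - true_twd)
          print(f"TWD estimation error {rel_err:+.0%} -> "
                f"cavity depth error {depth_error:+.1f} um "
                f"({depth_error / target_depth:+.0%} of target)")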

  2. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    Energy Technology Data Exchange (ETDEWEB)

    Boe, Timothy [Oak Ridge Institute for Science and Education, Research Triangle Park, NC 27711 (United States); Lemieux, Paul [U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Schultheisz, Daniel; Peake, Tom [U.S. Environmental Protection Agency, Washington, DC 20460 (United States); Hayes, Colin [Eastern Research Group, Inc, Morrisville, NC 26560 (United States)

    2013-07-01

    Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri ArcGIS scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shape-files defining contaminated areas and extent of contamination. Building stock information, including square footage, building counts, and building composition estimates are then generated using the Federal Emergency Management Agency's (FEMA's) Hazus-MH software. WEST then identifies outdoor surfaces based on the application of pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)

  3. Design Tool for Estimating Chemical Hydrogen Storage System Characteristics for Light-Duty Fuel Cell Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, Kriston P.; Sprik, Sam; Tamburello, David; Thornton, Matthew

    2018-05-03

    The U.S. Department of Energy (DOE) has developed a vehicle framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to DOE’s Technical Targets using four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework model for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be easily estimated. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates the systems parameters required to run the storage system model. Additionally, this design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the framework model and compare it to the DOE Technical Targets. These models will be explained and exercised with existing hydrogen storage materials.
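
    A back-of-the-envelope sketch of what such a sizing module computes: given assumed material properties, it converts a usable-hydrogen requirement into material mass, system mass and system volume, and compares the result with DOE-style targets. Every value below, including the capacity targets, is an assumption for illustration rather than the tool's actual data or equations.

      # minimal sizing sketch under stated assumptions; the actual design tool
      # couples kinetics and thermodynamics and is far more detailed
      usable_h2      = 5.6     # kg of usable H2 on board (assumed requirement)
      material_wt    = 0.13    # kg H2 released per kg NH3BH3 (~2 H2 eq., assumed)
      slurry_factor  = 1.4     # slurry/solvent mass multiplier (assumed)
      bop_mass       = 45.0    # balance-of-plant mass: tank, pumps, reactor (kg)
      slurry_density = 900.0   # kg/m3 (assumed)

      material_mass = usable_h2 / material_wt                 # kg of NH3BH3
      system_mass   = material_mass * slurry_factor + bop_mass
      system_volume = material_mass * slurry_factor / slurry_density * 1000  # L

      print(f"system: {system_mass:.0f} kg, {system_volume:.0f} L")
      print(f"gravimetric {usable_h2 / system_mass:.3f} kg H2/kg system "
            f"vs assumed DOE-style target 0.045")
      print(f"volumetric  {usable_h2 / system_volume:.3f} kg H2/L system "
            f"vs assumed DOE-style target 0.030")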

  4. Design Tool for Estimating Chemical Hydrogen Storage System Characteristics for Light-Duty Fuel Cell Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Matthew J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sprik, Samuel [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Brooks, Kriston P. [Pacific Northwest National Laboratory; Tamburello, David A. [Savannah River National Laboratory

    2018-04-07

    The U.S. Department of Energy (DOE) developed a vehicle Framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to Technical Targets established by DOE for four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be estimated easily. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates system parameters required to run the storage system model. Additionally, the design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the Framework model. These models will be explained and exercised with the representative hydrogen storage materials exothermic ammonia borane (NH3BH3) and endothermic alane (AlH3).

  5. Children with developmental coordination disorder demonstrate a spatial mismatch when estimating coincident-timing ability with tools.

    Science.gov (United States)

    Caçola, Priscila; Ibana, Melvin; Ricard, Mark; Gabbard, Carl

    2016-01-01

    Coincident timing, or interception ability, can be defined as the capacity to precisely time sensory input and motor output. This study compared the accuracy of typically developing (TD) children and those with Developmental Coordination Disorder (DCD) on a task involving estimation of coincident timing with their arm and various tool lengths. Forty-eight participants performed two experiments in which they imagined intercepting a target moving toward them (Experiment 1) or away from them (Experiment 2) in five conditions: arm only and tools of 10, 20, 30, and 40 cm. In Experiment 1, the DCD group overestimated interception points approximately twice as much as the TD group, and both groups overestimated consistently regardless of the tool used. Results for Experiment 2 revealed that those with DCD underestimated about three times as much as the TD group, except when no tool was used. Overall, these results indicate that children with DCD are less accurate in estimating coincident timing, which might in part explain their difficulties with common motor activities such as catching a ball or striking a baseball pitch.

  6. Development, Validation, and Verification of a Self-Assessment Tool to Estimate Agnibala (Digestive Strength).

    Science.gov (United States)

    Singh, Aparna; Singh, Girish; Patwardhan, Kishor; Gehlot, Sangeeta

    2017-01-01

    According to Ayurveda, the traditional system of healthcare of Indian origin, Agni is the factor responsible for digestion and metabolism. Four functional states (Agnibala) of Agni have been recognized: regular, irregular, intense, and weak. The objective of the present study was to develop and validate a self-assessment tool to estimate Agnibala. The developed tool was evaluated for its reliability and validity by administering it to 300 healthy volunteers of either gender belonging to the 18- to 40-year age group. Besides confirming the statistical validity and reliability, the practical utility of the newly developed tool was evaluated by recording the serum lipid parameters of all the volunteers. The results show that the lipid parameters vary significantly according to the status of Agni. The tool, therefore, may be used to screen the normal population for possible susceptibility to certain health conditions.

  7. Adaptation of the Tool to Estimate Patient Costs Questionnaire into Indonesian Context for Tuberculosis-affected Households

    Directory of Open Access Journals (Sweden)

    Ahmad Fuady

    2018-04-01

    Background: Indonesia has the second-highest tuberculosis (TB) incidence worldwide. Hence, it urgently requires improvements and innovations beyond the strategies that are currently being implemented throughout the country. One fundamental step in monitoring progress is preparing a validated tool to measure total patient costs and catastrophic total costs. The World Health Organization (WHO) recommends using a version of the generic questionnaire that has been adapted to the local cultural context in order to interpret findings correctly. This study aims to adapt the Tool to Estimate Patient Costs questionnaire, which measures total costs and catastrophic total costs for tuberculosis-affected households, into the Indonesian context. Methods: The tool was adapted using best-practice guidelines. On the basis of a pre-test performed in a previous study (referred to as the Phase 1 Study), we refined the adaptation process by comparing it with the generic tool introduced by the WHO. We also held an expert committee review and performed pre-testing by interviewing 30 TB patients. After pre-testing, the tool was provided with complete explanation sheets for finalization. Results: Seventy-two major changes were made during the adaptation process, including changing the answer choices to match the Indonesian context, refining the flow of questions, deleting questions, changing some words and restoring original questions that had been changed in the Phase 1 Study. Participants indicated that most questions were clear and easy to understand. To address recall difficulties by the participants, we made some adaptations to obtain data that might be missing, such as tracking data to medical records, developing a proxy of costs and guiding interviewers to ask for a specific value when participants were uncertain about the estimated market value of property they had sold. Conclusion: The adapted Tool to Estimate Patient Costs in Bahasa Indonesia is comprehensive and ready for use in future studies on TB.

  8. Adaptation of the Tool to Estimate Patient Costs Questionnaire into Indonesian Context for Tuberculosis-affected Households.

    Science.gov (United States)

    Fuady, Ahmad; Houweling, Tanja A; Mansyur, Muchtaruddin; Richardus, Jan H

    2018-01-01

    Indonesia has the second-highest tuberculosis (TB) incidence worldwide. Hence, it urgently requires improvements and innovations beyond the strategies that are currently being implemented throughout the country. One fundamental step in monitoring progress is preparing a validated tool to measure total patient costs and catastrophic total costs. The World Health Organization (WHO) recommends using a version of the generic questionnaire that has been adapted to the local cultural context in order to interpret findings correctly. This study aims to adapt the Tool to Estimate Patient Costs questionnaire, which measures total costs and catastrophic total costs for tuberculosis-affected households, into the Indonesian context. The tool was adapted using best-practice guidelines. On the basis of a pre-test performed in a previous study (referred to as the Phase 1 Study), we refined the adaptation process by comparing it with the generic tool introduced by the WHO. We also held an expert committee review and performed pre-testing by interviewing 30 TB patients. After pre-testing, the tool was provided with complete explanation sheets for finalization. Seventy-two major changes were made during the adaptation process, including changing the answer choices to match the Indonesian context, refining the flow of questions, deleting questions, changing some words and restoring original questions that had been changed in the Phase 1 Study. Participants indicated that most questions were clear and easy to understand. To address recall difficulties by the participants, we made some adaptations to obtain data that might be missing, such as tracking data to medical records, developing a proxy of costs and guiding interviewers to ask for a specific value when participants were uncertain about the estimated market value of property they had sold. The adapted Tool to Estimate Patient Costs in Bahasa Indonesia is comprehensive and ready for use in future studies on TB

  9. Integral Criticality Estimators in MCATK

    Energy Technology Data Exchange (ETDEWEB)

    Nolen, Steven Douglas [Los Alamos National Laboratory; Adams, Terry R. [Los Alamos National Laboratory; Sweezy, Jeremy Ed [Los Alamos National Laboratory

    2016-06-14

    The Monte Carlo Application ToolKit (MCATK) is a component-based software toolset for delivering customized particle transport solutions using the Monte Carlo method. Currently under development in the XCP Monte Carlo group at Los Alamos National Laboratory, the toolkit has the ability to estimate the keff and alpha eigenvalues for static geometries. This paper presents a description of the estimators and variance reduction techniques available in the toolkit and includes a preview of those slated for future releases. Along with the description of the underlying algorithms is a description of the available user inputs for controlling the iterations. The paper concludes with a comparison of the MCATK results with those provided by analytic solutions. The results match within expected statistical uncertainties and demonstrate MCATK's usefulness in estimating these important quantities.

  10. The power tool

    International Nuclear Information System (INIS)

    HAYFIELD, J.P.

    1999-01-01

    POWER Tool (Planning, Optimization, Waste Estimating and Resourcing tool) is a hand-held field estimating unit and relational database software tool for optimizing the disassembly and final waste form of contaminated systems and equipment.

  11. Development of Prediction Tool for Sound Absorption and Sound Insulation for Sound Proof Properties

    OpenAIRE

    Yoshio Kurosawa; Takao Yamaguchi

    2015-01-01

    High-frequency automotive interior noise above 500 Hz considerably affects automotive passenger comfort. To reduce this noise, sound insulation material is often laminated on body panels or interior trim panels. For more effective noise reduction, the sound reduction properties of this laminated structure need to be estimated. We have developed a new calculation tool that can roughly calculate the sound absorption and insulation properties of laminated structures and handy ...

  12. Time improvement of photoelectric effect calculation for absorbed dose estimation

    International Nuclear Information System (INIS)

    Massa, J M; Wainschenker, R S; Doorn, J H; Caselli, E E

    2007-01-01

    Ionizing radiation therapy is a very useful tool in cancer treatment. It is very important to determine the absorbed dose in human tissue to accomplish an effective treatment. A mathematical model based on affected areas is the most suitable tool to estimate the absorbed dose. Lately, Monte Carlo based techniques have become the most reliable, but they are computationally expensive. Absorbed dose calculation programs using different strategies have to choose between estimation quality and computing time. This paper describes an optimized method for the photoelectron polar angle calculation in the photoelectric effect, which is significant for estimating deposited energy in human tissue. In the case studies, the time cost reduction nearly reached 86%, meaning that the time needed for the calculation is approximately 1/7th of that of the non-optimized approach. This was achieved while keeping precision unchanged

  13. Comparing Fatigue Life Estimations of Composite Wind Turbine Blades using different Fatigue Analysis Tools

    DEFF Research Database (Denmark)

    Ardila, Oscar Gerardo Castro; Lennie, Matthew; Branner, Kim

    2015-01-01

    In this paper, fatigue lifetime prediction of the NREL 5MW reference wind turbine is presented. The fatigue response of materials used in selected blade cross sections was obtained by applying macroscopic fatigue approaches and assuming uniaxial stress states. Power production and parked load cases suggested by the IEC 61400-1 standard were studied employing different load time intervals and using two novel fatigue tools called ALBdeS and BECAS+F. The aeroelastic loads were defined through aeroelastic simulations performed with both the FAST and HAWC2 tools. The stress spectra at each layer were calculated employing laminated composite theory and beam cross-section methods. The Palmgren-Miner linear damage rule was used to calculate the accumulated damage. The theoretical results produced by both fatigue tools showed a prominent effect of the analysed design load conditions on the estimated lifetime...

  14. Potential for waste reduction

    International Nuclear Information System (INIS)

    Warren, J.L.

    1990-01-01

    The author focuses on wastes considered hazardous under the Resource Conservation and Recovery Act. This chapter discusses wastes that are of interest as well as the factors affecting the quantity of waste considered available for waste reduction. Estimates are provided of the quantities of wastes generated. Estimates of the potential for waste reduction are meaningful only to the extent that one can understand the amount of waste actually being generated. Estimates of waste reduction potential are summarized from a variety of government and nongovernment sources

  15. Reduction Assessment of Agricultural Non-Point Source Pollutant Loading

    OpenAIRE

    Fu, YiCheng; Zang, Wenbin; Zhang, Jian; Wang, Hongtao; Zhang, Chunling; Shi, Wanli

    2018-01-01

    Non-point source (NPS) pollution has become a key factor affecting the watershed environment. With the development of technology, the application of models to control NPS pollution has become common practice for resource management and pollutant reduction control at the watershed scale in China. The SWAT (Soil and Water Assessment Tool) model is a semi-conceptual model, which was put forward to estimate pollutant production and its influences on water quantity and quality under different...

  16. Effect of Using Different Vehicle Weight Groups on the Estimated Relationship Between Mass Reduction and U.S. Societal Fatality Risk per Vehicle Miles of Travel

    Energy Technology Data Exchange (ETDEWEB)

    Wenzel, Tom P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Technologies Area. Building Technology and Urban Systems Division

    2016-08-22

    This report recalculates the estimated relationship between vehicle mass and societal fatality risk, using alternative groupings by vehicle weight, to test whether the trend of decreasing fatality risk from mass reduction as case vehicle mass increases holds over smaller increments of the range in case vehicle masses. The NHTSA baseline regression model estimates the relationship using two weight groups for cars and light trucks; we re-estimated the mass reduction coefficients using four, six, and eight bins of vehicle mass. The estimated effect of mass reduction on societal fatality risk was not consistent over the range of vehicle masses in these weight bins. These results suggest that the relationship indicated by the NHTSA baseline model is a result of other, unmeasured attributes of the mix of vehicles in the lighter vs. heavier weight bins, and not necessarily the result of a correlation between mass reduction and societal fatality risk. An analysis of the average vehicle, driver, and crash characteristics across the various weight groupings did not reveal any strong trends that might explain the lack of a consistent trend of decreasing fatality risk from mass reduction in heavier vehicles.

  17. Antioxidant-capacity-based models for the prediction of acrylamide reduction by flavonoids.

    Science.gov (United States)

    Cheng, Jun; Chen, Xinyu; Zhao, Sheng; Zhang, Yu

    2015-02-01

    The aim of this study was to investigate the applicability of artificial neural network (ANN) and multiple linear regression (MLR) models for the estimation of acrylamide reduction by flavonoids, using multiple antioxidant capacities of Maillard reaction products as variables, via a microwave food processing workstation. The addition of selected flavonoids could effectively reduce acrylamide formation, which may be closely related to the number of phenolic hydroxyl groups of flavonoids (R: 0.735-0.951, P < 0.05). The reduction of acrylamide could be estimated from the change in antioxidant capacity (ΔTEAC) measured by the DPPH (R(2)=0.833), ABTS (R(2)=0.860) or FRAP (R(2)=0.824) assay. Both ANN and MLR models could effectively serve as predictive tools for estimating the reduction of acrylamide affected by flavonoids. The current predictive model study provides a low-cost and easy-to-use approach to estimating the rates at which acrylamide is degraded, while avoiding tedious sample pretreatment procedures and advanced instrumental analysis.
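
    A minimal sketch of fitting both model types with scikit-learn, on synthetic stand-in data (three antioxidant-capacity predictors vs. an acrylamide-reduction response); the data, network size and train/test split are all assumptions:

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      # synthetic stand-in: delta-TEAC from DPPH, ABTS, FRAP assays (columns)
      # vs. acrylamide reduction (%) with an assumed linear trend plus noise
      X = rng.uniform(0.1, 2.0, size=(120, 3))
      y = 15 + 25 * X @ np.array([0.5, 0.3, 0.2]) + rng.normal(0, 2, 120)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      mlr = LinearRegression().fit(X_tr, y_tr)
      ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                         random_state=0).fit(X_tr, y_tr)
      print(f"MLR R2 = {mlr.score(X_te, y_te):.3f}")
      print(f"ANN R2 = {ann.score(X_te, y_te):.3f}")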

  18. Regression tools for CO2 inversions: application of a shrinkage estimator to process attribution

    International Nuclear Information System (INIS)

    Shaby, Benjamin A.; Field, Christopher B.

    2006-01-01

    In this study we perform an atmospheric inversion based on a shrinkage estimator. This method is used to estimate surface fluxes of CO2, first partitioned according to constituent geographic regions, and then according to constituent processes that are responsible for the total flux. Our approach differs from previous approaches in two important ways. The first is that the technique of linear Bayesian inversion is recast as a regression problem. Seen as such, standard regression tools are employed to analyse and reduce errors in the resultant estimates. A shrinkage estimator, which combines standard ridge regression with the linear 'Bayesian inversion' model, is introduced. This method introduces additional bias into the model with the aim of reducing variance such that errors are decreased overall. Compared with standard linear Bayesian inversion, the ridge technique seems to reduce both flux estimation errors and prediction errors. The second divergence from previous studies is that instead of dividing the world into geographically distinct regions and estimating the CO2 flux in each region, the flux space is divided conceptually into processes that contribute to the total global flux. Formulating the problem in this manner adds to the interpretability of the resultant estimates and attempts to shed light on the problem of attributing sources and sinks to their underlying mechanisms
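
    The shrinkage step is ordinary ridge regression applied to the inversion. A minimal numpy sketch with a synthetic transport operator (all dimensions and noise levels invented): the regularization strength lambda trades bias for variance, lambda = 0 recovers ordinary least squares, and the lambda*I term plays the role of an isotropic Bayesian prior.

      import numpy as np

      rng = np.random.default_rng(0)
      # toy inversion: y = H f + noise, with more observations than fluxes
      n_obs, n_flux = 200, 12                 # e.g. 12 process-based flux terms
      H = rng.normal(size=(n_obs, n_flux))    # transport/footprint operator
      f_true = rng.normal(size=n_flux)        # "true" process fluxes
      y = H @ f_true + rng.normal(0.0, 0.5, n_obs)

      def ridge(H, y, lam):
          # shrinkage estimate: (H'H + lam I)^-1 H'y
          return np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ y)

      for lam in (0.0, 1.0, 10.0):
          err = np.linalg.norm(ridge(H, y, lam) - f_true)
          print(f"lambda = {lam:5.1f}: flux error = {err:.3f}")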

  19. On-Line Flutter Prediction Tool for Wind Tunnel Flutter Testing using Parameter Varying Estimation Methodology, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology, Inc. (ZONA) proposes to develop an on-line flutter prediction tool for wind tunnel models using the parameter-varying estimation (PVE) technique to...

  20. Tool to estimate optical metrics from summary wave-front analysis data in the human eye

    NARCIS (Netherlands)

    Jansonius, Nomdo M.

    Purpose: Studies in the field of cataract and refractive surgery often report only summary wave-front analysis data, which are too condensed to allow for a retrospective calculation of metrics relevant to visual perception. The aim of this study was to develop a tool that can be used to estimate

  1. Treatment Cost Analysis Tool (TCAT) for estimating costs of outpatient treatment services.

    Science.gov (United States)

    Flynn, Patrick M; Broome, Kirk M; Beaston-Blaakman, Aaron; Knight, Danica K; Horgan, Constance M; Shepard, Donald S

    2009-02-01

    A Microsoft Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment were $882, $1310, and $1381, respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 in intensive, and $86 in mixed programs. Group counseling hourly costs per client were $8, $11, and $10, respectively, for regular, intensive, and mixed programs. Future directions include use of a web-based interview version, much like some commercially available tax preparation software tools, and extensions for use in other treatment modalities.

  2. A user's manual of Tools for Error Estimation of Complex Number Matrix Computation (Ver.1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi.

    1997-03-01

    'Tools for Error Estimation of Complex Number Matrix Computation' is a subroutine library that aids users in obtaining the error ranges of complex-number linear systems' solutions or Hermitian matrices' eigenvalues. The library contains routines for both sequential and parallel computers. The subroutines for linear system error estimation calculate norms of residual vectors, matrix condition numbers, error bounds of solutions, and so on. The error estimation subroutines for Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. This user's manual contains a brief mathematical background on error analysis in linear algebra and the usage of the subroutines. (author)

  3. The Massachusetts Sustainable-Yield Estimator: A decision-support tool to assess water availability at ungaged stream locations in Massachusetts

    Science.gov (United States)

    Archfield, Stacey A.; Vogel, Richard M.; Steeves, Peter A.; Brandt, Sara L.; Weiskel, Peter K.; Garabedian, Stephen P.

    2010-01-01

    Federal, State and local water-resource managers require a variety of data and modeling tools to better understand water resources. The U.S. Geological Survey, in cooperation with the Massachusetts Department of Environmental Protection, has developed a statewide, interactive decision-support tool to meet this need. The decision-support tool, referred to as the Massachusetts Sustainable-Yield Estimator (MA SYE), provides screening-level estimates of the sustainable yield of a basin, defined as the difference between the unregulated streamflow and some user-specified quantity of water that must remain in the stream to support such functions as recreational activities or aquatic habitat. The MA SYE tool was designed, in part, because the quantity of surface water available in a basin is a time-varying quantity subject to competing demands for water. To compute sustainable yield, the MA SYE tool estimates a daily time series of unregulated, daily mean streamflow for a 44-year period of record spanning October 1, 1960, through September 30, 2004. Selected streamflow quantiles from an unregulated, daily flow-duration curve are estimated by solving six regression equations that are a function of physical and climate basin characteristics at an ungaged site on a stream of interest. Streamflow is then interpolated between the estimated quantiles to obtain a continuous daily flow-duration curve. A time series of unregulated daily streamflow subsequently is created by transferring the timing of the daily streamflow at a reference streamgage to the ungaged site by equating exceedance probabilities of contemporaneous flow at the two locations. One of 66 reference streamgages is selected by kriging, a geostatistical method, which is used to map the spatial relation among correlations between the time series of the logarithm of daily streamflows at each reference streamgage and the ungaged site. Estimated unregulated, daily mean streamflows show good agreement with observed
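
    The timing-transfer step lends itself to a short sketch: compute each day's exceedance probability at the reference streamgage, then read the ungaged site's flow-duration curve at the same probability. The reference flows and the ungaged-site quantiles below are synthetic stand-ins (in the MA SYE the quantiles come from the six regression equations):

      import numpy as np
      from scipy.stats import rankdata

      rng = np.random.default_rng(0)
      # daily mean flows at the reference streamgage (synthetic, lognormal-ish)
      ref_q = np.exp(rng.normal(2.0, 1.0, 365))

      # step 1: daily exceedance probability at the reference gage
      p_exceed = 1.0 - rankdata(ref_q) / (ref_q.size + 1.0)

      # step 2: ungaged-site flow-duration curve (quantiles assumed here)
      fdc_p = np.array([0.01, 0.10, 0.30, 0.50, 0.70, 0.90, 0.99])
      fdc_q = np.array([90.0, 40.0, 18.0, 10.0, 5.5, 2.2, 0.6])   # m3/s

      # step 3: transfer the timing by looking up the ungaged-site flow at the
      # same day-by-day exceedance probability (interpolating in log space)
      ungaged_q = np.exp(np.interp(p_exceed, fdc_p, np.log(fdc_q)))
      print("estimated ungaged daily flows:", ungaged_q[:5].round(2))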

  4. Effective tool wear estimation through multisensory information ...

    African Journals Online (AJOL)

    On-line tool wear monitoring plays a significant role in industrial automation for higher productivity and product quality. In addition, an intelligent system is required to make a timely decision for tool change in machining systems in order to avoid the subsequent consequences on the dimensional accuracy and surface finish ...

  5. Recov'Heat: An estimation tool of urban waste heat recovery potential in sustainable cities

    Science.gov (United States)

    Goumba, Alain; Chiche, Samuel; Guo, Xiaofeng; Colombert, Morgane; Bonneau, Patricia

    2017-02-01

    Waste heat recovery is considered an efficient way to increase carbon-free green energy utilization and to reduce greenhouse gas emissions. Especially in urban areas, several sources such as sewage water, industrial processes, waste incinerator plants, etc., are still rarely explored. Their integration into a district heating system providing heating and/or domestic hot water could be beneficial for both energy companies and local governments. EFFICACITY, a French research institute focused on urban energy transition, has developed an estimation tool for the different waste heat sources potentially exploitable in a sustainable city. This article presents the development method of such a decision-making tool which, by giving both an energetic and an economic analysis, helps local communities and energy service companies make preliminary studies in heat recovery projects.

  6. An error reduction algorithm to improve lidar turbulence estimates for wind energy

    Directory of Open Access Journals (Sweden)

    J. F. Newman

    2017-02-01

    Full Text Available Remote-sensing devices such as lidars are currently being investigated as alternatives to cup anemometers on meteorological towers for the measurement of wind speed and direction. Although lidars can measure mean wind speeds at heights spanning an entire turbine rotor disk and can be easily moved from one location to another, they measure different values of turbulence than an instrument on a tower. Current methods for improving lidar turbulence estimates include the use of analytical turbulence models and expensive scanning lidars. While these methods provide accurate results in a research setting, they cannot be easily applied to smaller, vertically profiling lidars in locations where high-resolution sonic anemometer data are not available. Thus, there is clearly a need for a turbulence error reduction model that is simpler and more easily applicable to lidars that are used in the wind energy industry. In this work, a new turbulence error reduction algorithm for lidars is described. The Lidar Turbulence Error Reduction Algorithm, L-TERRA, can be applied using only data from a stand-alone vertically profiling lidar and requires minimal training with meteorological tower data. The basis of L-TERRA is a series of physics-based corrections that are applied to the lidar data to mitigate errors from instrument noise, volume averaging, and variance contamination. These corrections are applied in conjunction with a trained machine-learning model to improve turbulence estimates from a vertically profiling WINDCUBE v2 lidar. The lessons learned from creating the L-TERRA model for a WINDCUBE v2 lidar can also be applied to other lidar devices. L-TERRA was tested on data from two sites in the Southern Plains region of the United States. The physics-based corrections in L-TERRA brought regression line slopes much closer to 1 at both sites and significantly reduced the sensitivity of lidar turbulence errors to atmospheric stability. The accuracy of machine
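
    A heavily simplified sketch of the two-stage idea (all names, correction factors and the model interface below are hypothetical assumptions, not L-TERRA's actual implementation): physics-based corrections are applied first, then a trained machine-learning model reduces the remaining, stability-dependent error.

    ```python
    import numpy as np

    def correct_lidar_variance(raw_var, noise_var, volume_factor, features, ml_model):
        """Sketch of an L-TERRA-style pipeline (hypothetical interface):
        raw_var, noise_var: per-period measured and instrument-noise variances
        volume_factor:      probe-volume averaging attenuation (< 1)
        features:           per-period meteorological features, shape (n, k)
        ml_model:           any fitted regressor exposing .predict()."""
        var = np.asarray(raw_var) - np.asarray(noise_var)   # remove instrument-noise variance
        var = var / volume_factor                           # compensate volume averaging
        X = np.column_stack([var, features])                # per-period feature matrix
        return ml_model.predict(X)                          # ML correction of residual error
    ```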

  7. Estimating the benefits of greenhouse gas emission reduction from agricultural policy reform

    International Nuclear Information System (INIS)

    Adger, W.N.; Moran, D.C.

    1993-01-01

    Land use and agricultural activities contribute directly to the increased concentrations of atmospheric greenhouse gases. Economic support in industrialized countries generally increases agriculture's contribution to global greenhouse gas concentrations through fluxes associated with land use change and other sources. Changes in economic support offer opportunities to reduce net emissions, though this has so far gone unaccounted. Estimates are presented here of emissions of methane from livestock in the UK; they show that, in monetary terms, when compared to the costs of reducing support, greenhouse gases are a significant factor. As signatory parties to the Climate Change Convention are required to stabilize emissions of all greenhouse gases, options for reduction of emissions of methane and other trace gases from the agricultural sector should form part of these strategies

  8. Establishing the value of occupational health nurses' contributions to worker health and safety: a pilot test of a user-friendly estimation tool.

    Science.gov (United States)

    Graeve, Catherine; McGovern, Patricia; Nachreiner, Nancy M; Ayers, Lynn

    2014-01-01

    Occupational health nurses use their knowledge and skills to improve the health and safety of the working population; however, companies increasingly face budget constraints and may eliminate health and safety programs. Occupational health nurses must be prepared to document their services and outcomes, and use quantitative tools to demonstrate their value to employers. The aim of this project was to create and pilot test a quantitative tool for occupational health nurses to track their activities and potential cost savings for on-site occupational health nursing services. Tool development included a pilot test in which semi-structured interviews with occupational health and safety leaders were conducted to identify current issues and products used for estimating the value of occupational health nursing services. The outcome was the creation of a tool that estimates the economic value of occupational health nursing services. The feasibility and potential value of this tool are described.

  9. A Unified tool to estimate Distances, Ages, and Masses (UniDAM) from spectrophotometric data

    Science.gov (United States)

    Mints, Alexey; Hekker, Saskia

    2017-08-01

    Context. Galactic archaeology, the study of the formation and evolution of the Milky Way by reconstructing its past from its current constituents, requires precise and accurate knowledge of stellar parameters for as many stars as possible. To achieve this, a number of large spectroscopic surveys have been undertaken and are still ongoing. Aims: So far consortia carrying out the different spectroscopic surveys have used different tools to determine stellar parameters of stars from their derived effective temperatures (Teff), surface gravities (log g), and metallicities ([Fe/H]); the parameters can be combined with photometric, astrometric, interferometric, or asteroseismic information. Here we aim to homogenise the stellar characterisation by applying a unified tool to a large set of publicly available spectrophotometric data. Methods: We used spectroscopic data from a variety of large surveys combined with infrared photometry from 2MASS and AllWISE and compared these in a Bayesian manner with PARSEC isochrones to derive probability density functions (PDFs) for stellar masses, ages, and distances. We treated PDFs of pre-helium-core burning, helium-core burning, and post-helium-core burning solutions as well as different peaks in multimodal PDFs (i.e. each unimodal sub-PDF) of the different evolutionary phases separately. Results: For over 2.5 million stars we report mass, age, and distance estimates for each evolutionary phase and unimodal sub-PDF. We report Gaussian, skewed Gaussian, truncated Gaussian, modified truncated exponential, or truncated Student's t-distribution functions to represent each sub-PDF, allowing us to reconstruct detailed PDFs. Comparisons with stellar parameter estimates from the literature show good agreement within uncertainties. Conclusions: We present UniDAM, the unified tool applicable to spectrophotometric data of different surveys, to obtain a homogenised set of stellar parameters. The unified tool and the tables with
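
    The Bayesian comparison step can be illustrated with a toy weighting function. The array layout below is an assumption for illustration, not UniDAM's actual interface: each isochrone grid point is weighted by the Gaussian likelihood of the observed quantities, and the weights, histogrammed over the grid's mass, age and distance columns, yield the PDFs.

    ```python
    import numpy as np

    def model_weights(obs, sigma, model_grid):
        """Hypothetical sketch of Bayesian isochrone weighting.
        obs, sigma: observed Teff, log g, [Fe/H] (and photometry) with errors.
        model_grid: isochrone points, one row per model, same columns as obs."""
        chi2 = np.sum(((model_grid - obs) / sigma) ** 2, axis=1)
        w = np.exp(-0.5 * (chi2 - chi2.min()))   # subtract min for numerical stability
        return w / w.sum()                       # normalized model weights
    ```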

  10. SU-F-P-19: Fetal Dose Estimate for a High-Dose Fluoroscopy Guided Intervention Using Modern Data Tools

    Energy Technology Data Exchange (ETDEWEB)

    Moirano, J [University of Washington, Seattle, WA (United States)

    2016-06-15

    Purpose: An accurate dose estimate is necessary for effective patient management after a fetal exposure. In the case of a high-dose exposure, it is critical to use all resources available in order to make the most accurate assessment of the fetal dose. This work will demonstrate a methodology for accurate fetal dose estimation using tools that have recently become available in many clinics, and show examples of best practices for collecting data and performing the fetal dose calculation. Methods: A fetal dose estimate calculation was performed using modern data collection tools to determine parameters for the calculation. The reference point air kerma as displayed by the fluoroscopic system was checked for accuracy. A cumulative dose incidence map and DICOM header mining were used to determine the displayed reference point air kerma. Corrections for attenuation caused by the patient table and pad were measured and applied in order to determine the peak skin dose. The position and depth of the fetus were determined by ultrasound imaging and consultation with a radiologist. The data collected were used to determine a normalized uterus dose from Monte Carlo simulation data. Fetal dose values from this process were compared to other accepted calculation methods. Results: An accurate high-dose fetal dose estimate was made. Comparisons to accepted legacy methods were within 35% of estimated values. Conclusion: Modern data collection and reporting methods ease the process of estimating fetal dose from interventional fluoroscopy exposures. Many aspects of the calculation can now be quantified rather than estimated, which should allow for a more accurate estimation of fetal dose.
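
    A worked sketch of the dose arithmetic described (every number below is a hypothetical placeholder, not a value from this study): the displayed reference point air kerma is corrected for display accuracy and table/pad attenuation to obtain the peak skin dose, and a depth-dependent normalized uterus dose coefficient converts kerma to fetal dose.

    ```python
    # Illustrative only: all numbers are hypothetical placeholders.
    displayed_ref_air_kerma_mGy = 4200.0   # cumulative reference point air kerma
    kerma_accuracy_factor = 0.95           # from checking the displayed kerma accuracy
    table_pad_transmission = 0.75          # measured attenuation of table + pad
    backscatter_factor = 1.3               # typical skin-dose backscatter

    peak_skin_dose_mGy = (displayed_ref_air_kerma_mGy * kerma_accuracy_factor
                          * table_pad_transmission * backscatter_factor)

    # Normalized uterus (fetal) dose from Monte Carlo tables, selected using
    # the fetal depth measured on ultrasound: dose per unit reference air kerma.
    uterus_dose_per_kerma = 0.05           # hypothetical, depth-dependent coefficient
    fetal_dose_mGy = (displayed_ref_air_kerma_mGy * kerma_accuracy_factor
                      * uterus_dose_per_kerma)
    print(f"peak skin dose ~{peak_skin_dose_mGy:.0f} mGy, "
          f"fetal dose ~{fetal_dose_mGy:.0f} mGy")
    ```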

  11. Reduction of robot base parameters

    International Nuclear Information System (INIS)

    Vandanjon, P.O.

    1995-01-01

    This paper is a new step in the search for the minimum dynamic parameters of robots. In spite of planning exciting trajectories and using base parameters, some parameters remain unidentifiable due to perturbation effects. In this paper, we propose methods to reduce the set of base parameters in order to get an essential set of parameters. This new set defines a simplified identification model which improves the noise immunity of the estimation process. It also contributes to reducing the computation burden of a simplified dynamic model. Different methods are proposed and are classified in two parts: methods which perform reduction and identification together, which come from the statistical field, and methods which reduce the model before the identification thanks to a priori information, which come from the numerical field, like the QR factorization. Statistical tools and QR reduction are shown to be efficient and adapted to determine the essential parameters. They can be applied to open-loop or graph-structured rigid robots, as well as flexible-link robots. An application to the PUMA 560 robot is given. (authors). 9 refs., 4 tabs
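
    A small numpy/scipy sketch of the numerical-field approach named above, QR factorization with column pivoting used as a rank-revealing reduction of the identification model (function name and tolerance are illustrative, not the paper's code):

    ```python
    import numpy as np
    from scipy.linalg import qr

    def essential_parameters(W, tol_ratio=0.01):
        """Select an essential parameter subset from an identification
        observation matrix W (rows: samples, cols: base parameters) using QR
        with column pivoting; columns whose |R[i, i]| falls below
        tol_ratio * |R[0, 0]| contribute little excitation and are dropped."""
        Q, R, piv = qr(W, mode='economic', pivoting=True)
        diag = np.abs(np.diag(R))                 # non-increasing for pivoted QR
        keep = diag >= tol_ratio * diag[0]
        return sorted(piv[:keep.sum()])           # indices of essential parameters
    ```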

  12. Reduction of robot base parameters

    Energy Technology Data Exchange (ETDEWEB)

    Vandanjon, P O [CEA Centre d` Etudes de Saclay, 91 - Gif-sur-Yvette (France). Dept. des Procedes et Systemes Avances; Gautier, M [Nantes Univ., 44 (France)

    1996-12-31

    This paper is a new step in the search for the minimum dynamic parameters of robots. In spite of planning exciting trajectories and using base parameters, some parameters remain unidentifiable due to perturbation effects. In this paper, we propose methods to reduce the set of base parameters in order to get an essential set of parameters. This new set defines a simplified identification model which improves the noise immunity of the estimation process. It also contributes to reducing the computation burden of a simplified dynamic model. Different methods are proposed and are classified in two parts: methods which perform reduction and identification together, which come from the statistical field, and methods which reduce the model before the identification thanks to a priori information, which come from the numerical field, like the QR factorization. Statistical tools and QR reduction are shown to be efficient and adapted to determine the essential parameters. They can be applied to open-loop or graph-structured rigid robots, as well as flexible-link robots. An application to the PUMA 560 robot is given. (authors). 9 refs., 4 tabs.

  13. Accounting for density reduction and structural loss in standing dead trees: Implications for forest biomass and carbon stock estimates in the United States

    Directory of Open Access Journals (Sweden)

    Domke Grant M

    2011-11-01

    Full Text Available Abstract Background Standing dead trees are one component of forest ecosystem dead wood carbon (C) pools, whose national stock is estimated by the U.S. as required by the United Nations Framework Convention on Climate Change. Historically, standing dead tree C has been estimated as a function of live tree growing stock volume in the U.S.'s National Greenhouse Gas Inventory. Initiated in 1998, the USDA Forest Service's Forest Inventory and Analysis program (responsible for compiling the Nation's forest C estimates) began consistent nationwide sampling of standing dead trees, which may now supplant previous purely model-based approaches to standing dead biomass and C stock estimation. A substantial hurdle to estimating standing dead tree biomass and C attributes is that traditional estimation procedures are based on merchantability paradigms that may not reflect density reductions or structural loss due to decomposition common in standing dead trees. The goal of this study was to incorporate standing dead tree adjustments into the current estimation procedures and assess how biomass and C stocks change at multiple spatial scales. Results Accounting for decay and structural loss in standing dead trees significantly decreased tree- and plot-level C stock estimates (and subsequent C stocks) by decay class and tree component. At a regional scale, incorporating adjustment factors decreased standing dead quaking aspen biomass estimates by almost 50 percent in the Lake States and Douglas-fir estimates by more than 36 percent in the Pacific Northwest. Conclusions Substantial overestimates of standing dead tree biomass and C stocks occur when one does not account for density reductions or structural loss. Forest inventory estimation procedures that are descended from merchantability standards may need to be revised toward a more holistic approach to determining standing dead tree biomass and C attributes (i.e., attributes of tree biomass outside of sawlog
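
    The adjustment reduces to simple multiplicative factors on the conventional volume-based estimate. A toy example with placeholder values (these are not the study's factors):

    ```python
    # Hypothetical decay-class adjustment of a standing dead tree's carbon stock;
    # all factor values below are illustrative placeholders.
    gross_volume_m3 = 1.20          # volume from the live-tree merchantability model
    wood_density_kg_m3 = 380.0      # sound-wood density for the species
    density_reduction = 0.80        # decay-class density ratio (decayed vs. sound wood)
    structural_loss = 0.85          # fraction of the tree still present (tops, limbs)
    carbon_fraction = 0.50          # carbon per unit dry biomass

    biomass_kg = (gross_volume_m3 * wood_density_kg_m3
                  * density_reduction * structural_loss)
    carbon_kg = biomass_kg * carbon_fraction
    print(f"adjusted biomass {biomass_kg:.0f} kg, carbon {carbon_kg:.0f} kg")
    ```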

  14. Decision support tools to improve the effectiveness of hazardous fuel reduction treatments in the New Jersey Pine Barrens

    Science.gov (United States)

    Kenneth L. Clark; Nicholas Skowronski; John Hom; Matthew Duveneck; Yude Pan; Stephen Van Tuyl; Jason Cole; Matthew Patterson; Stephen Maurer

    2009-01-01

    Our goal is to assist the New Jersey Forest Fire Service and federal wildland fire managers in the New Jersey Pine Barrens in evaluating where and when to conduct hazardous fuel reduction treatments. We used remotely sensed LIDAR (Light Detection and Ranging System) data and field sampling to estimate fuel loads and consumption during prescribed fire treatments. This...

  15. Perceptual effects of noise reduction by time-frequency masking of noisy speech.

    Science.gov (United States)

    Brons, Inge; Houben, Rolph; Dreschler, Wouter A

    2012-10-01

    Time-frequency masking is a method for noise reduction that is based on the time-frequency representation of a speech-in-noise signal. Depending on the estimated signal-to-noise ratio (SNR), each time-frequency unit is either attenuated or not. A special type of time-frequency mask is the ideal binary mask (IBM), which has access to the real SNR (ideal). The IBM either retains or removes each time-frequency unit (binary mask). The IBM provides large improvements in speech intelligibility and is a valuable tool for investigating how different factors influence intelligibility. This study extends the standard outcome measure (speech intelligibility) with additional perceptual measures relevant for noise reduction: listening effort, noise annoyance, speech naturalness, and overall preference. Four types of time-frequency masking were evaluated: the original IBM, a tempered version of the IBM (called ITM) which applies limited and non-binary attenuation, and non-ideal masking (also tempered) with two different types of noise-estimation algorithms. The results from ideal masking imply that there is a trade-off between intelligibility and sound quality, which depends on the attenuation strength. Additionally, the results for non-ideal masking suggest that subjective measures can show effects of noise reduction even if noise reduction does not lead to differences in intelligibility.
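
    The ideal binary mask itself is compact to state: a time-frequency unit is kept when its true local SNR exceeds a local criterion. A sketch using scipy's STFT (this is the textbook IBM under the assumption that speech and noise are known separately and equally long; parameters are illustrative, not the authors' exact processing chain):

    ```python
    import numpy as np
    from scipy.signal import stft, istft

    def ideal_binary_mask(speech, noise, fs, lc_db=0.0):
        """Ideal binary mask: keep a time-frequency unit when its local SNR
        (known exactly here, hence 'ideal') exceeds the local criterion lc_db.
        speech and noise must be separate signals of equal length."""
        _, _, S = stft(speech, fs=fs, nperseg=512)
        _, _, N = stft(noise, fs=fs, nperseg=512)
        local_snr_db = 10 * np.log10(np.abs(S) ** 2 / (np.abs(N) ** 2 + 1e-12))
        mask = (local_snr_db > lc_db).astype(float)      # binary keep/remove decision
        _, _, Y = stft(speech + noise, fs=fs, nperseg=512)
        _, y = istft(Y * mask, fs=fs, nperseg=512)       # resynthesize masked mixture
        return y, mask
    ```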

  16. Visual tool for estimating the fractal dimension of images

    Science.gov (United States)

    Grossu, I. V.; Besliu, C.; Rusu, M. V.; Jipa, Al.; Bordeianu, C. C.; Felea, D.

    2009-10-01

    This work presents a new Visual Basic 6.0 application for estimating the fractal dimension of images, based on an optimized version of the box-counting algorithm. Following the attempt to separate the real information from "noise", we considered also the family of all band-pass filters with the same band-width (specified as parameter). The fractal dimension can thus be represented as a function of the pixel color code. The program was used for the study of cracks in paintings, as an additional tool which can help the critic decide whether an artistic work is original or not.
    Program summary
    Program title: Fractal Analysis v01
    Catalogue identifier: AEEG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 29 690
    No. of bytes in distributed program, including test data, etc.: 4 967 319
    Distribution format: tar.gz
    Programming language: MS Visual Basic 6.0
    Computer: PC
    Operating system: MS Windows 98 or later
    RAM: 30M
    Classification: 14
    Nature of problem: Estimating the fractal dimension of images.
    Solution method: Optimized implementation of the box-counting algorithm. Use of a band-pass filter for separating the real information from "noise". User-friendly graphical interface.
    Restrictions: Although various file types can be used, the application was mainly conceived for the 8-bit grayscale, Windows bitmap file format.
    Running time: In a first approximation, the algorithm is linear.
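
    The box-counting algorithm at the heart of the program is easy to sketch in a few lines (Python here rather than the program's Visual Basic, and without the band-pass filtering step): count occupied boxes N(s) over a range of box sizes s and fit log N(s) against log s.

    ```python
    import numpy as np

    def box_counting_dimension(binary_image, sizes=(2, 4, 8, 16, 32, 64)):
        """Estimate the box-counting (fractal) dimension of a 2-D binary image:
        N(s) occupied boxes at box size s obeys log N(s) ~ -D log s."""
        img = np.asarray(binary_image, dtype=bool)
        counts = []
        for s in sizes:
            h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s  # crop to multiples of s
            blocks = img[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(blocks.any(axis=(1, 3)).sum())             # boxes with any set pixel
        slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
        return -slope
    ```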

  17. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools

    DEFF Research Database (Denmark)

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei

    2017-01-01

    Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioriti...... to uncertainty and dramatically decreased model performance (R2 = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches....

  18. Large biases in regression-based constituent flux estimates: causes and diagnostic tools

    Science.gov (United States)

    Hirsch, Robert M.

    2014-01-01

    It has been documented in the literature that, in some cases, widely used regression-based models can produce severely biased estimates of long-term mean river fluxes of various constituents. These models, estimated using sample values of concentration, discharge, and date, are used to compute estimated fluxes for a multiyear period at a daily time step. This study compares results of the LOADEST seven-parameter model, the LOADEST five-parameter model, and the Weighted Regressions on Time, Discharge, and Season (WRTDS) model using subsampling of six very large datasets to better understand this bias problem. The analysis considers sample datasets for dissolved nitrate and total phosphorus. The results show that LOADEST-7 and LOADEST-5, although they often produce very nearly unbiased results, can produce highly biased results. This study identifies three conditions that can give rise to these severe biases: (1) lack of fit of the log of concentration vs. log discharge relationship, (2) substantial differences in the shape of this relationship across seasons, and (3) severely heteroscedastic residuals. The WRTDS model is more resistant to the bias problem than the LOADEST models but is not immune to it. Understanding the causes of the bias problem is crucial to selecting an appropriate method for flux computations. Diagnostic tools for identifying the potential for bias problems are introduced, and strategies for resolving bias problems are described.
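
    One classic source of such bias is retransforming a log-space regression back to real units without a bias correction. Duan's smearing estimator is a standard remedy, shown below as an illustration of the mechanism (it is named plainly here; it is not the LOADEST or WRTDS algorithm itself):

    ```python
    import numpy as np

    def smearing_flux_estimate(log_c_model, residuals, q_daily):
        """Duan's smearing estimator: retransform a log-log concentration-
        discharge regression without assuming normally distributed residuals.

        log_c_model: callable giving predicted ln(concentration) from ln(discharge)
        residuals:   in-sample residuals of the fitted log-space regression
        q_daily:     daily discharge series used for the flux computation"""
        smear = np.mean(np.exp(residuals))                 # nonparametric bias-correction factor
        c_hat = np.exp(log_c_model(np.log(q_daily))) * smear
        return c_hat * q_daily                             # daily flux = concentration x discharge
    ```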

  19. How accurate are adolescents in portion-size estimation using the computer tool young adolescents' nutrition assessment on computer (YANA-C)?

    OpenAIRE

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-01-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amou...

  20. Reduction of potassium content of green bean pods and chard by culinary processing. Tools for chronic kidney disease.

    Science.gov (United States)

    Martínez-Pineda, Montserrat; Yagüe-Ruiz, Cristina; Caverni-Muñoz, Alberto; Vercet-Tormo, Antonio

    2016-01-01

    In order to prevent possible hyperkalemia, chronic renal patients, especially in advanced stages, must follow a low-potassium diet. Dietary guidelines for chronic kidney disease therefore recommend limiting the consumption of many vegetables, as well as applying laborious culinary techniques to maximize the reduction of potassium. The aim of this work is to analyze the potassium content of several vegetables, fresh, frozen and preserved, and to check and compare the effectiveness in potassium reduction of different culinary processes, some of them recommended in dietary guidelines, such as soaking or double cooking. Sample potassium content was analyzed in triplicate using flame photometry. The results showed significant reductions in potassium content for all culinary processes studied. The degree of loss varied depending on the type of vegetable and the processing applied. Frozen products achieved greater reductions than fresh ones, in some cases with losses greater than 90%. In addition, in many cases the single application of normal cooking reduced potassium to levels acceptable for inclusion in a renal patient's diet. The results of this study are very positive because they provide tools for the professionals who deal with these patients, allowing them to adapt more easily to the needs and preferences of their patients and to increase dietary variety. Copyright © 2016 Sociedad Española de Nefrología. Published by Elsevier España, S.L.U. All rights reserved.

  1. Two NextGen Air Safety Tools: An ADS-B Equipped UAV and a Wake Turbulence Estimator

    Science.gov (United States)

    Handley, Ward A.

    Two air safety tools are developed in the context of the FAA's NextGen program. The first tool addresses the alarming increase in the frequency of near-collisions between manned and unmanned aircraft by equipping a common hobby class UAV with an ADS-B transponder that broadcasts its position, speed, heading and unique identification number to all local air traffic. The second tool estimates and outputs the location of dangerous wake vortex corridors in real time based on the ADS-B data collected and processed using a custom software package developed for this project. The TRansponder based Position Information System (TRAPIS) consists of data packet decoders, an aircraft database, Graphical User Interface (GUI) and the wake vortex extension application. Output from TRAPIS can be visualized in Google Earth and alleviates the problem of pilots being left to imagine where invisible wake vortex corridors are based solely on intuition or verbal warnings from ATC. The result of these two tools is the increased situational awareness, and hence safety, of human pilots in the National Airspace System (NAS).

  2. Chemiluminescence analyzer of NOx as a high-throughput screening tool in selective catalytic reduction of NO

    International Nuclear Information System (INIS)

    Oh, Kwang Seok; Woo, Seong Ihl

    2011-01-01

    A chemiluminescence-based analyzer of NOx gas species has been applied for high-throughput screening of a library of catalytic materials. The applicability of the commercial NOx analyzer as a rapid screening tool was evaluated using selective catalytic reduction of NO gas. A library of 60 binary alloys composed of Pt and Co, Zr, La, Ce, Fe or W on an Al2O3 substrate was tested for the efficiency of NOx removal using a home-built 64-channel parallel and sequential tubular reactor. The NOx concentrations measured by the NOx analyzer agreed well with the results obtained using micro gas chromatography for a reference catalyst consisting of 1 wt% Pt on γ-Al2O3. Most alloys showed high efficiency at 275 °C, which is typical of Pt-based catalysts for selective catalytic reduction of NO. The screening with the NOx analyzer allowed the selection of Pt–Ce(X) (X = 1–3) and Pt–Fe(2) as the optimal catalysts for NOx removal: 73% NOx conversion was achieved with the Pt–Fe(2) alloy, which was much better than the results for the reference catalyst and the other library alloys. This study demonstrates a sequential high-throughput method for the practical evaluation of catalysts for the selective reduction of NO.

  3. Estimating effectiveness of crop management for reduction of soil erosion and runoff

    Science.gov (United States)

    Hlavcova, K.; Studvova, Z.; Kohnova, S.; Szolgay, J.

    2017-10-01

    The paper focuses on erosion processes in the Svacenický Creek catchment, a small sub-catchment of the Myjava River basin. To simulate soil loss and sediment transport, the USLE/SDR and WaTEM/SEDEM models were applied. The models were validated by comparing the simulated results with the actual bathymetry of a polder at the catchment outlet. Methods of crop management based on rotation and strip cropping were applied for the reduction of soil loss and sediment transport. The comparison shows that the greatest intensities of soil loss were produced by bare soil without vegetation and by the planting of maize for corn. The lowest values were achieved with the planting of winter wheat. Finally, the effectiveness of row crops and strip cropping for decreasing design floods from the catchment was estimated.
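
    The USLE part of such a calculation is a product of factors, A = R·K·LS·C·P, which makes the cover-management comparison transparent. The parameter values below are purely illustrative, not the catchment's calibrated values:

    ```python
    # Universal Soil Loss Equation: A = R * K * LS * C * P (illustrative values).
    R = 800.0    # rainfall erosivity (MJ mm ha^-1 h^-1 yr^-1)
    K = 0.03     # soil erodibility (t ha h ha^-1 MJ^-1 mm^-1)
    LS = 1.2     # slope length and steepness factor (-)
    P = 0.5      # support practice factor (e.g., strip cropping)
    C_factors = {"bare soil": 1.0, "maize": 0.38, "winter wheat": 0.10}

    for cover, C in C_factors.items():   # higher C => less protective cover
        print(f"{cover}: soil loss = {R * K * LS * C * P:.1f} t/ha/yr")
    ```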

  4. Estimation and reduction of CO2 emissions from crude oil distillation units

    International Nuclear Information System (INIS)

    Gadalla, M.; Olujic, Z.; Jobson, M.; Smith, R.

    2006-01-01

    Distillation systems are energy-intensive processes, and consequently contribute significantly to greenhouse gas emissions (e.g., carbon dioxide, CO2). A simple model for the estimation of CO2 emissions associated with the operation of heat-integrated distillation systems as encountered in refineries is introduced. In conjunction with a shortcut distillation model, this model has been used to optimize the process conditions of an existing crude oil atmospheric tower unit aiming at minimization of CO2 emissions. Simulation results indicate that the total CO2 emissions of the existing crude oil unit can be cut down by 22%, just by changing the process conditions accordingly, and that the gain in this respect can be doubled by integrating a gas turbine. In addition, the emissions reduction is accompanied by a substantial profit increase due to utility saving and/or export
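
    The core of a simple fuel-based CO2 model is stoichiometric: carbon burned converts to CO2 with the molar mass ratio 44/12. A sketch with illustrative numbers (these are not the paper's case-study values):

    ```python
    # Fuel-based CO2 estimate for a fired heater (all numbers illustrative).
    heat_duty_MW = 60.0          # fired duty of the furnace (MJ/s)
    efficiency = 0.80            # furnace thermal efficiency
    NHV_MJ_per_kg = 48.0         # net heating value of the fuel
    carbon_fraction = 0.75       # carbon mass fraction of the fuel

    fuel_kg_per_s = heat_duty_MW / efficiency / NHV_MJ_per_kg
    co2_kg_per_s = fuel_kg_per_s * carbon_fraction * 44.0 / 12.0  # C -> CO2
    co2_kt_per_yr = co2_kg_per_s * 3600 * 24 * 365 / 1e6
    print(f"CO2 emissions ~ {co2_kt_per_yr:.1f} kt/yr")
    ```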

  5. Community-based field implementation scenarios of a short message service reporting tool for lymphatic filariasis case estimates in Africa and Asia.

    Science.gov (United States)

    Mableson, Hayley E; Martindale, Sarah; Stanton, Michelle C; Mackenzie, Charles; Kelly-Hope, Louise A

    2017-01-01

    Lymphatic filariasis (LF) is a neglected tropical disease (NTD) targeted for global elimination by 2020. Currently there is considerable international effort to scale-up morbidity management activities in endemic countries; however, there remains a need for rapid, cost-effective methods and adaptable tools for obtaining estimates of people presenting with clinical manifestations of LF, namely lymphoedema and hydrocele. The mHealth tool 'MeasureSMS-Morbidity' allows health workers in endemic areas to use their own mobile phones to send clinical information in a simple format using short message service (SMS). The experience gained through programmatic use of the tool in five endemic countries across a diversity of settings in Africa and Asia is used here to present implementation scenarios that are suitable for adapting the tool for use in a range of different programmatic, endemic, demographic and health system settings. A checklist of five key factors and sub-questions was used to determine and define specific community-based field implementation scenarios for using the MeasureSMS-Morbidity tool in a range of settings. These factors included: (I) tool feasibility (acceptability; community access and ownership); (II) LF endemicity (high; low prevalence); (III) population demography (urban; rural); (IV) health system structure (human resources; community access); and (V) integration with other diseases (co-endemicity). Based on experiences in Bangladesh, Ethiopia, Malawi, Nepal and Tanzania, four implementation scenarios were identified as suitable for using the MeasureSMS-Morbidity tool for searching and reporting LF clinical case data across a range of programmatic, endemic, demographic and health system settings. These include: (I) urban, high endemic setting with two-tier reporting; (II) rural, high endemic setting with one-tier reporting; (III) rural, high endemic setting with two-tier reporting; and (IV) low-endemic, urban and rural setting with one

  6. A Cost Simulation Tool for Estimating the Cost of Operating Government Owned and Operated Ships

    Science.gov (United States)

    1994-09-01

    Horngren, C.T., Foster, G., Datar, S.M., Cost Accounting: A Managerial Emphasis, Prentice-Hall, Englewood Cliffs, NJ, 1994. ... normally does not present a problem to the accounting department. The final category, the cost of operating the government owned and operated ships, is

  7. The estimated effect of mass or footprint reduction in recent light-duty vehicles on U.S. societal fatality risk per vehicle mile traveled.

    Science.gov (United States)

    Wenzel, Tom

    2013-10-01

    The National Highway Traffic Safety Administration (NHTSA) recently updated its 2003 and 2010 logistic regression analyses of the effect of a reduction in light-duty vehicle mass on US societal fatality risk per vehicle mile traveled (VMT; Kahane, 2012). Societal fatality risk includes the risk to the occupants of the case vehicle as well as to any crash partner or pedestrians. The current analysis is the most thorough investigation of this issue to date. This paper replicates the Kahane analysis and extends it by testing the sensitivity of his results to changes in the definition of risk, and in the data and control variables used in the regression models. An assessment by Lawrence Berkeley National Laboratory (LBNL) indicates that the estimated effect of mass reduction on risk is smaller than in Kahane's previous studies, and is statistically non-significant for all but the lightest cars (Wenzel, 2012a). The estimated effects of a reduction in mass or footprint (i.e. wheelbase times track width) are small relative to other vehicle, driver, and crash variables used in the regression models. The recent historical correlation between mass and footprint is not so large as to prohibit including both variables in the same regression model; excluding footprint from the model, i.e. allowing footprint to decrease with mass, increases the estimated detrimental effect of mass reduction on risk in cars and crossover utility vehicles (CUVs)/minivans, but has virtually no effect on light trucks. Analysis by footprint deciles indicates that risk does not consistently increase with reduced mass for vehicles of similar footprint. Finally, the estimated effects of mass and footprint reduction are sensitive to the measure of exposure used (fatalities per induced-exposure crash, rather than per VMT), as well as to other changes in the data or control variables used. It appears that the safety penalty from lower mass can be mitigated with careful vehicle design, and that manufacturers can
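
    The modeling approach is a logistic regression of per-crash fatality outcomes on mass, footprint, and driver/crash controls. A hypothetical sketch with statsmodels (the file name and column names are invented for illustration, not the NHTSA or LBNL datasets):

    ```python
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical layout: one row per crash-involved vehicle, with a 0/1
    # fatality outcome, curb weight in 100-lb units, footprint in sq ft,
    # and driver/crash control variables.
    df = pd.read_csv("crash_records.csv")            # placeholder file name
    X = sm.add_constant(df[["curb_weight_100lb", "footprint_sqft",
                            "driver_age", "driver_male", "night", "speed_limit"]])
    model = sm.Logit(df["fatal"], X).fit()

    # The coefficient on curb weight, holding footprint and the other controls
    # fixed, is the analogue of the 'effect of mass reduction at constant
    # footprint' these studies estimate.
    print(model.summary())
    ```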

  8. Wear-Induced Changes in FSW Tool Pin Profile: Effect of Process Parameters

    Science.gov (United States)

    Sahlot, Pankaj; Jha, Kaushal; Dey, G. K.; Arora, Amit

    2018-06-01

    Friction stir welding (FSW) of high melting point metallic (HMPM) materials has limited application due to tool wear and relatively short tool life. Tool wear changes the profile of the tool pin and adversely affects weld properties. A quantitative understanding of tool wear and the tool pin profile is crucial to developing the process for joining HMPM materials. Here we present a quantitative wear study of the H13 steel tool pin profile for FSW of a CuCrZr alloy. The tool pin profile is analyzed at multiple traverse distances for welding with various tool rotational and traverse speeds. The results indicate that the measured wear depth is small near the pin root and increases significantly towards the tip. Near the pin tip, wear depth increases with increasing tool rotational speed. However, the change in wear depth near the pin root is minimal. Wear depth also increases with decreasing tool traverse speed. Tool pin wear from the bottom results in pin length reduction, which is greater for higher tool rotational speeds and longer traverse distances. The pin profile changes due to wear and results in a root defect for long traverse distances. This quantitative understanding of tool wear would be helpful for estimating tool wear and optimizing process parameters and tool pin shape during FSW of HMPM materials.

  9. RealCalc : a real time Java calculation tool. Application to HVSR estimation

    Science.gov (United States)

    Hloupis, G.; Vallianatos, F.

    2009-04-01

    The Java computing platform is not a newcomer to the seismology field. It is mainly used for applications for collecting, requesting, spreading and visualizing seismological data, because it is productive, safe and has low maintenance costs. Although it has very attractive characteristics for engineers, Java has not been used frequently in real-time applications, where predictability and reliability are required in reaction to real-world events. The main reasons for this are the absence of priority support (such as priority ceiling or priority inversion) and the use of automated memory management (the garbage collector). To overcome these problems a number of extensions have been proposed, with the Real Time Specification for Java (RTSJ) being the most promising and most used one. In the current study we used the RTSJ to build an application that receives data continuously and provides estimations in real time. The application consists of four main modules: incoming data, preprocessing, estimation and publication. As an application example we present real-time HVSR estimation. Microtremor recordings are collected continuously by the incoming data module. The preprocessing module consists of a wavelet-based window selector tool which is applied to the incoming data stream in order to derive the most stationary parts. The estimation module provides all the necessary calculations according to user specifications. Finally, the publication module, besides presenting the results, also calculates attributes and relevant statistics for each site (temporal variations, HVSR stability). Acknowledgements This work is partially supported by the Greek General Secretariat of Research and Technology in the frame of Crete Regional Project 2000-2006 (M1.2): "TALOS: An integrated system of seismic hazard monitoring and management in the front of the Hellenic Arc", CRETE PEP7 (KP_7).
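
    The HVSR computation in the estimation module reduces to a ratio of horizontal to vertical Fourier amplitude spectra. A single-window sketch in Python (the described tool is Java/RTSJ; the function and its parameters are illustrative, and production use would average many stationary windows):

    ```python
    import numpy as np

    def hvsr(north, east, vertical, fs, nfft=4096):
        """Horizontal-to-vertical spectral ratio from one window of a
        three-component microtremor recording (each trace must have at
        least nfft samples)."""
        freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
        window = np.hanning(nfft)
        spec = lambda x: np.abs(np.fft.rfft(np.asarray(x)[:nfft] * window))
        h = np.sqrt(spec(north) ** 2 + spec(east) ** 2)   # combined horizontal spectrum
        v = spec(vertical)
        return freqs, h / (v + 1e-12)                     # avoid division by zero
    ```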

  10. ResilSIM—A Decision Support Tool for Estimating Resilience of Urban Systems

    Directory of Open Access Journals (Sweden)

    Sarah Irwin

    2016-09-01

    Full Text Available Damages to urban systems as a result of water-related natural disasters have escalated in recent years. The observed trend is expected to increase in the future as the impacts of population growth, rapid urbanization and climate change persist. To alleviate the damages associated with these impacts, it is recommended to integrate disaster management methods into planning, design and operational policies under all levels of government. This manuscript proposes the concept of ResilSIM: a decision support tool that rapidly estimates the resilience (a modern disaster management measure that is dynamic in time and space) of an urban system to the consequences of natural disasters. The web-based tool (with mobile access) operates in near real-time. It is designed to assist decision makers in selecting the best options for integrating adaptive capacity into their communities to protect against the negative impacts of a hazard. ResilSIM is developed for application in Toronto and London, Ontario, Canada; however, it is only demonstrated for use in the city of London, which is susceptible to riverine flooding. It is observed how the incorporation of different combinations of adaptation options maintains or strengthens London's basic structures and functions in the event of a flood.

  11. An innovative multivariate tool for fuel consumption and costs estimation of agricultural operations

    Directory of Open Access Journals (Sweden)

    Mirko Guerrieri

    2016-12-01

    Full Text Available The estimation of the operating costs of agricultural and forestry machinery is a key factor in both planning agricultural policies and farm management. Few works have tried to estimate operating costs, and the produced models are normally based on deterministic approaches. Conversely, in a statistical model randomness is present and variable states are described not by unique values but by probability distributions. In this study, for the first time, a multivariate statistical model based on Partial Least Squares (PLS) was adopted to predict the fuel consumption and costs of six agricultural operations: ploughing, harrowing, fertilization, sowing, weed control and shredding. The prediction was conducted in two steps: first, a few initially selected parameters (time per surface-area unit, maximum engine power, purchase price of the tractor and purchase price of the operating machinery) were used to estimate the fuel consumption; then the predicted fuel consumption, together with the initial parameters, was used to estimate the operational costs. Since the obtained models were based on a very heterogeneous input dataset, they proved extremely efficient and thus generalizable and robust. In detail, the results show prediction values in the test with r always ≥ 0.91. Thus, the approach may prove extremely useful both for farmers (in terms of economic advantages) and at the institutional level (representing an innovative and efficient tool for planning future Rural Development Programmes and the Common Agricultural Policy). In light of these advantages, the proposed approach may as well be implemented on a web platform and made available to all stakeholders.
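
    The two-step PLS scheme maps directly onto scikit-learn. The sketch below is an illustration with invented file names and component counts, not the authors' code:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Hypothetical layout: rows are field operations, columns are the four
    # initial parameters named in the abstract.
    X = np.loadtxt("operations_inputs.csv", delimiter=",")    # placeholder data
    y_fuel = np.loadtxt("fuel_consumption.csv", delimiter=",")
    y_cost = np.loadtxt("operation_costs.csv", delimiter=",")

    # Step 1: predict fuel consumption from the initial parameters.
    pls_fuel = PLSRegression(n_components=3).fit(X, y_fuel)
    fuel_hat = pls_fuel.predict(X).ravel()

    # Step 2: predict operating cost from the initial parameters plus the
    # predicted fuel consumption, mirroring the two-step scheme described.
    X2 = np.column_stack([X, fuel_hat])
    pls_cost = PLSRegression(n_components=3).fit(X2, y_cost)
    ```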

  12. An innovative multivariate tool for fuel consumption and costs estimation of agricultural operations

    Energy Technology Data Exchange (ETDEWEB)

    Guerrieri, M.; Fedrizzi, M.; Antonucci, F.; Pallottino, F.; Sperandio, G.; Pagano, M.; Figorilli, S.; Menesatti, P.; Costa, C.

    2016-07-01

    The estimation of the operating costs of agricultural and forestry machinery is a key factor in both planning agricultural policies and farm management. Few works have tried to estimate operating costs, and the produced models are normally based on deterministic approaches. Conversely, in a statistical model randomness is present and variable states are described not by unique values but by probability distributions. In this study, for the first time, a multivariate statistical model based on Partial Least Squares (PLS) was adopted to predict the fuel consumption and costs of six agricultural operations: ploughing, harrowing, fertilization, sowing, weed control and shredding. The prediction was conducted in two steps: first, a few initially selected parameters (time per surface-area unit, maximum engine power, purchase price of the tractor and purchase price of the operating machinery) were used to estimate the fuel consumption; then the predicted fuel consumption, together with the initial parameters, was used to estimate the operational costs. Since the obtained models were based on a very heterogeneous input dataset, they proved extremely efficient and thus generalizable and robust. In detail, the results show prediction values in the test with r always ≥ 0.91. Thus, the approach may prove extremely useful both for farmers (in terms of economic advantages) and at the institutional level (representing an innovative and efficient tool for planning future Rural Development Programmes and the Common Agricultural Policy). In light of these advantages, the proposed approach may as well be implemented on a web platform and made available to all stakeholders.

  13. An innovative multivariate tool for fuel consumption and costs estimation of agricultural operations

    International Nuclear Information System (INIS)

    Guerrieri, M.; Fedrizzi, M.; Antonucci, F.; Pallottino, F.; Sperandio, G.; Pagano, M.; Figorilli, S.; Menesatti, P.; Costa, C.

    2016-01-01

    The estimation of the operating costs of agricultural and forestry machinery is a key factor in both planning agricultural policies and farm management. Few works have tried to estimate operating costs, and the produced models are normally based on deterministic approaches. Conversely, in a statistical model randomness is present and variable states are described not by unique values but by probability distributions. In this study, for the first time, a multivariate statistical model based on Partial Least Squares (PLS) was adopted to predict the fuel consumption and costs of six agricultural operations: ploughing, harrowing, fertilization, sowing, weed control and shredding. The prediction was conducted in two steps: first, a few initially selected parameters (time per surface-area unit, maximum engine power, purchase price of the tractor and purchase price of the operating machinery) were used to estimate the fuel consumption; then the predicted fuel consumption, together with the initial parameters, was used to estimate the operational costs. Since the obtained models were based on a very heterogeneous input dataset, they proved extremely efficient and thus generalizable and robust. In detail, the results show prediction values in the test with r always ≥ 0.91. Thus, the approach may prove extremely useful both for farmers (in terms of economic advantages) and at the institutional level (representing an innovative and efficient tool for planning future Rural Development Programmes and the Common Agricultural Policy). In light of these advantages, the proposed approach may as well be implemented on a web platform and made available to all stakeholders.

  14. Effect of tube current modulation for dose estimation using a simulation tool on body CT examination

    International Nuclear Information System (INIS)

    Kawaguchi, Ai; Matsunaga, Yuta; Kobayashi, Masanao; Suzuki, Shoichi; Matsubara, Kosuke; Chida, Koichi

    2015-01-01

    The purpose of this study was to evaluate the effect of tube current modulation on dose estimation for a body computed tomography (CT) examination using a simulation tool. The authors also compared longitudinal variations in tube current values between iterative reconstruction (IR) and filtered back-projection (FBP) reconstruction algorithms. One hundred patients underwent body CT examinations. The tube current values around 10 organ regions were recorded longitudinally from the tube current information. The organ and effective doses were simulated using average tube current values and longitudinally modulated tube current values. The organ doses for the bladder and breast estimated by longitudinally modulated tube current values were 20% higher and 25% lower, respectively, than those estimated using the average tube current values. The differences in effective doses were small (mean, 0.7 mSv). The longitudinal variations in tube current values were almost the same for the IR and FBP algorithms. (authors)

  15. U-AVLIS feed conversion using continuous metallothermic reduction of UF4: System description and cost estimate

    International Nuclear Information System (INIS)

    1994-04-01

    The purpose of this document is to present a system description and develop baseline capital and operating cost estimates for commercial facilities which produce U-Fe feedstock for AVLIS enrichment plants using the continuous fluoride reduction (CFR) process. These costs can then be used, together with appropriate economic assumptions, to calculate estimated unit costs to the AVLIS plant owner (or utility customer) for such conversion services. Six cases are being examined. All cases assume that the conversion services are performed by a private company at a commercial site which has an existing NRC license to possess source material and which has existing uranium processing operations. The cases differ in terms of annual production capacity and whether the new process system is installed in a new building or in an existing building on the site. The six cases are summarized here.

  16. EucaTool®, a cloud computing application for estimating the growth and production of Eucalyptus globulus Labill. plantations in Galicia (NW Spain

    Directory of Open Access Journals (Sweden)

    Alberto Rojo-Alboreca

    2015-12-01

    Full Text Available Aim of study: To present the software utilities and explain how to use EucaTool®, a free cloud computing application developed to estimate the growth and production of seedling and clonal blue gum (Eucalyptus globulus Labill.) plantations in Galicia (NW Spain).
    Area of study: Galicia (NW Spain).
    Material and methods: EucaTool® implements a dynamic growth and production model that is valid for clonal and non-clonal blue gum plantations in the region. The model integrates transition functions for dominant height (site index curves), number of stems per hectare (mortality function) and basal area, as well as output functions for tree and stand volume, biomass and carbon content.
    Main results: EucaTool® can be freely accessed from any device with an Internet connection, from http://app.eucatool.com. In addition, useful information about the application is published on a related website: http://www.eucatool.com.
    Research highlights: The application has been designed to enable forest stakeholders to estimate volume, biomass and carbon content of forest plantations from individual trees, diameter classes or stand data, as well as to estimate growth and future production (indicating the optimal rotation age for maximum income) by measurement of only four stand variables: age, number of trees per hectare, dominant height and basal area.
    Keywords: forest management; biomass; seedling; clones; blue gum; forest tool.

  17. Statistical analysis of electrical resistivity as a tool for estimating cement type of 12-year-old concrete specimens

    NARCIS (Netherlands)

    Polder, R.B.; Morales-Napoles, O.; Pacheco, J.

    2012-01-01

    Statistical tests on values of concrete resistivity can be used as a fast tool for estimating the cement type of old concrete. Electrical resistivity of concrete is a material property that describes the electrical resistance of concrete in a unit cell. Influences of binder type, water-to-binder

  18. CoCoa: a software tool for estimating the coefficient of coancestry from multilocus genotype data.

    Science.gov (United States)

    Maenhout, Steven; De Baets, Bernard; Haesaert, Geert

    2009-10-15

    Phenotypic data collected in breeding programs and marker-trait association studies are often analyzed by means of linear mixed models. In these models, the covariance between the genetic background effects of all genotypes under study is modeled by means of pairwise coefficients of coancestry. Several marker-based coancestry estimation procedures allow this covariance matrix to be estimated, but generally introduce a certain amount of bias when the examined genotypes are part of a breeding program. CoCoa implements the most commonly used marker-based coancestry estimation procedures and, as such, allows selection of the best-fitting covariance structure for the phenotypic data at hand. This better model fit translates into increased power and improved type I error control in association studies and improved accuracy in phenotypic prediction studies. The presented software package also provides an implementation of the new Weighted Alikeness in State (WAIS) estimator for use in hybrid breeding programs. Besides several matrix manipulation tools, CoCoa implements two different bending heuristics, in case the inverse of an ill-conditioned coancestry matrix estimate is needed. The software package CoCoa is freely available at http://webs.hogent.be/cocoa. Source code, manual, binaries for 32- and 64-bit Linux systems and an installer for Microsoft Windows are provided. The core components of CoCoa are written in C++, while the graphical user interface is written in Java.
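
    A plain (unweighted) alikeness-in-state coancestry estimate, the simplest of the marker-based procedures such a package implements, can be written in a few lines. This sketch is not the WAIS estimator; the genotype coding is an assumption for illustration:

    ```python
    import numpy as np

    def alikeness_in_state(genotypes):
        """Unweighted alikeness-in-state similarity from biallelic markers
        coded 0/1/2 (count of the reference allele), individuals x markers.
        For each pair, averages over loci the probability that two alleles
        drawn at random (one per individual) are identical in state."""
        G = np.asarray(genotypes, dtype=float)
        P = G / 2.0                                  # reference-allele dose per individual
        sim = P @ P.T + (1 - P) @ (1 - P).T          # match as ref-ref or alt-alt
        return sim / G.shape[1]                      # average over markers
    ```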

  19. Estimates of the timing of reductions in genital warts and high grade cervical intraepithelial neoplasia after onset of human papillomavirus (HPV) vaccination in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Ekwueme, Donatus U; Saraiya, Mona; Dunne, Eileen F; Markowitz, Lauri E

    2013-08-20

    The objective of this study was to estimate the number of years after onset of a quadrivalent HPV vaccination program before notable reductions in genital warts and cervical intraepithelial neoplasia (CIN) will occur in teenagers and young adults in the United States. We applied a previously published model of HPV vaccination in the United States and focused on the timing of reductions in genital warts among both sexes and reductions in CIN 2/3 among females. Using different coverage scenarios, the lowest being consistent with current 3-dose coverage in the United States, we estimated the number of years before reductions of 10%, 25%, and 50% would be observed after onset of an HPV vaccination program for ages 12-26 years. The model suggested that female-only HPV vaccination in the intermediate coverage scenario would result in a 10% reduction in genital warts within 2-4 years for females aged 15-19 years and a 10% reduction in CIN 2/3 among females aged 20-29 years within 7-11 years. Coverage had a major impact on when reductions would be observed. For example, in the higher coverage scenario a 25% reduction in CIN 2/3 would be observed within 8 years, compared with 15 years in the lower coverage scenario. Our model provides estimates of the potential timing and magnitude of the impact of HPV vaccination on genital warts and CIN 2/3 at the population level in the United States. Notable, population-level impacts of HPV vaccination on genital warts and CIN 2/3 can occur within a few years after onset of vaccination, particularly among younger age groups. Our results are generally consistent with early reports of declines in genital warts among youth. Published by Elsevier Ltd.

  20. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    International Nuclear Information System (INIS)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic; Mollicone, Danilo; Federici, Sandro

    2008-01-01

    A common paradigm, when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC), is that high uncertainties in input data (i.e., area change and C stock change per unit area) may seriously undermine the credibility of the estimates, and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools, already existing in UNFCCC decisions and IPCC guidance documents, may greatly help in dealing with the uncertainties of the estimates of reduced emissions from deforestation
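
    The conservativeness idea can be reduced to a one-line rule: claim only the part of the estimated reduction that survives an uncertainty discount. A sketch with purely illustrative numbers (the discount rule here is a generic reading of the principle, not the paper's specific procedure):

    ```python
    # Conservative ('reliable minimum') estimate of an emission reduction:
    # discount the central estimate by its combined uncertainty.
    reference_emissions = 100.0   # Mt CO2/yr, reference period (illustrative)
    current_emissions = 70.0      # Mt CO2/yr, assessment period (illustrative)
    uncertainty_pct = 15.0        # combined uncertainty of the difference (illustrative)

    reduction = reference_emissions - current_emissions
    conservative_reduction = reduction * (1 - uncertainty_pct / 100.0)
    print(f"claimable reduction: {conservative_reduction:.1f} Mt CO2/yr")
    ```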

  1. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    Energy Technology Data Exchange (ETDEWEB)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic [Institute for Environment and Sustainability, Joint Research Centre of the European Commission, I-21020 Ispra (Italy); Mollicone, Danilo [Department of Geography, University of Alcala de Henares, Madrid (Spain); Federici, Sandro

    2008-07-15

    A common paradigm, when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC), is that high uncertainties in input data (i.e., area change and C stock change per unit area) may seriously undermine the credibility of the estimates, and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools, already existing in UNFCCC decisions and IPCC guidance documents, may greatly help in dealing with the uncertainties of the estimates of reduced emissions from deforestation.

  2. Forensic surface metrology: tool mark evidence.

    Science.gov (United States)

    Gambino, Carol; McLaughlin, Patrick; Kuo, Loretta; Kammerman, Frani; Shenkin, Peter; Diaczuk, Peter; Petraco, Nicholas; Hamby, James; Petraco, Nicholas D K

    2011-01-01

    Over the last several decades, forensic examiners of impression evidence have come under scrutiny in the courtroom due to analysis methods that rely heavily on subjective morphological comparisons. Currently, there is no universally accepted system that generates numerical data to independently corroborate visual comparisons. Our research attempts to develop such a system for tool mark evidence, proposing a methodology that objectively evaluates the association of striated tool marks with the tools that generated them. In our study, 58 primer shear marks on 9 mm cartridge cases, fired from four Glock model 19 pistols, were collected using high-resolution white light confocal microscopy. The resulting three-dimensional surface topographies were filtered to extract all "waviness surfaces", the essential "line" information that firearm and tool mark examiners view under a microscope. Extracted waviness profiles were processed with principal component analysis (PCA) for dimension reduction. Support vector machines (SVM) were used to make the profile-gun associations, and conformal prediction theory (CPT) was used for establishing confidence levels. At the 95% confidence level, CPT coupled with PCA-SVM yielded an empirical error rate of 3.5%. Complementary bootstrap-based computations estimated error rates of 0%, indicating that the error rate for the algorithmic procedure is likely to remain low on larger data sets. Finally, suggestions are made for practical courtroom application of CPT for assigning levels of confidence to SVM identifications of tool marks recorded with confocal microscopy. Copyright © 2011 Wiley Periodicals, Inc.
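
    The PCA-plus-SVM stage maps naturally onto a scikit-learn pipeline. The sketch below (with invented file names, a guessed component count, and without the conformal-prediction layer) shows the classification core only:

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    # Hypothetical data: each row is a surface 'waviness' profile extracted
    # from a confocal scan of a primer shear mark; labels give the source gun.
    profiles = np.load("waviness_profiles.npy")   # placeholder file
    labels = np.load("source_gun_labels.npy")     # placeholder file

    clf = make_pipeline(PCA(n_components=10), SVC(kernel="linear"))
    scores = cross_val_score(clf, profiles, labels, cv=5)
    print(f"cross-validated identification accuracy: {scores.mean():.2f}")
    ```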

  3. Single Tree Vegetation Depth Estimation Tool for Satellite Services Link Design

    Directory of Open Access Journals (Sweden)

    Z. Hasirci

    2016-04-01

    Full Text Available Attenuation caused by tree shadowing is an important factor in describing the propagation channel of satellite services. Thus, vegetation effects should be determined by experimental studies or empirical formulations. In this study, tree types in the Black Sea Region of Turkey are classified based on their geometrical shapes into four groups: conic, ellipsoid, spherical and hemispherical. The variations of vegetation depth for the different tree shapes are calculated with a ray tracing method. It is shown that different geometrical shapes have different vegetation depths, even if they have the same foliage volume, for different elevation angles. The proposed method is validated against the related literature in terms of average single tree attenuation. On the other hand, to reduce the system requirements (speed, memory usage, etc.) of the ray tracing method, an artificial neural network is proposed as an alternative. A graphical user interface for the above processes, named the vegetation depth estimation tool (VdET), is created in the MATLAB environment.

  4. Determination of reduction yield of lithium metal reduction process

    International Nuclear Information System (INIS)

    Choi, In Kyu; Cho, Young Hwan; Kim, Taek Jin; Jee, Kwang Young

    2004-01-01

    Metal reduction of spent oxide fuel is the first step for the effective storage of spent fuel in Korea, as well as for the transmutation of long-lived radionuclides. During the reduction of uranium oxide to uranium metal by lithium metal, lithium oxide is produced stoichiometrically. By determining the concentration of lithium oxide in lithium chloride, we can estimate how much uranium oxide has been converted to uranium metal. The previous method for determining the lithium oxide concentration in lithium chloride is tedious and time consuming. This paper describes an on-line monitoring method for lithium oxide during the reduction process.

  5. Estimation of toxicity using the Toxicity Estimation Software Tool (TEST)

    Science.gov (United States)

    Tens of thousands of chemicals are currently in commerce, and hundreds more are introduced every year. Since experimental measurements of toxicity are extremely time consuming and expensive, it is imperative that alternative methods to estimate toxicity are developed.

  6. Stochastic LMP (Locational marginal price) calculation method in distribution systems to minimize loss and emission based on Shapley value and two-point estimate method

    International Nuclear Information System (INIS)

    Azad-Farsani, Ehsan; Agah, S.M.M.; Askarian-Abyaneh, Hossein; Abedi, Mehrdad; Hosseinian, S.H.

    2016-01-01

    LMP (Locational marginal price) calculation is a serious impediment in distribution operation when private DG (distributed generation) units are connected to the network. A novel policy is developed in this study to guide the distribution company (DISCO) in exerting control over the private units while power loss and greenhouse gas emissions are minimized. The LMP at each DG bus is calculated according to the contribution of the DG to the reduction in loss and emissions. An iterative algorithm based on the Shapley value method is proposed to allocate the loss and emission reductions. The proposed algorithm provides a robust state estimation tool for DISCOs in the next step of operation. The state estimation tool gives the decision maker the ability to exert control over private DG units when loss and emissions are minimized. Also, a stochastic approach based on the PEM (point estimate method) is employed to capture uncertainty in the market price and load demand. The proposed methodology is applied to a realistic distribution network, and the efficiency and accuracy of the method are verified. - Highlights: • Reduction of loss and emissions at the same time. • Fair allocation of the loss and emission reductions. • Estimation of the system state using an iterative algorithm. • Ability of DISCOs to control DG units via the proposed policy. • Modeling of the uncertainties to calculate the stochastic LMP.
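    Since the abstract leans on the two-point estimate method (PEM), here is a minimal sketch of Hong's 2m scheme under the simplifying assumption of symmetric (zero-skewness) inputs; the toy loss model and all numbers are invented for illustration and are not from the paper.

```python
# Minimal sketch of Hong's 2m two-point estimate method (PEM) for
# propagating input uncertainty through a model, assuming symmetric
# (zero-skewness) input distributions.
import numpy as np

def pem_2m(f, means, stds):
    """Estimate mean/std of f(x) with two model runs per uncertain input."""
    m = len(means)
    xi = np.sqrt(m)          # concentration locations for zero skewness
    w = 1.0 / (2 * m)        # equal weights
    e1 = e2 = 0.0
    for k in range(m):
        for sign in (+1.0, -1.0):
            x = np.array(means, dtype=float)
            x[k] += sign * xi * stds[k]
            y = f(x)
            e1 += w * y
            e2 += w * y * y
    return e1, np.sqrt(max(e2 - e1 ** 2, 0.0))

# Toy "loss" model driven by uncertain load demand and market price.
loss = lambda x: 0.02 * x[0] ** 2 / x[1]      # hypothetical relationship
mean_loss, std_loss = pem_2m(loss, means=[100.0, 50.0], stds=[10.0, 5.0])
print(f"loss estimate: {mean_loss:.2f} +/- {std_loss:.2f}")
```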

  7. A GIS-based tool for estimating tree canopy cover on fixed-radius plots using high-resolution aerial imagery

    Science.gov (United States)

    Sara A. Goeking; Greg C. Liknes; Erik Lindblom; John Chase; Dennis M. Jacobs; Robert. Benton

    2012-01-01

    Recent changes to the Forest Inventory and Analysis (FIA) Program's definition of forest land precipitated the development of a geographic information system (GIS)-based tool for efficiently estimating tree canopy cover for all FIA plots. The FIA definition of forest land has shifted from a density-related criterion based on stocking to a 10 percent tree canopy...

  8. Application of the Streamflow Prediction Tool to Estimate Sediment Dredging Volumes in Texas Coastal Waterways

    Science.gov (United States)

    Yeates, E.; Dreaper, G.; Afshari, S.; Tavakoly, A. A.

    2017-12-01

    Over the past six fiscal years, the United States Army Corps of Engineers (USACE) has contracted an average of about a billion dollars per year for navigation channel dredging. To execute these funds effectively, USACE Districts must determine which navigation channels need to be dredged in a given year. Improving this prioritization process results in more efficient waterway maintenance. This study uses the Streamflow Prediction Tool, a runoff routing model based on global weather forecast ensembles, to estimate dredged volumes. This study establishes regional linear relationships between cumulative flow and dredged volumes over a long-term simulation covering 30 years (1985-2015), using drainage area and shoaling parameters. The study framework integrates the National Hydrography Dataset (NHDPlus Dataset) with parameters from the Corps Shoaling Analysis Tool (CSAT) and dredging record data from USACE District records. Results in the test cases of the Houston Ship Channel and the Sabine and Port Arthur Harbor waterways in Texas indicate positive correlation between the simulated streamflows and actual dredging records.

  9. A correction in the CDM methodological tool for estimating methane emissions from solid waste disposal sites.

    Science.gov (United States)

    Santos, M M O; van Elk, A G P; Romanel, C

    2015-12-01

    Solid waste disposal sites (SWDS), especially landfills, are a significant source of methane, a greenhouse gas. Although it has the potential to be captured and used as a fuel, most of the methane formed in SWDS is emitted to the atmosphere, mainly in developing countries. Methane emissions have to be estimated in national inventories, and to help this task the Intergovernmental Panel on Climate Change (IPCC) has published three sets of guidelines. In addition, the Kyoto Protocol established the Clean Development Mechanism (CDM) to assist developed countries in offsetting their own greenhouse gas emissions by assisting other countries to achieve sustainable development while reducing emissions. Based on methodologies provided by the IPCC regarding SWDS, the CDM Executive Board has issued a tool to be used by project developers for estimating baseline methane emissions in their project activities, whether burning biogas from landfills or preventing biomass from being landfilled and so avoiding methane emissions. Some inconsistencies in the first two sets of IPCC guidelines have already been pointed out in an annex of the latest IPCC edition, although with little detail. The CDM tool uses a model for methane estimation that takes on board parameters, factors and assumptions provided in the latest IPCC guidelines, while its core equation is that of the second IPCC edition, inheriting its shortcoming and allowing a misunderstanding of the time variable. The consequences of wrong ex-ante estimation of baseline emissions in CDM project activities can be economic or environmental. An example of the first type is the 18% overestimation in an actual landfill biogas project in Brazil, which harms its developers; of the second type, the 35% overestimation in a project preventing municipal solid waste from being landfilled in China, which harms the environment, not through the project per se but through the unduly generated carbon credits. In a simulated landfill - the same
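    The IPCC/CDM estimates discussed above build on a first-order decay (FOD) model of methane generation; the sketch below shows that general form with placeholder parameter values, not the specific equations or default factors of either the IPCC guidelines or the CDM tool.

```python
# Simplified first-order decay (FOD) sketch of methane generation from
# annual waste deposits. Parameter values are illustrative placeholders.
import math

k = 0.09    # decay rate constant (1/yr), waste-type dependent (assumed)
L0 = 100.0  # methane generation potential (m3 CH4 per tonne waste, assumed)

def ch4_generated(deposits, year):
    """Methane generated in `year` from annual `deposits` {year: tonnes}."""
    return sum(k * L0 * w * math.exp(-k * (year - x))
               for x, w in deposits.items() if x <= year)

deposits = {2000 + i: 50_000.0 for i in range(10)}   # tonnes/yr, 2000-2009
print(f"CH4 generated in 2010: {ch4_generated(deposits, 2010):,.0f} m3")
```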

  10. U-AVLIS feed conversion using continuous metallothermic reduction of UF{sub 4}: System description and cost estimate

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-01

    The purpose of this document is to present a system description and develop baseline capital and operating cost estimates for commercial facilities which produce U-Fe feedstock for AVLIS enrichment plants using the continuous fluoride reduction (CFR) process. These costs can then be used, together with appropriate economic assumptions, to calculate estimated unit costs to the AVLIS plant owner (or utility customer) for such conversion services. Six cases are examined. All cases assume that the conversion services are performed by a private company at a commercial site which has an existing NRC license to possess source material and existing uranium processing operations. The cases differ in terms of annual production capacity and whether the new process system is installed in a new building or in an existing building on the site. The six cases are summarized here.

  11. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    Science.gov (United States)

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental Systems Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and to provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model: errors in the model input data and in the coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
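    For readers unfamiliar with the core idea, the following is a conceptual sketch (not REPTool itself) of Latin Hypercube Sampling used to propagate input error through a trivial raster model; the raster, error magnitudes, and model form are all assumptions.

```python
# Conceptual LHS error-propagation sketch for a raster model y = a*X + b.
import numpy as np
from scipy.stats import qmc, norm

rng = np.random.default_rng(1)
raster = rng.uniform(10, 20, size=(50, 50))     # stand-in input raster

n_samples = 200
sampler = qmc.LatinHypercube(d=2, seed=1)
u = sampler.random(n_samples)                   # uniform LHS samples in [0, 1)

# Spatially invariant raster error ~ N(0, 1); coefficient a ~ N(2.0, 0.1).
raster_err = norm.ppf(u[:, 0], loc=0.0, scale=1.0)
a = norm.ppf(u[:, 1], loc=2.0, scale=0.1)
b = 5.0

# One model realization per LHS sample; per-cell output uncertainty.
outputs = np.stack([a_i * (raster + e_i) + b for a_i, e_i in zip(a, raster_err)])
print("mean per-cell std of model output:", outputs.std(axis=0).mean())
```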

  12. Precipitation areal-reduction factor estimation using an annual-maxima centered approach

    Science.gov (United States)

    Asquith, W.H.; Famiglietti, J.S.

    2000-01-01

    The adjustment of precipitation depth of a point storm to an effective (mean) depth over a watershed is important for characterizing rainfall-runoff relations and for cost-effective designs of hydraulic structures when design storms are considered. A design storm is the precipitation point depth having a specified duration and frequency (recurrence interval). Effective depths are often computed by multiplying point depths by areal-reduction factors (ARF). ARF range from 0 to 1, vary according to storm characteristics such as recurrence interval, and are a function of watershed characteristics such as watershed size, shape, and geographic location. This paper presents a new approach for estimating ARF and includes applications for the 1-day design storm in Austin, Dallas, and Houston, Texas. The approach, termed 'annual-maxima centered', specifically considers the distribution of concurrent precipitation surrounding an annual-precipitation maximum, which is a feature not seen in other approaches. The approach does not require the prior spatial averaging of precipitation, explicit determination of spatial correlation coefficients, nor explicit definition of a representative area of a particular storm in the analysis. The annual-maxima centered approach was designed to exploit the wide availability of dense precipitation gauge data in many regions of the world. The approach produces ARF that decrease more rapidly than those from TP-29. Furthermore, the ARF from the approach decay rapidly with increasing recurrence interval of the annual-precipitation maxima. (C) 2000 Elsevier Science B.V.

  13. Los Alamos Waste Management Cost Estimation Model

    International Nuclear Information System (INIS)

    Matysiak, L.M.; Burns, M.L.

    1994-03-01

    This final report completes the Los Alamos Waste Management Cost Estimation Project and includes documentation of the waste management processes at Los Alamos National Laboratory (LANL) for hazardous, mixed, low-level radioactive solid and transuranic waste, the development of the cost estimation model, and a user reference manual. The ultimate goal of this effort was to develop an estimate of the life cycle costs for the aforementioned waste types. The Cost Estimation Model is a tool that can be used to calculate the costs of waste management at LANL for the aforementioned waste types under several different scenarios. Each waste category at LANL is managed in a separate fashion, according to Department of Energy requirements and state and federal regulations. The cost of the waste management process for each waste category has not previously been well documented. In particular, the costs associated with the handling, treatment and storage of the waste have not been well understood. It is anticipated that greater knowledge of these costs will encourage waste generators at the Laboratory to apply waste minimization techniques to current operations. Expected benefits of waste minimization are a reduction in waste volume, a decrease in liability and lower waste management costs.

  14. Active3 noise reduction

    International Nuclear Information System (INIS)

    Holzfuss, J.

    1996-01-01

    Noise reduction is a problem encountered in a variety of applications, such as environmental noise cancellation and signal recovery and separation. Passive noise reduction is done with the help of absorbers. Active noise reduction includes the transmission of phase-inverted signals for the cancellation. This paper is about a threefold active approach to noise reduction. It includes the separation of a combined source, which consists of both a noise and a signal part. By interacting with the source, scanning it and recording its response, a model of it as a nonlinear dynamical system is obtained. The analysis includes phase space analysis and global radial basis functions as tools for the prediction used in a subsequent cancellation procedure. Examples are given which include noise reduction of speech. copyright 1996 American Institute of Physics

  15. GumTree: Data reduction

    Energy Technology Data Exchange (ETDEWEB)

    Rayner, Hugh [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]. E-mail: hrz@ansto.gov.au; Hathaway, Paul [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Hauser, Nick [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Fei, Yang [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Franceschini, Ferdi [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Lam, Tony [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    2006-11-15

    Access to software tools for interactive data reduction, visualisation and analysis during a neutron scattering experiment enables instrument users to make informed decisions regarding the direction and success of their experiment. ANSTO aims to enhance the experiment experience of its facility's users by integrating these data reduction tools with the instrument control interface for immediate feedback. GumTree is a software framework and application designed to support an Integrated Scientific Experimental Environment, for concurrent access to instrument control, data acquisition, visualisation and analysis software. The Data Reduction and Analysis (DRA) module is a component of the GumTree framework that allows users to perform data reduction, correction and basic analysis within GumTree while an experiment is running. It is highly integrated with GumTree, able to pull experiment data and metadata directly from the instrument control and data acquisition components. The DRA itself uses components common to all instruments at the facility, providing a consistent interface. It features familiar ISAW-based 1D and 2D plotting, an OpenGL-based 3D plotter and peak fitting performed by fityk. This paper covers the benefits of integration, the flexibility of the DRA module, ease of use for the interface and audit trail generation.

  16. GumTree: Data reduction

    International Nuclear Information System (INIS)

    Rayner, Hugh; Hathaway, Paul; Hauser, Nick; Fei, Yang; Franceschini, Ferdi; Lam, Tony

    2006-01-01

    Access to software tools for interactive data reduction, visualisation and analysis during a neutron scattering experiment enables instrument users to make informed decisions regarding the direction and success of their experiment. ANSTO aims to enhance the experiment experience of its facility's users by integrating these data reduction tools with the instrument control interface for immediate feedback. GumTree is a software framework and application designed to support an Integrated Scientific Experimental Environment, for concurrent access to instrument control, data acquisition, visualisation and analysis software. The Data Reduction and Analysis (DRA) module is a component of the GumTree framework that allows users to perform data reduction, correction and basic analysis within GumTree while an experiment is running. It is highly integrated with GumTree, able to pull experiment data and metadata directly from the instrument control and data acquisition components. The DRA itself uses components common to all instruments at the facility, providing a consistent interface. It features familiar ISAW-based 1D and 2D plotting, an OpenGL-based 3D plotter and peak fitting performed by fityk. This paper covers the benefits of integration, the flexibility of the DRA module, ease of use for the interface and audit trail generation

  17. Is The Ca + K + Mg/Al Ratio in the Soil Solution a Predictive Tool for Estimating Forest Damage?

    International Nuclear Information System (INIS)

    Goeransson, A.; Eldhuset, T. D.

    2001-01-01

    The ratio between (Ca + K + Mg) and Al in nutrient solution has been suggested as a predictive tool for estimating tree growth disturbance. However, the ratio is unspecific in the sense that it is based on several elements which are all essential for plant growth; each of these may be growth-limiting. Furthermore, aluminium retards growth at higher concentrations. It is therefore difficult to give causal and objective biological explanations for possible growth disturbances. The importance of the proportion of base cations to N, at a fixed base-cation/Al ratio, is evaluated with regard to growth of Picea abies. The uptake of elements was found to be selective; nutrients were taken up while most Al remained in solution. Biomass partitioning to the roots increased after aluminium addition with low proportions of base cations to nitrogen. We conclude that the low growth rates depend on nutrient limitation in these treatments. Low growth rates in the high-proportion experiments may be explained by high internal Al concentrations. The results strongly suggest that growth rate is not correlated with the ratio in the rooting medium and question the validity of using ratios as predictive tools for estimating forest damage. We suggest that growth limitation of Picea abies in the field may depend on low proportions of base cations to nitrate. It is therefore important to know the nutritional status of the plant material in relation to the growth potential and environmental limitation to be able to predict and estimate forest damage.

  18. Control Strategy Tool (CoST)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios, their effects on emissions and estimated costs. This tool...

  19. Observation-based estimation of aerosol-induced reduction of planetary boundary layer height

    Science.gov (United States)

    Zou, Jun; Sun, Jianning; Ding, Aijun; Wang, Minghuai; Guo, Weidong; Fu, Congbin

    2017-09-01

    Radiative aerosols are known to influence the surface energy budget and hence the evolution of the planetary boundary layer. In this study, we develop a method to estimate the aerosol-induced reduction in the planetary boundary layer height (PBLH) based on two years of ground-based measurements at the Station for Observing Regional Processes of the Earth System (SORPES) at Nanjing University, China, and radiosonde data from the meteorological station of Nanjing. The observations show that increased aerosol loads lead to a mean decrease of 67.1 W m-2 in downward shortwave radiation (DSR) and a mean increase of 19.2 W m-2 in downward longwave radiation (DLR), as well as a mean decrease of 9.6 W m-2 in the surface sensible heat flux (SHF) in the daytime. The relative variations of DSR, DLR and SHF are shown as a function of the increment of column mass concentration of particulate matter (PM2.5). High aerosol loading can significantly increase the atmospheric stability in the planetary boundary layer during both daytime and nighttime. Based on the statistical relationship between SHF and PM2.5 column mass concentrations, the SHF under clean atmospheric conditions (the same as on background days) is derived. The derived SHF, together with the observed SHF, is then used to estimate aerosol-related changes in the PBLH. Our results suggest that the PBLH decreases more rapidly with increasing aerosol load when the loading is already high. When the daytime mean column mass concentration of PM2.5 reaches 200 mg m-2, the decrease in the PBLH at 1600 LST (local standard time) is about 450 m.

  20. A Visualization Tool to Analyse Usage of Web-Based Interventions: The Example of Positive Online Weight Reduction (POWeR)

    Science.gov (United States)

    Smith, Emily; Bradbury, Katherine; Morrison, Leanne; Dennison, Laura; Michaelides, Danius; Yardley, Lucy

    2015-01-01

    Background Attrition is a significant problem in Web-based interventions. Consequently, this research aims to identify the relation between Web usage and benefit from such interventions. A visualization tool has been developed that enables researchers to more easily examine large datasets on intervention usage that can be difficult to make sense of using traditional descriptive or statistical techniques alone. Objective This paper demonstrates how the visualization tool was used to explore patterns in participants’ use of a Web-based weight management intervention, termed "positive online weight reduction (POWeR)." We also demonstrate how the visualization tool can be used to perform subsequent statistical analyses of the association between usage patterns, participant characteristics, and intervention outcome. Methods The visualization tool was used to analyze data from 132 participants who had accessed at least one session of the POWeR intervention. Results There was a drop in usage of optional sessions after participants had accessed the initial, core POWeR sessions, but many users nevertheless continued to complete goal and weight reviews. The POWeR tools relating to the food diary and steps diary were reused most often. Differences in participant characteristics and usage of other intervention components were identified between participants who did and did not choose to access optional POWeR sessions (in addition to the initial core sessions) or reuse the food and steps diaries. Reuse of the steps diary and the getting support tools was associated with greater weight loss. Conclusions The visualization tool provided a quick and efficient method for exploring patterns of Web usage, which enabled further analyses of whether different usage patterns were associated with participant characteristics or differences in intervention outcome. Further usage of visualization techniques is recommended to (1) make sense of large datasets more quickly and efficiently; (2

  1. An Accurate Computational Tool for Performance Estimation of FSO Communication Links over Weak to Strong Atmospheric Turbulent Channels

    Directory of Open Access Journals (Sweden)

    Theodore D. Katsilieris

    2017-03-01

    Full Text Available Terrestrial optical wireless communication links have attracted significant research and commercial interest worldwide over the last few years because they offer very high and secure data rate transmission with relatively low installation and operational costs, and without the need for licensing. However, since the propagation path of the information signal, i.e. the laser beam, is the atmosphere, their performance depends strongly on the atmospheric conditions in the specific area. Thus, system performance is significantly affected by rain, fog, hail, atmospheric turbulence, etc. Due to the influence of these effects, such a communication system must be studied very carefully, theoretically and numerically, before its installation. In this work, we present exact and accurately approximated mathematical expressions for the estimation of the average capacity and outage probability performance metrics, as functions of the link parameters, the transmitted power, the attenuation due to fog, the ambient noise and the atmospheric turbulence phenomenon. The latter causes the scintillation effect, which results in random and fast fluctuations of the irradiance at the receiver's end. These fluctuations can be studied accurately with statistical methods. Thus, in this work, we use either the lognormal or the gamma-gamma distribution for weak or moderate to strong turbulence conditions, respectively. Moreover, using the derived mathematical expressions, we design, implement and present a computational tool for the estimation of these systems' performance, taking into account the link parameters and the atmospheric conditions. Furthermore, in order to increase the accuracy of the presented tool, for the cases where the obtained analytical mathematical expressions are complex, the performance results are verified with the numerical estimation of the appropriate integrals. Finally, using
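    As a small numerical companion to the abstract, the sketch below evaluates one of the quoted metrics, the outage probability, under the lognormal (weak-turbulence) model with a normalized mean irradiance; the threshold and log-variance values are illustrative assumptions, not the paper's parametrization.

```python
# Hedged sketch: outage probability under lognormal scintillation,
# assuming normalized irradiance E[I] = 1 so that ln(I) ~ N(-s2/2, s2).
import math
from statistics import NormalDist

def outage_probability(sigma2_log_irradiance, irradiance_threshold):
    """P(I < threshold) for lognormal irradiance with log-variance sigma2."""
    mu = -sigma2_log_irradiance / 2.0            # keeps E[I] = 1
    z = (math.log(irradiance_threshold) - mu) / math.sqrt(sigma2_log_irradiance)
    return NormalDist().cdf(z)

for s2 in (0.1, 0.3, 0.5):                       # assumed weak-turbulence log variances
    print(f"sigma_l^2 = {s2}: P_out = {outage_probability(s2, 0.5):.4f}")
```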

  2. Duration of fuels reduction following prescribed fire in coniferous forests of U.S. national parks in California and the Colorado Plateau

    Science.gov (United States)

    van Mantgem, Phillip J.; Lalemand, Laura; Keifer, MaryBeth; Kane, Jeffrey M.

    2016-01-01

    Prescribed fire is a widely used forest management tool, yet the long-term effectiveness of prescribed fire in reducing fuels and fire hazards in many vegetation types is not well documented. We assessed the magnitude and duration of reductions in surface fuels and modeled fire hazards in coniferous forests across nine U.S. national parks in California and the Colorado Plateau. We used observations from a prescribed fire effects monitoring program that feature standard forest and surface fuels inventories conducted pre-fire, immediately following an initial (first-entry) prescribed fire and at varying intervals up to >20 years post-fire. A subset of these plots was subjected to prescribed fire again (second-entry) with continued monitoring. Prescribed fire effects were highly variable among plots, but we found on average first-entry fires resulted in a significant post-fire reduction in surface fuels, with litter and duff fuels not returning to pre-fire levels over the length of our observations. Fine and coarse woody fuels often took a decade or longer to return to pre-fire levels. For second-entry fires we found continued fuels reductions, without strong evidence of fuel loads returning to levels observed immediately prior to second-entry fire. Following both first- and second-entry fire there were increases in estimated canopy base heights, along with reductions in estimated canopy bulk density and modeled flame lengths. We did not find evidence of return to pre-fire conditions during our observation intervals for these measures of fire hazard. Our results show that prescribed fire can be a valuable tool to reduce fire hazards and, depending on forest conditions and the measurement used, reductions in fire hazard can last for decades. Second-entry prescribed fire appeared to reinforce the reduction in fuels and fire hazard from first-entry fires.

  3. Basis of Estimate Software Tool (BEST) - a practical solution to part of the cost and schedule integration puzzle

    International Nuclear Information System (INIS)

    Murphy, L.; Bain, P.

    1997-01-01

    The Basis of Estimate Software Tool (BEST) was developed at the Rocky Flats Environmental Technology Site (Rocky Flats) to bridge the gap that exists in conventional project control systems between scheduled activities, their allocated or assigned resources, and the set of assumptions (basis of estimate) that correlate resources and activities. Having a documented and auditable basis of estimate (BOE) is necessary for budget validation, work scope analysis, change control, and a number of related management control functions. The uniqueness of BEST is demonstrated by the manner in which it responds to the diverse needs of the heavily regulated environmental workplace - containing many features not found in conventional off-the-shelf software products. However, even companies dealing in relatively unregulated work places will find many attractive features in BEST. This product will be of particular interest to current Government contractors and contractors preparing proposals that may require subsequent validation. 2 figs

  4. A model reduction approach for the variational estimation of vascular compliance by solving an inverse fluid–structure interaction problem

    International Nuclear Information System (INIS)

    Bertagna, Luca; Veneziani, Alessandro

    2014-01-01

    Scientific computing has progressively become an important tool for research in cardiovascular diseases. The role of quantitative analyses based on numerical simulations has moved from 'proofs of concept' to patient-specific investigations, thanks to a strong integration between imaging and computational tools. However, beyond individual geometries, numerical models require the knowledge of parameters that can hardly be retrieved from measurements, especially in vivo. For this reason, cardiovascular mathematics has recently considered data assimilation procedures for extracting patient-specific parameters from measures and images. In this paper, we consider specifically the quantification of vascular compliance, i.e. the parameter quantifying the tendency of arterial walls to deform under blood stress. Following up on a previous paper, where a variational data assimilation procedure based on solving an inverse fluid-structure interaction problem was proposed, here we consider model reduction techniques based on a proper orthogonal decomposition approach to accomplish the solution of the inverse problem in a computationally efficient way. (paper)
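    A minimal sketch of the proper orthogonal decomposition (POD) step underlying this kind of model reduction, using synthetic snapshot data: the reduced basis is taken from the leading left singular vectors of a snapshot matrix, retaining a prescribed fraction of the energy.

```python
# POD sketch: build a reduced basis from the SVD of a snapshot matrix.
# Snapshots here are synthetic and deliberately low-rank.
import numpy as np

rng = np.random.default_rng(2)
n_dof, n_snapshots = 2000, 40
modes = rng.normal(size=(n_dof, 5))                   # hidden 5-dim structure
snapshots = modes @ rng.normal(size=(5, n_snapshots))

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999)) + 1          # keep 99.99% of energy
basis = U[:, :r]                                      # reduced basis, n_dof x r
print(f"reduced dimension: {r} (from {n_dof} degrees of freedom)")
```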

  5. ARA and ARI imperfect repair models: Estimation, goodness-of-fit and reliability prediction

    International Nuclear Information System (INIS)

    Toledo, Maria Luíza Guerra de; Freitas, Marta A.; Colosimo, Enrico A.; Gilardoni, Gustavo L.

    2015-01-01

    An appropriate maintenance policy is essential to reduce the expenses and risks related to equipment failures. A fundamental aspect to be considered when specifying such policies is being able to predict the reliability of the systems under study, based on a well-fitted model. In this paper, the Arithmetic Reduction of Age and Arithmetic Reduction of Intensity classes of models are explored. Likelihood functions for such models are derived, and a graphical method is proposed for model selection. A real data set involving failures in trucks used by a Brazilian mining company is analyzed, considering models with different memories. The parameters, namely the shape and scale of the Power Law Process and the repair efficiency, were estimated for the best-fitting model. Estimation of the model parameters allowed us to derive reliability estimators to predict the behavior of the failure process. These results are valuable information for the mining company and can be used to support decision making regarding its preventive maintenance policy. - Highlights: • Likelihood functions for imperfect repair models are derived. • A goodness-of-fit technique is proposed as a tool for model selection. • Failures in trucks owned by a Brazilian mining company are modeled. • Estimation allowed deriving reliability predictors to forecast the future failure process of the trucks.

  6. Reducing catheter-related thrombosis using a risk reduction tool centered on catheter to vessel ratio.

    Science.gov (United States)

    Spencer, Timothy R; Mahoney, Keegan J

    2017-11-01

    In vascular access practice, the internal vessel size is considered important, and a catheter to vessel ratio (CVR) is recommended to assist clinicians in selecting the most appropriately sized device for the vessel. In 2016, new practice recommendations stated that the CVR can increase from 33 to 45% of the vessel's diameter. Recent literature has presented evidence linking larger diameter catheters to increased thrombosis risk, while insufficient information has been established on what relationship to vessel size is appropriate for any intravascular device. Earlier clinical standards and guidelines did not clearly address vessel size in relation to the area consumed or the external catheter diameter. The aim of this manuscript is to present the evidence on catheter-related thrombosis and to develop a standardized process of ultrasound-guided vessel assessment, integrating the CVR, Virchow's triad and vessel health and preservation strategies, enabling an evidence-based approach to device placement. Through review, calculation and assessment of the areas implied by the 33% and 45% rules, a preliminary clinical tool was developed to assist clinicians in making informed decisions when placing intravascular devices relative to target vessel size, with a focus on potential reduction of catheter-related thrombosis. Increasing the understanding and utilization of CVRs will lead to a safer, more consistent approach to device placement, with potential thrombosis reduction strategies. The future of evidence-based data relies on the clinician capturing accurate vessel measurements and device-related outcomes. This will lead to a more dependable data pool, driving the relationship of catheter-related thrombosis and vascular assessment.
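    To make the 33%/45% discussion concrete, here is a small hypothetical calculation of the diameter-based CVR and the corresponding share of vessel cross-sectional area consumed; the catheter and vessel dimensions are invented, and this sketch is not a clinical tool.

```python
# Hypothetical CVR illustration; example dimensions, not clinical guidance.
def cvr_percent(catheter_od_mm, vessel_diam_mm):
    """Diameter-based catheter to vessel ratio, as in the 33%/45% rules."""
    return 100.0 * catheter_od_mm / vessel_diam_mm

def area_consumed_percent(catheter_od_mm, vessel_diam_mm):
    """Share of vessel cross-sectional area occupied by the catheter."""
    return 100.0 * (catheter_od_mm / vessel_diam_mm) ** 2

catheter, vessel = 1.65, 3.5   # mm; assumed values
cvr = cvr_percent(catheter, vessel)
area = area_consumed_percent(catheter, vessel)
print(f"CVR = {cvr:.0f}% of vessel diameter "
      f"({'within' if cvr <= 45 else 'exceeds'} the 45% rule)")
print(f"area consumed = {area:.0f}% of vessel cross-section")
```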

  7. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools.

    Science.gov (United States)

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei; Jolliet, Olivier

    2017-11-01

    Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioritization, which require rapid computation of accurate estimates for diverse scenarios. To fulfil this need, we develop an accurate and rapid (high-throughput) model that estimates the fraction of organic chemicals migrating from polymeric packaging materials into foods. Several hundred step-wise simulations optimised the model coefficients to cover a range of user-defined scenarios (e.g. temperature). The developed model, operationalised in a spreadsheet for future dissemination, nearly instantaneously estimates chemical migration, and has improved performance over commonly used model simplifications. When using measured diffusion coefficients, the model accurately predicted (R² = 0.9, standard error (Se) = 0.5) hundreds of empirical data points for various scenarios. Diffusion coefficient modelling, which determines the speed of chemical transfer from package to food, was a major contributor to uncertainty and dramatically decreased model performance (R² = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Development of a health effects based priority ranking system for air emissions reductions from oil refineries in Canada

    International Nuclear Information System (INIS)

    McColl, S.; Gower, S.; Hicks, J.; Shortreed, J.; Craig, L.

    2004-01-01

    This paper presents the concept and methodologies behind the development of a health effects priority ranking tool for the reduction of air emissions from oil refineries. The Health Effects Indicators Decision Index, Version 2 (Heidi II) was designed to assist policy makers in prioritizing air emission reductions on the basis of estimated risk to human health. Inputs include facility-level rankings of the potential health impacts associated with carcinogenic air toxics, non-carcinogenic air toxics and criteria air contaminants for each of the 20 refineries in Canada. Rankings of estimated health impacts are based on the predicted incidence of health effects. Heidi II considers site-specific annual pollutant emission data, the ambient air concentrations associated with releases, and concentration-response functions for various types of health effects. Additional data include location-specific background air concentrations, site-specific population densities, and the baseline incidence of different health effect endpoints, such as cancer, non-cancer illnesses, and cardiorespiratory illnesses and death. Air pollutants include the 29 air toxics reported annually in Environment Canada's National Pollutant Release Inventory. Three health impact ranking outputs are provided for each facility: ranking of pollutants based on the predicted number of annual cases of health effects; ranking of pollutants based on simplified Disability Adjusted Life Years (DALYs); and ranking of pollutants based on more complex DALYs that consider the type of cancer, systemic disease or cardiopulmonary health effect. The rankings rely on rough statistical estimates of predicted incidence rates for health endpoints. The models used to calculate the rankings can provide useful guidance by comparing estimated health impacts. Heidi II has demonstrated that it is possible to develop a consistent and objective approach for ranking priority reductions of air emissions. Heidi II requires numerous types and

  9. Technical Note: On the efficiency of variance reduction techniques for Monte Carlo estimates of imaging noise.

    Science.gov (United States)

    Sharma, Diksha; Sempau, Josep; Badano, Aldo

    2018-02-01

    Monte Carlo simulations require a large number of histories to obtain reliable estimates of the quantity of interest and its associated statistical uncertainty. Numerous variance reduction techniques (VRTs) have been employed to increase computational efficiency by reducing the statistical uncertainty. We investigate the effect of two VRTs for optical transport methods on accuracy and computing time for the estimation of variance (noise) in x-ray imaging detectors. We describe two VRTs. In the first, we preferentially alter the direction of the optical photons to increase the detection probability. In the second, we follow only a fraction of the total optical photons generated. In both techniques, the statistical weight of the photons is altered to maintain the signal mean. We use fastdetect2, an open-source, freely available optical transport routine from the hybridmantis package. We simulate the VRTs for a variety of detector models and energy sources. The imaging data from the VRT simulations are then compared to the analog case (no VRT) using pulse height spectra, the Swank factor, and the variance of the Swank estimate. We analyze the effect of the VRTs on the statistical uncertainty associated with Swank factors. The VRTs increased the relative efficiency by as much as a factor of 9. We demonstrate that we can achieve the same variance of the Swank factor with less computing time. With this approach, the simulations can be stopped when the variance of the variance estimates reaches the desired level of uncertainty. We implemented analytic estimates of the variance of the Swank factor and demonstrated the effect of the VRTs on image quality calculations. Our findings indicate that the Swank factor is dominated by the x-ray interaction profile as compared to the additional uncertainty introduced into the optical transport by the use of VRTs. For simulation experiments that aim at reducing the uncertainty in the Swank factor estimate, any of the proposed VRTs can be used for increasing the relative
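    Below is a toy sketch of the second VRT described above: following only a fraction of the generated optical photons while scaling their statistical weight so the signal mean is preserved. Detector physics is collapsed into a single Bernoulli detection probability, and all numbers are invented; this is not the fastdetect2/hybridmantis code.

```python
# Toy weight-preserving VRT: transport fewer photons, upweight survivors.
import numpy as np

rng = np.random.default_rng(3)
n_photons, p_detect, follow_fraction = 100_000, 0.3, 0.2

# Analog: transport every photon with weight 1.
analog = (rng.random(n_photons) < p_detect).astype(float)

# VRT: transport ~20% of photons, each carrying weight 1 / 0.2 = 5,
# so the expected detected signal per source photon is unchanged.
followed = rng.random(n_photons) < follow_fraction
n_followed = int(followed.sum())
vrt = (rng.random(n_followed) < p_detect) / follow_fraction

print(f"analog mean {analog.mean():.4f} | VRT mean {vrt.sum() / n_photons:.4f}")
print(f"photons transported: {n_photons} (analog) vs {n_followed} (VRT)")
```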

  10. POETICS OF TRANSCENDENCE: STYLISTIC REDUCTION AS A TOOL FOR REPRESENTATION OF SACRED MEANINGS

    Directory of Open Access Journals (Sweden)

    Elena Brazgovskaya

    2016-10-01

    Full Text Available The main direction of the work concerns the representation of abstract (transcendent) objects in music and literature. The article analyses "Cantus in Memoriam Benjamin Britten" by Arvo Pärt and some poems of Czesław Miłosz. The metaphysical dimension of reality involves forms and things existing beyond the boundaries of empirical perception and, at first sight, beyond descriptive practices. Abstract objects are available in intellectual experience, but culture must transform them into a symbolic form. As a rule, this is connected to the practice of artistic minimalism. The essence of minimalism is the reduction of the number of stylistic tools and the "purification" of perception from visual and auditory images (a non-mimetic use of language). For the representation of the sacred, Pärt uses only the mensural canon form, scale and chord. These "characters" are deprived of a descriptive function but have symbolic potential (the canon as a sign of stopped time, the eternal return). The distinctive feature of Miłosz's style is the pursuit of "clean" signs (indexical and symbolic). There is a reverse side to this language distillation: the rejection of the subjective position, of emotional experience, and of the distance between the person and the object of representation.

  11. BombCAD - A new tool for bomb defense in nuclear facilities

    International Nuclear Information System (INIS)

    Massa, D.J.; Howard, J.W.; Sturm, S.R.

    1988-01-01

    This paper describes a new tool for analysis of the specific vulnerability of diverse facilities to bomb attack and for computer-aided design (CAD) of the siting, screening and hardening/softening aspects of comprehensive bomb defense programs. BombCAD combines the extensive architectural and engineering data base and graphics capabilities of modern architectural CAD systems with the bomb-effects computational capability of the ''SECUREPLAN'' BOMB UTILITY. BombCAD permits architects/engineers, security professionals and facility managers to analytically estimate and graphically display facility vulnerability and the changes (reductions) in vulnerability that result from the adoption of various bomb defense measures.

  12. Climate Action Planning Tool | NREL

    Science.gov (United States)

    NREL's Climate Action Planning Tool provides a quick, basic estimate of how various technology options can contribute to an overall climate action plan for your research campus.

  13. Cost Estimating Cases: Educational Tools for Cost Analysts

    Science.gov (United States)

    1993-09-01

    only appropriate documentation should be provided. In other words, students should not submit all of the documentation possible using ACEIT, only that... case was their lack of understanding of the ACEIT software used to conduct the estimate. Specifically, many students misinterpreted the cost... estimating relationships (CERs) embedded in the software. Additionally, few of the students were able to properly organize the ACEIT documentation output

  14. Approximate zero-variance Monte Carlo estimation of Markovian unreliability

    International Nuclear Information System (INIS)

    Delcoux, J.L.; Labeau, P.E.; Devooght, J.

    1997-01-01

    Monte Carlo simulation has become an important tool for the estimation of reliability characteristics, since conventional numerical methods are no longer efficient when the size of the system to be solved increases. However, evaluating by simulation the probability of occurrence of very rare events means playing a very large number of histories of the system, which leads to unacceptable computation times. Acceleration and variance reduction techniques therefore have to be worked out. We show in this paper how to write the equations of Markovian reliability as a transport problem, and how the well-known zero-variance scheme can be adapted to this application. However, such a method is always specific to the estimation of one quantity, while a Monte Carlo simulation allows several quantities to be estimated simultaneously. The estimation of one of them could therefore be made more accurate while degrading the variance of the other estimations. We propose here a method to reduce the variance for several quantities simultaneously, by using probability laws that would lead to zero variance in the estimation of a mean of these quantities. Just like the zero-variance scheme, the method we propose cannot be applied exactly. However, we show that simple approximations of it may be very efficient. (author)
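    The sketch below illustrates the general variance-reduction idea with plain importance sampling on a rare-event probability; a true zero-variance scheme would need the exact optimal sampling law, so this is only a loosely related, assumed example, not the authors' method.

```python
# Importance-sampling sketch: bias the sampling law toward the rare
# event and compensate with likelihood-ratio weights. Example: P(X > 4)
# for X ~ N(0, 1), where naive sampling almost never scores a hit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
a, n = 4.0, 100_000

naive = (rng.normal(size=n) > a).mean()            # usually 0 hits

x = rng.normal(loc=a, size=n)                      # biased law N(a, 1)
w = stats.norm.pdf(x) / stats.norm.pdf(x, loc=a)   # likelihood-ratio weights
importance = np.mean((x > a) * w)

print(f"naive estimate: {naive:.2e}")
print(f"IS estimate:    {importance:.2e} (exact {stats.norm.sf(a):.2e})")
```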

  15. Cost estimation tools in Germany and the UK. Comparison of cost estimates and actual costs

    International Nuclear Information System (INIS)

    Pfeifer, W.; Gordelier, S.; Drake, V.

    2005-01-01

    Full text: Accurate cost estimation for future decommissioning projects is a matter of considerable importance, especially for ensuring that sufficient funds will be available at the time of project implementation. This paper looks at the experience of cost estimation and real implementation outcomes from two countries, Germany and the UK, and draws lessons for the future. In Germany, cost estimates for the decommissioning of power reactors are updated every two years. For this purpose, the STILLKO program of the NIS Company is used. So far, Forschungszentrum Karlsruhe has successfully decommissioned two prototype reactor facilities. Re-cultivation of the premises has already been completed. At the moment, the activated components of the multi-purpose research reactor (MZFR), the first pressurized water reactor in Germany that was moderated and cooled with heavy water, and of the prototype fast breeder reactor (KNK) are being dismantled remotely. Consequently, vast experience exists in particular for the updating of total costs on the basis of actually incurred expenses. The further the dismantling work proceeds, the more reliable is the total cost estimate. Here, the development of the estimated MZFR decommissioning costs shall be presented and compared with the estimates obtained for a German reference PWR-type power reactor of 1200 MW. In this way: - common features of the prototype reactor and power reactor shall be emphasized, - several parameters leading to an increase in the estimated costs shall be highlighted, - cost risks shall be outlined with the remote dismantling of the reactor pressure vessel serving as an example, - calculation parameters shall be presented, and - recommendations shall be made for a consistent estimation of costs. The United Kingdom Atomic Energy Authority (UKAEA) has a major programme for the environmental remediation of its former research and development sites at Dounreay, Windscale, Harwell and Winfrith together with the need to

  16. Field scale modeling to estimate phosphorus and sediment load reductions using a newly developed graphical user interface for soil and water assessment tool

    Science.gov (United States)

    Streams throughout the North Canadian River watershed in northwest Oklahoma, USA have elevated levels of nutrients and sediment. SWAT (Soil and Water Assessment Tool) was used to identify areas that likely contributed disproportionate amounts of phosphorus (P) and sediment to Lake Overholser, the re...

  17. MURMoT: Design and Application of Microbial Uranium Reduction Monitoring Tools

    Energy Technology Data Exchange (ETDEWEB)

    Pennell, Kurt [Tufts Univ., Medford, MA (United States)

    2014-12-31

    The overarching project goal of the MURMoT project was the design of tools to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-transforming bacteria. To accomplish these objectives, an integrated approach that combined nucleic acid-based tools, proteomic workflows, uranium isotope measurements, and U(IV) speciation and structure analyses using the Advanced Photon Source (APS) at Argonne National Laboratory was developed.

  18. MURMoT: Design and Application of Microbial Uranium Reduction Monitoring Tools

    International Nuclear Information System (INIS)

    Pennell, Kurt

    2014-01-01

    The overarching project goal of the MURMoT project was the design of tools to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-transforming bacteria. To accomplish these objectives, an integrated approach that combined nucleic acid-based tools, proteomic workflows, uranium isotope measurements, and U(IV) speciation and structure analyses using the Advanced Photon Source (APS) at Argonne National Laboratory was developed.

  19. A software tool to estimate the dynamic behaviour of the IP2C samples as sensors for didactic purposes

    International Nuclear Information System (INIS)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E.

    2010-01-01

    Ionic Polymer Polymer Composites (IP2Cs) are emerging materials used to realize motion actuators and sensors. In the former case a voltage input causes the membrane to bend, while in the latter case bending an IP2C membrane produces a voltage output. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP2Cs working in air. In the proposed tool, the geometrical quantities that rule the sensing properties of IP2C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to provide a useful tool that allows the user to understand the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP2C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers in this new class of transducers; moreover, it can support the educational activity of students involved in advanced academic courses.

  20. A technical review of urban land use - transportation models as tools for evaluating vehicle travel reduction strategies

    Energy Technology Data Exchange (ETDEWEB)

    Southworth, F.

    1995-07-01

    The continued growth of highway traffic in the United States has led to unwanted urban traffic congestion as well as to noticeable urban air quality problems. These problems include emissions covered by the 1990 Clean Air Act Amendments (CAAA) and 1991 Intermodal Surface Transportation Efficiency Act (ISTEA), as well as carbon dioxide and related "greenhouse gas" emissions. Urban travel also creates a major demand for imported oil. Therefore, for economic as well as environmental reasons, transportation planning agencies at both the state and metropolitan area level are focussing a good deal of attention on urban travel reduction policies. Much-discussed policy instruments include those that encourage fewer trip starts, shorter trip distances, shifts to higher-occupancy vehicles or to nonvehicular modes, and shifts in the timing of trips from the more to the less congested periods of the day or week. Some analysts have concluded that in order to bring about sustainable reductions in urban traffic volumes, significant changes will be necessary in the way our households and businesses engage in daily travel. Such changes are likely to involve changes in the ways we organize and use traffic-generating and -attracting land within our urban areas. The purpose of this review is to evaluate the ability of current analytic methods and models to support both the evaluation and possibly the design of such vehicle travel reduction strategies, including those strategies involving the reorganization and use of urban land. The review is organized into three sections. Section 1 describes the nature of the problem we are trying to model, Section 2 reviews the state of the art in operational urban land use-transportation simulation models, and Section 3 provides a critical assessment of such models as useful urban transportation planning tools. A number of areas are identified where further model development or testing is required.

  1. On Commitments and Other Uncertainty Reduction Tools in Joint Action

    Directory of Open Access Journals (Sweden)

    Michael John

    2015-01-01

    Full Text Available In this paper, we evaluate the proposal that a central function of commitments within joint action is to reduce various kinds of uncertainty, and that this accounts for the prevalence of commitments in joint action. While this idea is prima facie attractive, we argue that it faces two serious problems. First, commitments can only reduce uncertainty if they are credible, and accounting for the credibility of commitments proves not to be straightforward. Second, there are many other ways in which uncertainty is commonly reduced within joint actions, which raises the possibility that commitments may be superfluous. Nevertheless, we argue that the existence of these alternative uncertainty reduction processes does not make commitments superfluous after all but, rather, helps to explain how commitments may contribute in various ways to uncertainty reduction.

  2. Mathematical simulation for estimating reduction of breast cancer mortality in mass screening using mammography

    International Nuclear Information System (INIS)

    Iinuma, Takeshi; Matsumoto, Tohru; Tateno, Yukio

    1999-01-01

    In Japan it is considered that mammography should be introduced together with physical examination for the mass screening of breast cancer, instead of the physical examination alone that is performed at present. Before the introduction of mammography, a mathematical simulation should be performed to show the reduction in breast cancer mortality from mass screening compared with an unscreened population. A mathematical model of cancer screening devised by the authors was used to estimate the number of deaths due to breast cancer (A) in the screened group and (B) in the unscreened group within the same population. The relative risk (RR) and attributable risk (RD) were then calculated as A/B and B-A, respectively. Three methods of mass screening were compared: (1) physical examination (1-year interval), (2) mammography with physical examination (1-year interval), (3) mammography with physical examination (2-year interval). The calculated RR values were 0.85 for (1), 0.60 for (2) and 0.69 for (3). Assuming that the incidence of breast cancer was 100/10^5 person-years, the calculated RD values were 3.0, 8.1 and 6.2 persons/10^5 person-years for (1), (2) and (3), respectively. The 95% confidence intervals of RR for the three methods extended above 1.0, and thus the reduction of breast cancer mortality was not statistically significant in the present population. In conclusion, mammography with physical examination may reduce breast cancer mortality in comparison with physical examination alone, but a larger number of women must be screened in order to obtain a significant RR value. (author)
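    As a back-of-envelope check of the two summary measures defined above: from the reported RR = A/B = 0.60 and RD = B-A = 8.1 deaths per 10^5 person-years for mammography with physical examination at a 1-year interval, the implied screened and unscreened mortality rates can be recovered.

```python
# Recover the mortality rates implied by the reported RR and RD pair
# (mammography + physical examination, 1-year interval).
rr, rd = 0.60, 8.1
b = rd / (1.0 - rr)        # unscreened deaths: B = RD / (1 - RR)
a = rr * b                 # screened deaths:   A = RR * B
print(f"unscreened deaths B = {b:.2f} per 10^5 person-years")
print(f"screened deaths   A = {a:.2f} per 10^5 person-years")
```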

  3. Tensor integrand reduction via Laurent expansion

    Energy Technology Data Exchange (ETDEWEB)

    Hirschi, Valentin [SLAC, National Accelerator Laboratory,2575 Sand Hill Road, Menlo Park, CA 94025-7090 (United States); Peraro, Tiziano [Higgs Centre for Theoretical Physics, School of Physics and Astronomy,The University of Edinburgh,Edinburgh EH9 3JZ, Scotland (United Kingdom)

    2016-06-09

    We introduce a new method for the application of one-loop integrand reduction via the Laurent expansion algorithm, as implemented in the public C++ library Ninja. We show how the coefficients of the Laurent expansion can be computed by suitable contractions of the loop numerator tensor with cut-dependent projectors, making it possible to interface Ninja to any one-loop matrix element generator that can provide the components of this tensor. We implemented this technique in the Ninja library and interfaced it to MADLOOP, which is part of the public MADGRAPH5_aMC@NLO framework. We performed a detailed performance study, comparing against other public reduction tools, namely CUTTOOLS, SAMURAI, IREGI, PJFRY++ and GOLEM95. We find that Ninja outperforms traditional integrand reduction in both speed and numerical stability, the latter being on par with that of the tensor integral reduction tool GOLEM95, which is however more limited and slower than Ninja. We considered many benchmark multi-scale processes of increasing complexity, involving QCD and electroweak corrections as well as effective non-renormalizable couplings, showing that Ninja's performance scales well with both the rank and multiplicity of the considered process.

  4. Omitted variable bias in crash reduction factors.

    Science.gov (United States)

    2015-09-01

    Transportation planners and traffic engineers are increasingly turning to crash reduction factors to evaluate changes in road geometric and design features in order to reduce crashes. Crash reduction factors are typically estimated based on segment...

  5. COST ESTIMATING RELATIONSHIPS IN ONSHORE DRILLING PROJECTS

    Directory of Open Access Journals (Sweden)

    Ricardo de Melo e Silva Accioly

    2017-03-01

    Full Text Available Cost estimating relationships (CERs) are very important tools in the planning phases of an upstream project. CERs are, in general, multiple regression models developed to estimate the cost of a particular item or scope of a project. They are based on historical data that should pass through a normalization process before fitting a model. In the early phases they are the primary tool for cost estimating. In later phases they are usually used as an estimation validation tool and sometimes for benchmarking purposes. As in any other modeling methodology, there are a number of important steps to build a model. In this paper the process of building a CER to estimate the drilling cost of onshore wells is addressed.
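    A minimal sketch of fitting such a CER, assuming a power-law form fitted by ordinary least squares in log space (the data points are made-up placeholders, not the paper's normalized history):

```python
import numpy as np

# Sketch of a cost estimating relationship (CER): a regression fitted to
# normalized historical data. Here a power-law CER, cost = a * depth^b, is
# fitted by ordinary least squares in log space. Data are invented placeholders.
depth_m = np.array([800, 1200, 1500, 2100, 2600, 3000])      # well depth, m
cost_musd = np.array([1.1, 1.8, 2.4, 3.9, 5.2, 6.5])         # drilling cost, MUSD

b, log_a = np.polyfit(np.log(depth_m), np.log(cost_musd), 1)
a = np.exp(log_a)

def cer(depth):
    """Estimated drilling cost (MUSD) for a given depth (m)."""
    return a * depth ** b

print(f"cost ~ {a:.3g} * depth^{b:.2f}; predicted at 1800 m: {cer(1800):.2f} MUSD")
```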

  6. S-bases as a tool to solve reduction problems for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, A.V.; Smirnov, V.A.

    2006-01-01

    We suggest a mathematical definition of the notion of master integrals and present a brief review of algorithmic methods to solve reduction problems for Feynman integrals based on integration by parts relations. In particular, we discuss a recently suggested reduction algorithm which uses Groebner bases. New results obtained with its help for a family of three-loop Feynman integrals are outlined

  7. S-bases as a tool to solve reduction problems for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, A.V. [Scientific Research Computing Center of Moscow State University, Moscow 119992 (Russian Federation); Smirnov, V.A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-10-15

    We suggest a mathematical definition of the notion of master integrals and present a brief review of algorithmic methods to solve reduction problems for Feynman integrals based on integration by parts relations. In particular, we discuss a recently suggested reduction algorithm which uses Groebner bases. New results obtained with its help for a family of three-loop Feynman integrals are outlined.

  8. Risk Reduction with a Fuzzy Expert Exploration Tool

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, William W.; Broadhead, Ron; Sung, Andrew

    2000-10-24

    This project developed an Artificial Intelligence system that drew upon a wide variety of information to provide realistic estimates of risk. ''Fuzzy logic,'' which integrates large amounts of inexact, incomplete information with modern computational methods to derive usable conclusions, was demonstrated to be a cost-effective computational technology in many industrial applications.
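    As a generic illustration of the fuzzy-logic idea (our toy example, not the project's exploration tool), graded membership functions turn inexact inputs into degrees of truth that can be combined into risk estimates:

```python
# Generic illustration of fuzzy logic: an inexact input is mapped to a graded
# membership value rather than a hard true/false. This is a toy example; the
# membership bounds below are invented, not taken from the project's system.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, rising to 1 on [b, c], falling to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

porosity = 14.0   # percent, an inexact log-derived value (hypothetical)
membership = trapezoid(porosity, 8, 15, 25, 30)
print(f"membership in 'high porosity': {membership:.2f}")   # 0.86 here
```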

  9. Examination of an indicative tool for rapidly estimating viable organism abundance in ballast water

    Science.gov (United States)

    Vanden Byllaardt, Julie; Adams, Jennifer K.; Casas-Monroy, Oscar; Bailey, Sarah A.

    2018-03-01

    Regulatory discharge standards stipulating a maximum allowable number of viable organisms in ballast water have led to a need for rapid, easy and accurate compliance assessment tools and protocols. Some potential tools presume that organisms present in ballast water samples display the same characteristics of life as the native community (e.g. rates of fluorescence). This presumption may not prove true, particularly when ships' ballast tanks present a harsh environment and long transit times, negatively impacting organism health. Here, we test the accuracy of a handheld pulse amplitude modulated (PAM) fluorometer, the Hach BW680, for detecting photosynthetic protists at concentrations above or below the discharge standard (< 10 cells·ml^-1) in comparison to microscopic counts using fluorescein diacetate as a viability probe. Testing was conducted on serial dilutions of freshwater harbour samples in the lab and on in situ untreated ballast water samples originating from marine, freshwater and brackish sources, utilizing three preprocessing techniques to target organisms in the size range of ≥ 10 and < 50 μm. The BW680 numeric estimates were in agreement with microscopic counts when analyzing freshly collected harbour water at all but the lowest concentrations (< 38 cells·ml^-1). Chi-square tests determined that error is not independent of preprocessing methods: using the filtrate method or unfiltered water, in addition to refining the conversion factor of raw fluorescence to cell size, can decrease the grey area where exceedance of the discharge standard cannot be measured with certainty (at least for the studied populations). When examining in situ ballast water, the BW680 detected significantly fewer viable organisms than microscopy, possibly due to factors such as organism size or ballast water age. Assuming both the BW680 and microscopy with FDA stain were measuring fluorescence and enzymatic activity/membrane integrity correctly, the observed discrepancy

  10. Measurement reduction for mutual coupling calibration in DOA estimation

    Science.gov (United States)

    Aksoy, Taylan; Tuncer, T. Engin

    2012-01-01

    Mutual coupling is an important source of error in antenna arrays that should be compensated for super resolution direction-of-arrival (DOA) algorithms, such as Multiple Signal Classification (MUSIC) algorithm. A crucial step in array calibration is the determination of the mutual coupling coefficients for the antenna array. In this paper, a system theoretic approach is presented for the mutual coupling characterization of antenna arrays. The comprehension and implementation of this approach is simple leading to further advantages in calibration measurement reduction. In this context, a measurement reduction method for antenna arrays with omni-directional and identical elements is proposed which is based on the symmetry planes in the array geometry. The proposed method significantly decreases the number of measurements during the calibration process. This method is evaluated using different array types whose responses and the mutual coupling characteristics are obtained through numerical electromagnetic simulations. It is shown that a single calibration measurement is sufficient for uniform circular arrays. Certain important and interesting characteristics observed during the experiments are outlined.

  11. Aircraft parameter estimation – A tool for development of ...

    Indian Academy of Sciences (India)

    In addition, actuator performance and controller gains may be flight condition dependent. Moreover, this approach may result in open-loop parameter estimates with low accuracy. 6. Aerodynamic databases for high fidelity flight simulators. Estimation of a comprehensive aerodynamic model suitable for a flight simulator is an.

  12. The estimated reduction in the odds of loss-of-control type crashes for sport utility vehicles equipped with electronic stability control.

    Science.gov (United States)

    Green, Paul E; Woodrooffe, John

    2006-01-01

    Using data from the NASS General Estimates System (GES), the method of induced exposure was used to assess the effects of electronic stability control (ESC) on loss-of-control type crashes for sport utility vehicles. Sport utility vehicles were classified into crash types generally associated with loss of control and crash types most likely not associated with loss of control. Vehicles were then compared as to whether ESC technology was present or absent in the vehicles. A generalized additive model was fit to assess the effects of ESC, driver age, and driver gender on the odds of loss of control. In addition, the effects of ESC on roads that were not dry were compared to effects on roads that were dry. Overall, the estimated percentage reduction in the odds of a loss-of-control crash for sport utility vehicles equipped with ESC was 70.3%. Both genders and all age groups showed reduced odds of loss-of-control crashes, but there was no significant difference between males and females. With respect to driver age, the maximum percentage reduction of 73.6% occurred at age 27. The positive effects of ESC on roads that were not dry were significantly greater than on roads that were dry.
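    The reported 70.3% figure is a reduction in odds, which the bare induced-exposure arithmetic makes concrete (our invented 2x2 counts; the study itself fit a generalized additive model with age and gender terms):

```python
import numpy as np

# Sketch of the induced-exposure odds comparison behind the reported 70.3%
# reduction. Counts are invented for illustration; the study fit a generalized
# additive model rather than this bare 2x2 odds ratio.
#                  loss-of-control   not loss-of-control
counts = np.array([[150, 2000],      # ESC present
                   [900, 3600]])     # ESC absent

odds_esc = counts[0, 0] / counts[0, 1]        # 0.075
odds_no_esc = counts[1, 0] / counts[1, 1]     # 0.25
odds_ratio = odds_esc / odds_no_esc           # 0.30
print(f"odds ratio = {odds_ratio:.2f}, "
      f"estimated reduction in odds = {(1 - odds_ratio) * 100:.1f}%")
```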

  13. Binomial Distribution Sample Confidence Intervals Estimation 7. Absolute Risk Reduction and ARR-like Expressions

    Directory of Open Access Journals (Sweden)

    Andrei ACHIMAŞ CADARIU

    2004-08-01

    Full Text Available Assessment of a controlled clinical trial requires interpreting some key parameters, such as the control event rate, experimental event rate, relative risk, absolute risk reduction, relative risk reduction and number needed to treat, when the effect of the treatment is a dichotomous variable. Defined as the difference in the event rate between treatment and control groups, the absolute risk reduction is the parameter that allows computing the number needed to treat. The absolute risk reduction is computed when the experimental treatment reduces the risk of an undesirable outcome/event. In the medical literature, when the absolute risk reduction is reported with its confidence intervals, the method used is the asymptotic one, even though it is well known that it may be inadequate. The aim of this paper is to introduce and assess nine methods of computing confidence intervals for absolute risk reduction and absolute risk reduction-like functions. Computer implementations of the methods use the PHP language. Methods comparison uses the experimental errors, the standard deviations, and the deviation relative to the imposed significance level for specified sample sizes. Six methods of computing confidence intervals for absolute risk reduction and absolute risk reduction-like functions were assessed using random binomial variables and random sample sizes. The experiments show that the ADAC and ADAC1 methods obtain the best overall performance in computing confidence intervals for absolute risk reduction.
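    The asymptotic (Wald) interval that the paper identifies as the standard, and possibly inadequate, method can be written compactly (our sketch; counts are invented):

```python
import math

# Asymptotic (Wald) confidence interval for the absolute risk reduction (ARR),
# the textbook method the paper notes is standard despite its known weaknesses.
def arr_wald_ci(events_ctrl, n_ctrl, events_exp, n_exp, z=1.96):
    cer = events_ctrl / n_ctrl            # control event rate
    eer = events_exp / n_exp              # experimental event rate
    arr = cer - eer                       # absolute risk reduction
    se = math.sqrt(cer * (1 - cer) / n_ctrl + eer * (1 - eer) / n_exp)
    return arr, (arr - z * se, arr + z * se)

# Invented trial counts, for illustration only.
arr, (lo, hi) = arr_wald_ci(events_ctrl=30, n_ctrl=100, events_exp=18, n_exp=100)
print(f"ARR = {arr:.3f}, 95% CI ({lo:.3f}, {hi:.3f}); NNT ~ {1 / arr:.1f}")
```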

  14. Influenza vaccination coverage estimates in the fee-for-service Medicare beneficiary population 2006-2016: Using population-based administrative data to support a geographic-based near real-time tool.

    Science.gov (United States)

    Shen, Angela K; Warnock, Rob; Brereton, Stephaeno; McKean, Stephen; Wernecke, Michael; Chu, Steve; Kelman, Jeffrey A

    2018-04-11

    Older adults are at great risk of developing serious complications from seasonal influenza. We explore vaccination coverage estimates in the Medicare population through the use of administrative claims data and describe a tool designed to help shape outreach efforts and inform strategies to help raise influenza vaccination rates. This interactive mapping tool uses claims data to compare vaccination levels between geographic (i.e., state, county, zip code) and demographic (i.e., race, age) groups at different points in a season. Trends can also be compared across seasons. Utilization of this tool can assist key actors interested in prevention - medical groups, health plans, hospitals, and state and local public health authorities - in supporting strategies for reaching pools of unvaccinated beneficiaries where general national population estimates of coverage are less informative. Implementing evidence-based tools can be used to address persistent racial and ethnic disparities and prevent a substantial number of influenza cases and hospitalizations.

  15. A software tool to estimate the dynamic behaviour of the IP²C samples as sensors for didactic purposes

    Energy Technology Data Exchange (ETDEWEB)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E., E-mail: nicola.pitrone@diees.unict.i [Dipartimento di Ingegneria Elettrica Elettronica e dei Sistemi -University of Catania V.le A. Doria 6, 95125, Catania (Italy)

    2010-07-01

    Ionic Polymer Polymer Composites (IP²Cs) are emerging materials used to realize motion actuators and sensors. In the former case a voltage input is able to cause the membrane to bend, while in the latter case a voltage output is obtained by bending an IP²C membrane. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP²Cs working in air. In the proposed tool, the geometrical quantities that rule the sensing properties of IP²C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed in order to give a useful tool that allows the user to understand the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP²C materials to observe their behaviour and to analyze their characteristics. This could greatly increase the interest of researchers towards this new class of transducers; moreover, it can support the educational activity of students involved in advanced academic courses.

  16. Health gain by salt reduction in europe: a modelling study.

    Directory of Open Access Journals (Sweden)

    Marieke A H Hendriksen

    Full Text Available Excessive salt intake is associated with hypertension and cardiovascular diseases. Salt intake exceeds the World Health Organization population nutrition goal of 5 grams per day in the European region. We assessed the health impact of salt reduction in nine European countries (Finland, France, Ireland, Italy, the Netherlands, Poland, Spain, Sweden and the United Kingdom). Through literature research we obtained current salt intake and systolic blood pressure levels for the nine countries. The population health modeling tool DYNAMO-HIA, including country-specific disease data, was used to predict the changes in prevalence of ischemic heart disease and stroke for each country, estimating the effect of salt reduction through its effect on blood pressure levels. A 30% salt reduction would reduce the prevalence of stroke by 6.4% in Finland to 13.5% in Poland. Ischemic heart disease would be decreased by 4.1% in Finland to 8.9% in Poland. If salt intake were reduced to the WHO population nutrient goal, the prevalence of stroke would be reduced by 10.1% in Finland to 23.1% in Poland, and ischemic heart disease would decrease by 6.6% in Finland to 15.5% in Poland. The number of postponed deaths would be 102,100 (0.9%) in France and 191,300 (2.3%) in Poland. A reduction of salt intake to 5 grams per day is expected to substantially reduce the burden of cardiovascular disease and mortality in several European countries.

  17. High Accuracy Nonlinear Control and Estimation for Machine Tool Systems

    DEFF Research Database (Denmark)

    Papageorgiou, Dimitrios

    Component mass production has been the backbone of industry since the second industrial revolution, and machine tools are producing parts of widely varying size and design complexity. The ever-increasing level of automation in modern manufacturing processes necessitates the use of more sophisticated machine tool systems that are adaptable to different workspace conditions, while at the same time being able to maintain very narrow workpiece tolerances. The main topic of this thesis is to suggest control methods that can maintain required manufacturing tolerances despite moderate wear and tear. The purpose is to ensure that full accuracy is maintained between service intervals and to advise when overhaul is needed. The thesis argues that quality of manufactured components is directly related to the positioning accuracy of the machine tool axes, and it shows which low level control architectures...

  18. Chatter and machine tools

    CERN Document Server

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  19. Risk reduction by filtered venting in PWR large dry-containments

    International Nuclear Information System (INIS)

    Gazzillo, F.; Kastenberg, W.E.

    1984-01-01

    The potential risk reduction associated with a Filtered-Vented Containment System is evaluated. A low-volume venting strategy has been considered, and data referring to the Zion power plant, along with the results of the Zion Probabilistic Safety Study, have been used. An estimate of the reduction factor is first made for a single core melt accident sequence whose containment failure mode is late overpressure. The result, interpreted as a reduction factor applicable to the release category associated with containment late overpressure, is then used for the estimation of the overall risk reduction factor. In particular, the cases of internal and external risk for the Zion power plant are considered. Because the contribution from seismic events dominates the overall risk, the importance of different assumptions for seismic fragility is also assessed. Finally an uncertainty analysis of the risk reduction factor for a single accident sequence is performed. An estimate is also obtained of the level of confidence with which certain required values of risk reduction can be achieved. (orig.)

  20. Estimation of economic parameters of U.S. hydropower resources

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Douglas G. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Hunt, Richard T. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Reeves, Kelly S. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Carroll, Greg R. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL)

    2003-06-01

    Tools for estimating the costs of developing, operating and maintaining hydropower resources, in the form of regression curves, were developed based on historical plant data. Development costs that were addressed included licensing, construction, and five types of environmental mitigation. It was found that the data for each type of cost correlated well with plant capacity. A tool for estimating the annual and monthly electric generation of hydropower resources was also developed. Additional tools were developed to estimate the cost of upgrading a turbine or a generator. The development and operation and maintenance cost estimating tools, and the generation estimating tool, were applied to 2,155 U.S. hydropower sites representing a total potential capacity of 43,036 MW. The sites included totally undeveloped sites, dams without a hydroelectric plant, and hydroelectric plants that could be expanded to achieve greater capacity. Site characteristics and estimated costs and generation for each site were assembled in a database in Excel format that is also included within the EERE Library under the title, "Estimation of Economic Parameters of U.S. Hydropower Resources - INL Hydropower Resource Economics Database."

  1. Energy Saving Melting and Revert Reduction Technology (E-SMARRT): Development of Surface Engineered Coating Systems for Aluminum Pressure Die Casting Dies: Towards a 'Smart' Die Coating

    Energy Technology Data Exchange (ETDEWEB)

    Dr. John J. Moore; Dr. Jianliang Lin,

    2012-07-31

    The main objective of this research program was to design and develop an optimal coating system that extends die life by minimizing premature die failure. In high-pressure aluminum die-casting, the die, core pins and inserts must withstand severe processing conditions. Many of the dies and tools in the industry are being coated to improve wear-resistance and decrease down-time for maintenance. However, thermal fatigue in the metal itself can still be a major problem, especially since it often leads to catastrophic failure (i.e. die breakage) as opposed to a wear-based failure (parts begin to go out of tolerance). Tooling costs remain the largest portion of production costs for many of these parts, so the ability to prevent catastrophic failures would be transformative for the manufacturing industry. The technology offers energy savings through reduced energy use in the die casting process from several factors, including increased life of the tools and dies, reuse of the dies and die components, reduction/elimination of lubricants, reduced machine down time, and reduction of Al solder sticking on the die. The use of the optimized die coating system will also reduce environmental wastes and scrap parts. The current (2012) annual energy saving estimate, based on initial dissemination to the casting industry in 2010 and market penetration of 80% by 2020, is 3.1 trillion BTU/year. The average annual estimate of CO2 reduction through 2020 is 0.63 Million Metric Tons of Carbon Equivalent (MM TCE).

  2. Determination of Flood Reduction Alternatives for Climate Change Adaptation in Gyeongancheon basin

    Science.gov (United States)

    Han, D.; Joo, H. J.; Jung, J.; Kim, H. S.

    2017-12-01

    Recently, the frequency of extreme rainfall events has increased due to climate change, and the impermeable area in urban watersheds has also increased due to rapid urbanization. Therefore, the flood risk is increasing and we ought to prepare countermeasures for flood damage reduction. For the determination of appropriate measures or alternatives, this study first estimated the frequency-based rainfall considering climate change for each target period (reference: 1971-2010, target period I: 2011-2040, target period II: 2041-2070, target period III: 2071-2100). The future flood discharge was then computed using the HEC-HMS model. We set five sizes of drainage pumps and detention ponds, respectively, as the flood reduction alternatives, and the flood level in the river was obtained for each alternative through the HEC-RAS model. The flood inundation map was constructed using topographical data and the flood water level in the river, and the economic analysis was conducted for the flood damage reduction studies using the Multi Dimensional Flood Damage Analysis (MD-FDA) tool. As a result of the effectiveness analysis of the flood reduction alternatives, the flood level was reduced by 0.06 m up to 0.44 m by the drainage pump, and by 0.01 m up to 1.86 m in the case of the detention pond. The flooded area shrank by between 0.3% and 32.64%, and the inundation depth also dropped. Comparing the Benefit/Cost ratios estimated by the economic analysis, detention pond E in target period I and pump D in periods II and III were considered the appropriate alternatives for flood damage reduction under climate change. Acknowledgements: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (2017R1A2B3005695).

  3. Landscape planning for agricultural nonpoint source pollution reduction III: Assessing phosphorus and sediment reduction potential

    Science.gov (United States)

    Diebel, M.W.; Maxted, J.T.; Robertson, Dale M.; Han, S.; Vander Zanden, M. J.

    2009-01-01

    Riparian buffers have the potential to improve stream water quality in agricultural landscapes. This potential may vary in response to landscape characteristics such as soils, topography, land use, and human activities, including legacies of historical land management. We built a predictive model to estimate the sediment and phosphorus load reduction that should be achievable following the implementation of riparian buffers; then we estimated load reduction potential for a set of 1598 watersheds (average 54 km²) in Wisconsin. Our results indicate that land cover is generally the most important driver of constituent loads in Wisconsin streams, but its influence varies among pollutants and according to the scale at which it is measured. Physiographic (drainage density) variation also influenced sediment and phosphorus loads. The effect of historical land use on present-day channel erosion and variation in soil texture are the most important sources of phosphorus and sediment that riparian buffers cannot attenuate. However, in most watersheds, a large proportion (approximately 70%) of these pollutants can be eliminated from streams with buffers. Cumulative frequency distributions of load reduction potential indicate that targeting pollution reduction in the highest 10% of Wisconsin watersheds would reduce total phosphorus and sediment loads in the entire state by approximately 20%. These results support our approach of geographically targeting nonpoint source pollution reduction at multiple scales, including the watershed scale. © 2008 Springer Science+Business Media, LLC.

  4. Phase-processing as a tool for speckle reduction in pulse-echo images

    DEFF Research Database (Denmark)

    Healey, AJ; Leeman, S; Forsberg, F

    1991-01-01

    Due to the coherent nature of conventional ultrasound medical imaging systems, interference artefacts occur in pulse-echo images. These artefacts are generically termed 'speckle'. The phenomenon may severely limit low contrast resolution, with clinically relevant information being obscured. Traditional speckle reduction procedures regard speckle correction as a stochastic process and trade image smoothing (resolution loss) for speckle reduction. Recently, a new phase acknowledging technique has been proposed that is unique in its ability to correct for speckle interference with no image...

  5. Methane emission reduction: an application of FUND

    NARCIS (Netherlands)

    Tol, R.S.J.; Heintz, R.J.; Lammers, P.E.M.

    2003-01-01

    Methane is, after carbon dioxide, the most important anthropogenic greenhouse gas. Governments plan to abate methane emissions. A crude set of estimates of reduction costs is included in FUND, an integrated assessment model of climate change. In a cost-benefit analysis, methane emission reduction is

  6. Welfare Effects of Tariff Reduction Formulas

    DEFF Research Database (Denmark)

    Guldager, Jan G.; Schröder, Philipp J.H.

    WTO negotiations rely on tariff reduction formulas. It has been argued that formula approaches are of increasing importance in trade talks, because of the large number of countries involved, the wider dispersion in initial tariffs (e.g. tariff peaks) and gaps between bound and applied tariff rates. ... No single formula dominates for all conditions. The ranking of the three tools depends on the degree of product differentiation in the industry and on the achieved reduction in the average tariff.

  7. A modal-based reduction method for sound absorbing porous materials in poro-acoustic finite element models.

    Science.gov (United States)

    Rumpler, Romain; Deü, Jean-François; Göransson, Peter

    2012-11-01

    Structural-acoustic finite element models including three-dimensional (3D) modeling of porous media are generally computationally costly. While being the most commonly used predictive tool in the context of noise reduction applications, efficient solution strategies are required. In this work, an original modal reduction technique, involving real-valued modes computed from a classical eigenvalue solver is proposed to reduce the size of the problem associated with the porous media. In the form presented in this contribution, the method is suited for homogeneous porous layers. It is validated on a 1D poro-acoustic academic problem and tested for its performance on a 3D application, using a subdomain decomposition strategy. The performance of the proposed method is estimated in terms of degrees of freedom downsizing, computational time enhancement, as well as matrix sparsity of the reduced system.
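    Generically, the reduction step looks like the following sketch (random stand-in matrices; the paper applies such a projection to the porous-media blocks of a coupled poro-acoustic FE model):

```python
import numpy as np

# Generic modal reduction sketch: compute real-valued modes with a classical
# eigenvalue solver, keep a few, and project the system matrices onto them.
# The matrices are random stand-ins, not assembled poro-acoustic FE matrices.
rng = np.random.default_rng(0)
n, m = 200, 10                              # full size, number of retained modes
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)                 # symmetric positive definite "stiffness"
M = np.eye(n)                               # "mass" (identity for simplicity)

eigvals, eigvecs = np.linalg.eigh(K)        # real modes, ascending eigenvalues
Phi = eigvecs[:, :m]                        # truncated modal basis

K_red = Phi.T @ K @ Phi                     # reduced m x m system matrices
M_red = Phi.T @ M @ Phi
print(K_red.shape, M_red.shape)             # (10, 10) (10, 10)
```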

  8. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    Science.gov (United States)

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomogram (SPECT) without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemisphere of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). During quantitative estimation using ACZ, the mean CBF values of 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. While comparing the 24 3DSRT regions of 12 patients, the squares of the correlation

  9. Heuristic and probabilistic wind power availability estimation procedures: Improved tools for technology and site selection

    Energy Technology Data Exchange (ETDEWEB)

    Nigim, K.A. [University of Waterloo, Waterloo, Ont. (Canada). Department of Electrical and Computer Engineering; Parker, Paul [University of Waterloo, Waterloo, Ont. (Canada). Department of Geography, Environmental Studies

    2007-04-15

    The paper describes two investigative procedures to estimate wind power from measured wind velocities. Wind velocity data are manipulated to visualize the site potential by investigating the probable wind power availability and its capacity to meet a targeted demand. The first procedure is an availability procedure that looks at the wind characteristics and its probable energy capturing profile. This profile of wind enables the probable maximum operating wind velocity profile for a selected wind turbine design to be predicted. The structured procedures allow for a consequent adjustment, sorting and grouping of the measured wind velocity data taken at different time intervals and hub heights. The second procedure is the adequacy procedure that investigates the probable degree of availability and the application consequences. Both procedures are programmed using MathCAD symbolic mathematical software. The math tool is used to generate a visual interpolation of the data as well as numerical results from extensive data sets that exceed the capacity of conventional spreadsheet tools. Two sites located in Southern Ontario, Canada are investigated using the procedures. Successful implementation of the procedures supports informed decision making where a hill site is shown to have much higher wind potential than that measured at the local airport. The process is suitable for a wide spectrum of users who are considering the energy potential for either a grid-tied or off-grid wind energy system. (author)
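    A rough, generic sketch of such a probabilistic availability estimate is given below (our illustration; the power curve parameters are assumed and the actual procedures were implemented in MathCAD, not Python):

```python
import numpy as np
from scipy.stats import weibull_min

# Generic sketch of a probabilistic wind power availability estimate: fit a
# Weibull distribution to measured wind speeds, then weight an assumed turbine
# power curve by the fitted density. Measurements below are synthetic stand-ins.
rng = np.random.default_rng(1)
speeds = rng.weibull(2.0, 8760) * 8.0            # stand-in hourly speeds, m/s

k, _, c = weibull_min.fit(speeds, floc=0)        # Weibull shape k and scale c

def power_kw(v, cut_in=3.5, rated=12.0, cut_out=25.0, p_rated=2000.0):
    """Simplified power curve: cubic ramp from cut-in to rated, zero outside limits."""
    if v < cut_in or v > cut_out:
        return 0.0
    return p_rated if v >= rated else p_rated * (v**3 - cut_in**3) / (rated**3 - cut_in**3)

v = np.linspace(0.0, 30.0, 601)
mean_kw = np.trapz(np.array([power_kw(x) for x in v]) * weibull_min.pdf(v, k, scale=c), v)
print(f"Weibull k = {k:.2f}, c = {c:.2f} m/s; capacity factor ~ {mean_kw / 2000.0:.2f}")
```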

  10. FunGeneNet: a web tool to estimate enrichment of functional interactions in experimental gene sets.

    Science.gov (United States)

    Tiys, Evgeny S; Ivanisenko, Timofey V; Demenkov, Pavel S; Ivanisenko, Vladimir A

    2018-02-09

    Estimation of functional connectivity in gene sets derived from genome-wide or other biological experiments is one of the essential tasks of bioinformatics. A promising approach for solving this problem is to compare gene networks built using experimental gene sets with random networks. One of the resources that make such an analysis possible is CrossTalkZ, which uses the FunCoup database. However, existing methods, including CrossTalkZ, do not take into account individual types of interactions, such as protein/protein interactions, expression regulation, transport regulation, catalytic reactions, etc., but rather work with generalized types characterizing the existence of any connection between network members. We developed the online tool FunGeneNet, which utilizes the ANDSystem and STRING to reconstruct gene networks using experimental gene sets and to estimate their difference from random networks. To compare the reconstructed networks with random ones, the node permutation algorithm implemented in CrossTalkZ was taken as a basis. To study the FunGeneNet applicability, the functional connectivity analysis of networks constructed for gene sets involved in the Gene Ontology biological processes was conducted. We showed that the method sensitivity exceeds 0.8 at a specificity of 0.95. We found that the significance level of the difference between gene networks of biological processes and random networks is determined by the type of connections considered between objects. At the same time, the highest reliability is achieved for the generalized form of connections that takes into account all the individual types of connections. By taking examples of the thyroid cancer networks and the apoptosis network, it is demonstrated that key participants in these processes are involved in the interactions of those types by which these networks differ from random ones. FunGeneNet is a web tool aimed at proving the functionality of networks in a wide range of sizes of
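    A toy version of the permutation comparison is sketched below (a simplification: random same-size gene sets form the null here, whereas the node permutation algorithm in CrossTalkZ permutes node labels):

```python
import numpy as np

# Toy sketch of comparing an experimental gene network against random ones:
# count interactions within the gene set, then build a null distribution from
# random same-size sets. A simplification of degree-preserving node permutation.
rng = np.random.default_rng(42)
n_genes = 500
adj = rng.random((n_genes, n_genes)) < 0.01
adj = np.triu(adj, 1)
adj = adj | adj.T                               # symmetric random "interactome"

gene_set = rng.choice(n_genes, 30, replace=False)
observed = adj[np.ix_(gene_set, gene_set)].sum() // 2   # edges within the set

null = []
for _ in range(1000):
    perm = rng.choice(n_genes, 30, replace=False)        # random set, same size
    null.append(adj[np.ix_(perm, perm)].sum() // 2)

z = (observed - np.mean(null)) / np.std(null)
print(f"observed edges = {observed}, z-score vs random sets = {z:.2f}")
```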

  11. The Global Earthquake Model and Disaster Risk Reduction

    Science.gov (United States)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world, while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples of how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito, Ecuador. In agreement with GEM's collaborative approach, all

  12. Quality-assured evaluation of effective porosity using fit-for-purpose estimates of clay-mineral volume fraction

    Science.gov (United States)

    Worthington, Paul F.

    2010-05-01

    Reservoirs that contain dispersed clay minerals traditionally have been evaluated petrophysically using either the effective or the total porosity system. The major weakness of the former is its reliance on "shale" volume fraction (Vsh) as a clay-mineral indicator in the determination of effective porosity from well logs. Downhole clay-mineral indicators have usually delivered overestimates of fractional clay-mineral volume (Vcm) because they use as a reference nearby shale beds that are often assumed to comprise clay minerals exclusively, whereas those beds also include quartzitic silts and other detritus. For this reason, effective porosity is often underestimated significantly, and this shortfall transmits to computed hydrocarbons in place and thence to estimates of ultimate recovery. The problem is overcome here by using, as proxy groundtruths, core porosities that have been upscaled to match the spatial resolutions of porosity logs. Matrix and fluid properties are established over clean intervals in the usual way. Log-derived values of Vsh are tuned so that, on average, the resulting log-derived porosities match the corresponding core porosities over an evaluation interval. In this way, Vsh is rendered fit for purpose as an indicator of clay-mineral content Vcm for purposes of evaluating effective porosity. The method is conditioned to deliver a value of effective porosity that shows overall agreement with core porosity to within the limits of uncertainty of the laboratory measurements. This is achieved through function-, reservoir- and tool-specific Vsh reduction factors that can be applied to downhole estimates of clay-mineral content over uncored intervals of similar reservoir character. As expected, the reduction factors can also vary for different measurement conditions. The reduction factors lie in the range of 0.29-0.80, which means that in its raw form, log-derived Vsh can overestimate the clay-mineral content by more than a factor of three. This

  13. The construction of a decision tool to analyse local demand and local supply for GP care using a synthetic estimation model.

    Science.gov (United States)

    de Graaf-Ruizendaal, Willemijn A; de Bakker, Dinny H

    2013-10-27

    This study addresses the growing academic and policy interest in the appropriate provision of local healthcare services relative to the healthcare needs of local populations, in order to increase health status and decrease healthcare costs. However, for most local areas information on the demand for primary care and on supply is missing. The research goal is to examine the construction of a decision tool which enables healthcare planners to analyse local supply and demand in order to arrive at a better match. National sample-based medical record data of general practitioners (GPs) were used to predict the local demand for GP care based on local populations using a synthetic estimation technique. Next, the surplus or deficit in local GP supply was calculated using the national GP registry. Subsequently, a dynamic internet tool was built to present demand, supply and the confrontation between supply and demand regarding GP care for local areas and their surroundings in the Netherlands. Regression analysis showed a significant relationship between sociodemographic predictors of postcode areas and GP consultation time (F [14, 269,467] = 2,852.24; P < 0.001). Demand estimates were produced for postcode areas with > 1,000 inhabitants in the Netherlands, covering 97% of the total population. Confronting these estimated demand figures with the actual GP supply resulted in the average GP workload and the number of full-time equivalent (FTE) GPs too many or too few for local areas to cover the demand for GP care. An estimated shortage of one FTE GP or more was prevalent in about 19% of the postcode areas with > 1,000 inhabitants if the surrounding postcode areas were taken into consideration. Underserved areas were mainly found in rural regions. The constructed decision tool is freely accessible on the Internet and can be used as a starting point in the discussion on primary care service provision in local communities, and it can make a considerable contribution to a primary care system which provides care when and where people need it.

  14. An automated A-value measurement tool for accurate cochlear duct length estimation.

    Science.gov (United States)

    Iyaniwura, John E; Elfarnawany, Mai; Ladak, Hanif M; Agrawal, Sumit K

    2018-01-22

    There has been renewed interest in the cochlear duct length (CDL) for preoperative cochlear implant electrode selection and postoperative generation of patient-specific frequency maps. The CDL can be estimated by measuring the A-value, which is defined as the length between the round window and the furthest point on the basal turn. Unfortunately, there is significant intra- and inter-observer variability when these measurements are made clinically. The objective of this study was to develop an automated A-value measurement algorithm to improve accuracy and eliminate observer variability. Clinical and micro-CT images of 20 cadaveric cochleae specimens were acquired. The micro-CT of one sample was chosen as the atlas, and A-value fiducials were placed onto that image. Image registration (rigid affine and non-rigid B-spline) was applied between the atlas and the 19 remaining clinical CT images. The registration transform was applied to the A-value fiducials, and the A-value was then automatically calculated for each specimen. High resolution micro-CT images of the same 19 specimens were used to measure the gold standard A-values for comparison against the manual and automated methods. The registration algorithm had excellent qualitative overlap between the atlas and target images. The automated method eliminated the observer variability and the systematic underestimation by experts. Manual measurement of the A-value on clinical CT had a mean error of 9.5 ± 4.3% compared to micro-CT, and this improved to an error of 2.7 ± 2.1% using the automated algorithm. Both the automated and manual methods correlated significantly with the gold standard micro-CT A-values (r = 0.70, p < 0.001). An automated A-value measurement tool using atlas-based registration methods was successfully developed and validated. The automated method eliminated the observer variability and improved accuracy as compared to manual measurements by experts. This open-source tool has the potential to benefit

  15. Estimating the Value of Price Risk Reduction in Energy Efficiency Investments in Buildings

    Directory of Open Access Journals (Sweden)

    Pekka Tuominen

    2017-10-01

    Full Text Available This paper presents a method for calculating the value, to a consumer, of the price risk reduction that can be achieved with investments in energy efficiency. The value of price risk reduction is discussed at some length in general terms in the literature reviewed but, so far, no methodology for calculating the value has been presented. Here we suggest such a method. The problem of valuating price risk reduction is approached using a variation of the Black–Scholes model, by considering a hypothetical financial instrument that a consumer would purchase to insure herself against unexpected price hikes. This hypothetical instrument is then compared with an actual energy efficiency investment that reaches the same level of price risk reduction. To demonstrate the usability of the method, case examples are calculated for typical single-family houses in Finland. The results show that the price risk entailed in household energy consumption can be reduced by a meaningful amount with energy efficiency investments, and that the monetary value of this reduction can be calculated. It is argued that this often-overlooked benefit of energy efficiency investments merits more consideration in future studies.
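    A minimal sketch of the valuation idea, assuming a plain Black–Scholes call as the hypothetical insurance instrument (all numbers illustrative, not the paper's case values):

```python
from math import exp, log, sqrt
from scipy.stats import norm

# Sketch of the valuation idea: price a hypothetical instrument that pays off
# when the energy price exceeds a threshold, using the Black-Scholes call
# formula. The paper uses a variation of this model; numbers are illustrative.
def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call: insurance against price above K."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

# Assumed figures: energy at 0.10 EUR/kWh, protection above 0.12 EUR/kWh one
# year ahead, 25% price volatility, 2% risk-free rate, 10,000 kWh/yr consumption.
premium_per_kwh = bs_call(S=0.10, K=0.12, T=1.0, r=0.02, sigma=0.25)
print(f"value of price-risk protection ~ {premium_per_kwh * 10_000:.2f} EUR/yr")
```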

  16. Compilation and testing of tools and methods for sustainable coastal management at local and regional scales : Deliverable D2.5.4, Thresholds project, 6th framework programme, EU, 108 p.

    OpenAIRE

    Håkanson, Lars

    2008-01-01

    This work describes how general methods and models for sustainable coastal ecosystem management at local to regional scales may be used to address key questions in coastal management and threshold science. The general, process-based mass-balance model (CoastMab) for substances transported to, within and from coastal areas may be used as a tool to: 1. Combat eutrophication, 2. Rank nutrient fluxes, 3. Estimate the system response related to nutrient reductions and 4. Estimate realistic val...

  17. OligoHeatMap (OHM): an online tool to estimate and display hybridizations of oligonucleotides onto DNA sequences.

    Science.gov (United States)

    Croce, Olivier; Chevenet, François; Christen, Richard

    2008-07-01

    The efficiency of molecular methods involving DNA/DNA hybridizations depends on the accurate prediction of the melting temperature (Tm) of the duplex. Many software tools are available for Tm calculations, but difficulties arise when one wishes to check whether a given oligomer (PCR primer or probe) hybridizes well or not on more than a single sequence. Moreover, the presence of mismatches within the duplex is not sufficient to estimate specificity, as it does not always significantly decrease the Tm. OHM (OligoHeatMap) is an online tool able to provide estimates of Tm for a set of oligomers and a set of aligned sequences, not only as text files of complete results but also in a graphical way: Tm values are translated into colors and displayed as a heat map image, either stand-alone or for use by software such as TreeDyn to be included in a phylogenetic tree. OHM is freely available at http://bioinfo.unice.fr/ohm/, with links to the full source code and online help.
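    The kind of per-oligo Tm screening OHM tabulates into a heat map can be sketched with Biopython's nearest-neighbor routine (our illustration, not OHM's own code; sequences are made up):

```python
from Bio.SeqUtils import MeltingTemp as mt

# Sketch of per-oligo Tm estimates of the kind OHM turns into a heat map,
# computed here with Biopython's nearest-neighbor model. Sequences are invented.
oligos = ["ACGTGGTCAAGCTTGACGT", "GGGCGGCGTTAACTTAACG", "ATATTTAATCGAATTAAT"]

for oligo in oligos:
    tm = mt.Tm_NN(oligo, Na=50)          # nearest-neighbor Tm at 50 mM Na+
    print(f"{oligo}: Tm = {tm:.1f} C")   # OHM would map each value to a color
```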

  18. Applying the Land Use Portfolio Model to Estimate Natural-Hazard Loss and Risk - A Hypothetical Demonstration for Ventura County, California

    Science.gov (United States)

    Dinitz, Laura B.

    2008-01-01

    With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS

  19. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    Science.gov (United States)

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
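    A minimal sketch of the time-integration idea is given below; the wear-rate law and every constant are illustrative assumptions (a Usui-type rate is used as a stand-in for the paper's adhesion, abrasion and fracture mechanisms):

```python
import numpy as np

# Minimal sketch of estimating flank wear versus cutting time by integrating a
# Usui-type wear-rate law: rate = A * sigma_n * v_s * exp(-B / T). All constants
# are assumptions, not the paper's calibrated values for AISI 4340 / WC-Co.
A, B = 1.6e-14, 5000.0        # wear coefficients (assumed)
sigma_n = 1.2e9               # normal stress on the flank, Pa (assumed)
v_s = 2.5                     # sliding speed at the flank, m/s (assumed)
T = 1100.0                    # interface temperature, K (assumed constant)

dt, t_end = 1.0, 600.0        # integrate over 10 minutes of cutting
times = np.arange(0.0, t_end + dt, dt)
rate = A * sigma_n * v_s * np.exp(-B / T)     # m/s, constant under fixed conditions
VB_mm = rate * times * 1e3                    # wear-land width VB vs. time, mm

print(f"VB after {t_end / 60:.0f} min ~ {VB_mm[-1]:.2f} mm")
```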

  20. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    International Nuclear Information System (INIS)

    Franco, P.; Estrems, M.; Faura, F.

    2007-01-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools

  1. Brain Volume Estimation Enhancement by Morphological Image Processing Tools

    Directory of Open Access Journals (Sweden)

    Zeinali R.

    2017-12-01

    Full Text Available Background: Volume estimation of the brain is important for many neurological applications. It is necessary for measuring brain growth and changes in the brain in normal and abnormal patients. Thus, accurate brain volume measurement is very important. Magnetic resonance imaging (MRI) is the method of choice for volume quantification due to excellent levels of image resolution and between-tissue contrast. The stereology method is a good method for estimating volume, but it requires segmenting enough MRI slices and having good resolution. In this study, we aim to enhance the stereology method for volume estimation of the brain using fewer MRI slices with lower resolution. Methods: A program for calculating volume using the stereology method is introduced. The morphological operation of dilation is then applied to enhance the stereology estimate. For the evaluation of this method, we used T1-weighted MR images from a digital phantom in BrainWeb, which provides ground truth. Results: The volumes of 20 normal brains extracted from BrainWeb were calculated. The volumes of white matter, gray matter and cerebrospinal fluid with given dimensions were estimated correctly. Volume calculations with the stereology method were made for different cases, and in three cases the Root Mean Square Error (RMSE) was measured: Case I with T=5, d=5; Case II with T=10, d=10; and Case III with T=20, d=20 (T = slice thickness, d = resolution, as stereology parameters). Comparing the results of the two methods, the RMSE values for our proposed method are smaller than those for the standard stereology method. Conclusion: The morphological dilation operation makes it possible to enhance the stereology volume estimation method. In cases with fewer MRI slices and fewer test points, the proposed method works much better than the standard stereology method.
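    The underlying stereology (Cavalieri) estimator is V = T · a_p · ΣP_i, with T the slice spacing, a_p the area per test point, and P_i the points counted on slice i. A minimal sketch with synthetic counts:

```python
# Sketch of the Cavalieri (stereology) volume estimate the paper builds on:
# V = T * a_p * sum(P_i). The slice counts below are synthetic, for illustration.
T = 5.0                      # slice spacing, mm
a_p = 2.0 * 2.0              # area associated with each grid test point, mm^2
points_per_slice = [0, 12, 41, 78, 90, 85, 60, 22, 3]   # counted test points P_i

volume_mm3 = T * a_p * sum(points_per_slice)
print(f"estimated volume = {volume_mm3:.0f} mm^3")       # 7820 mm^3 here
```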

  2. Innovated Conceptual Design of Loading Unloading Tool for Livestock at the Port

    Science.gov (United States)

    Mustakim, Achmad; Hadi, Firmanto

    2018-03-01

    The condition of the loading and unloading process of livestock in a number of Indonesian ports does not meet the principles of animal welfare, causing cattle to lose weight and suffer injuries when unloaded. Livestock loading and unloading is done by throwing cattle into the sea one by one, tying and hanging cattle with a sling strap, or pushing the cattle onto the berth directly. This practice is against Articles 47 and 55 of PP No. 82 of 2000 on animal welfare. The innovation offered is a loading and unloading design with a garbarata (boarding bridge). The design applies the concept of a semi-horizontal hydraulic ladder that connects the ship and the truck directly. This livestock unloading equipment design is a combination of a fire truck ladder design and a bridge equipped with lifting equipment. Over a 10-year planning horizon, the garbarata requires a total cost of IDR 321,142,921, yields benefits of IDR 923,352,333, and achieves a Benefit-Cost Ratio (BCR) of 2.88 (a quick check of this arithmetic appears below). A BCR value > 1 means the tool is economically feasible. The design of this loading and unloading tool is estimated to be up to 1 hour faster than the existing method. It can also significantly reduce risks such as injury and livestock weight loss.
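    The quick check of the reported figures, using only numbers from the abstract:

```python
# Verify the benefit-cost ratio reported above from the abstract's own figures.
total_cost_idr = 321_142_921
total_benefit_idr = 923_352_333

bcr = total_benefit_idr / total_cost_idr
print(f"BCR = {bcr:.2f}")   # ~2.88; BCR > 1, so the design is economically feasible
```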

  3. Real-time estimation of FLE for point-based registration

    Science.gov (United States)

    Wiles, Andrew D.; Peters, Terry M.

    2009-02-01

    In image-guided surgery, optimizing the accuracy of localizing the surgical tools within the virtual-reality environment or 3D image is vitally important, and significant effort has been spent reducing the measurement errors at the point of interest, or target. This target registration error (TRE) is often described by a root-mean-square statistic, which reduces the vector data to a single term that can be minimized. However, lost in this data reduction is the directionality of the error, which can be modelled using a 3D covariance matrix. Recently, we developed a set of expressions that model the TRE statistics for point-based registrations as a function of the fiducial marker geometry, the target location and the fiducial localizer error (FLE). Unfortunately, these expressions are only as good as the definition of the FLE. To close the gap, we have subsequently developed a closed-form expression that estimates the FLE as a function of the estimated fiducial registration error (FRE, the error between the measured fiducials and the best-fit locations of those fiducials). The FRE covariance matrix is estimated using a sliding-window technique and used as input to the closed-form expression to estimate the FLE. The estimated FLE can then be used to estimate the TRE, which can be given to the surgeon so that the procedure can be designed to minimize the errors associated with point-based registration.
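
    A minimal sketch of the sliding-window idea, assuming per-frame FRE residual vectors are available. The scalar relation <FLE^2> ~= <FRE^2> * N/(N-2) used below is the classic point-based registration approximation and stands in for the paper's fuller closed-form covariance expression.

```python
# Sliding-window estimation of the FRE covariance and an approximate FLE RMS.
from collections import deque
import numpy as np

N_FIDUCIALS = 4
window = deque(maxlen=50)     # sliding window of per-frame FRE residuals

def update_fle(fre_residuals):
    """fre_residuals: (N_FIDUCIALS, 3) residuals from the latest frame."""
    window.append(np.asarray(fre_residuals))
    stacked = np.concatenate(window, axis=0)      # pooled residual vectors
    fre_cov = np.cov(stacked, rowvar=False)       # 3x3 FRE covariance
    mean_fre_sq = np.trace(fre_cov)               # ~ <FRE^2> (zero-mean)
    fle_rms = np.sqrt(mean_fre_sq * N_FIDUCIALS / (N_FIDUCIALS - 2))
    return fre_cov, fle_rms
```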

  4. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end, a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), turning it into a user-friendly tool for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models, including the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface, and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple: a normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average
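
    For orientation, a toy computation of the S/N statistic such tools report - peak defect-signal amplitude over the RMS of the grain noise - on synthetic stand-in waveforms (the real engine derives both quantities from the physical models named above):

```python
# Illustrative S/N computation on synthetic waveforms.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10e-6, 2000)                   # 10 us record
noise = 0.05 * rng.standard_normal(t.size)          # grain noise (a.u.)
defect = 0.4 * np.exp(-((t - 5e-6) / 0.2e-6)**2)    # defect echo envelope
signal = defect + noise

snr = signal.max() / np.sqrt(np.mean(noise**2))     # peak / noise RMS
print(f"S/N = {snr:.1f}  ({20*np.log10(snr):.1f} dB)")
```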

  5. Reduction of soil erosion on forest roads

    Science.gov (United States)

    Edward R. Burroughs; John G. King

    1989-01-01

    Presents the expected reduction in surface erosion from selected treatments applied to forest road traveledways, cutslopes, fillslopes, and ditches. Estimated erosion reduction is expressed as functions of ground cover, slope gradient, and soil properties whenever possible. A procedure is provided to select rock riprap size for protection of the road ditch.

  6. Design features and cost reduction potential of JSFR

    International Nuclear Information System (INIS)

    Katoh, Atsushi; Hayafune, Hiroki; Kotake, Shoji

    2014-01-01

    Highlights: • Japan Sodium-cooled Fast Reactor (JSFR) is designed to reduce plant commodity. • Cost reduction effectiveness of innovative designs is estimated by a bottom-up method. • JSFR achieves a 76% construction cost reduction compared with Monju through design effort. • Commercial JSFR construction cost could be less than that of a conventional LWR. - Abstract: To improve the economic competitiveness of the Japan Sodium-cooled Fast Reactor (JSFR), several innovative designs have been introduced, e.g. a reduced number of main cooling loops, a shorter pipe arrangement enabled by adopting a thermally durable material (high-chromium ferrite steel), a compact reactor vessel (RV), and integration of the primary pump with the intermediate heat exchanger (IHX). Since these had not been introduced in past or existing reactors, a new approach to construction cost estimation was introduced to handle innovative technologies, covering, for example, different kinds of material, fabrication processes of equipment, etc. According to JSFR construction cost estimates based on the new method and the latest conceptual JSFR design, the economic goals of Generation IV nuclear energy systems can be achieved through the following cost reduction effects: commodity reduction by adopting innovative designs, economy of scale from increased power generation, learning effects, etc. The quantitative analysis shows that the feasibility of the innovative designs is essential for the economic competitiveness of JSFR.

  7. Design features and cost reduction potential of JSFR

    Energy Technology Data Exchange (ETDEWEB)

    Katoh, Atsushi, E-mail: kato.atsushi@jaea.go.jp [Japan Atomic Energy Agency, 4002 Narita, Oarai-machi, Higashi-ibaraki-gun, Ibaraki-ken 311-1393 (Japan); Hayafune, Hiroki [Japan Atomic Energy Agency, 4002 Narita, Oarai-machi, Higashi-ibaraki-gun, Ibaraki-ken 311-1393 (Japan); Kotake, Shoji [The Japan Atomic Power Company, 1-1 Kanda-midoricyo, Chiyoda-ku, Tokyo-to 101-0053 (Japan)

    2014-12-15

    Highlights: • Japan Sodium-cooled Fast Reactor (JSFR) is designed to reduce plant commodity. • Cost reduction effectiveness of innovative designs is estimated by a bottom-up method. • JSFR achieves a 76% construction cost reduction compared with Monju through design effort. • Commercial JSFR construction cost could be less than that of a conventional LWR. - Abstract: To improve the economic competitiveness of the Japan Sodium-cooled Fast Reactor (JSFR), several innovative designs have been introduced, e.g. a reduced number of main cooling loops, a shorter pipe arrangement enabled by adopting a thermally durable material (high-chromium ferrite steel), a compact reactor vessel (RV), and integration of the primary pump with the intermediate heat exchanger (IHX). Since these had not been introduced in past or existing reactors, a new approach to construction cost estimation was introduced to handle innovative technologies, covering, for example, different kinds of material, fabrication processes of equipment, etc. According to JSFR construction cost estimates based on the new method and the latest conceptual JSFR design, the economic goals of Generation IV nuclear energy systems can be achieved through the following cost reduction effects: commodity reduction by adopting innovative designs, economy of scale from increased power generation, learning effects, etc. The quantitative analysis shows that the feasibility of the innovative designs is essential for the economic competitiveness of JSFR.

  8. Building the evidence base for stigma and discrimination-reduction programming in Thailand: development of tools to measure healthcare stigma and discrimination

    Directory of Open Access Journals (Sweden)

    Kriengkrai Srithanaviboonchai

    2017-03-01

    Full Text Available Abstract Background HIV-related stigma and discrimination (S&D) are recognized as key impediments to controlling the HIV epidemic. S&D are particularly detrimental within health care settings because people who are at risk of HIV and people living with HIV (PLHIV) must seek services from health care facilities. Standardized tools and monitoring systems are needed to inform S&D reduction efforts, measure progress, and monitor trends. This article describes the processes followed to adapt and refine a standardized global health facility staff S&D questionnaire for the context of Thailand and to develop a similar questionnaire measuring health facility stigma experienced by PLHIV. Both questionnaires are currently being used for the routine monitoring of HIV-related S&D in the Thai healthcare system. Methods The questionnaires were adapted through a series of consultative meetings, pre-testing, and revision. The revised questionnaires then underwent field testing, and the data and field experiences were analyzed. Results Two brief questionnaires were finalized and are now being used by the Department of Disease Control to collect national routine data for monitoring health facility S&D: (1) a health facility staff questionnaire that collects data on key drivers of S&D in health facilities (i.e., fear of HIV infection, attitudes toward PLHIV and key populations, and health facility policy and environment) and observed enacted stigma, and (2) a brief PLHIV questionnaire that captures data on experienced discriminatory practices at health care facilities. Conclusions This effort provides an example of how a country can adapt global S&D measurement tools to a local context for use in national routine monitoring. Such data help to strengthen the national response to HIV through the provision of evidence to shape S&D-reduction programming.

  9. Simulated dose reduction by adding artificial noise to measured raw data: A validation study

    International Nuclear Information System (INIS)

    Soederberg, M.; Gunnarsson, M.; Nilsson, M.

    2010-01-01

    The purpose of this study was to verify and validate a noise simulation tool called Dose Tutor (VAMP GmbH) in terms of level and texture of the simulated noise. By adding artificial noise to measured computed tomography (CT) raw data, a scan acquired with a lower dose (mAs) than the actual one can be simulated. A homogeneous polyethylene phantom and an anthropomorphic chest phantom were scanned for different mAs levels, tube voltages, slice thicknesses and reconstruction kernels. The simulated noise levels were compared with the noise levels in real transverse slice images actually acquired with corresponding mAs values. In general, the noise comparisons showed acceptable agreement in magnitude (<20% deviation in pixel standard deviation). Also, the calculated noise power spectra were similar, which indicates that the noise texture is correctly reproduced. In conclusion, this study establishes that the Dose Tutor might be a useful tool for estimating the dose reduction potential for CT protocols. (authors)
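
    The basic principle behind this kind of simulation can be sketched as follows (the commercial tool's exact noise model is more elaborate and proprietary): quantum noise variance in the raw data scales roughly as 1/mAs, so a lower-dose scan is emulated by adding zero-mean Gaussian noise of the appropriate width.

```python
# Emulating a lower-mAs scan by adding noise to measured raw data.
import numpy as np

def simulate_lower_dose(raw, sigma_ref, mAs_ref, mAs_sim, seed=0):
    """sigma_add = sigma_ref * sqrt(mAs_ref/mAs_sim - 1)."""
    assert mAs_sim < mAs_ref, "can only simulate a dose *reduction*"
    sigma_add = sigma_ref * np.sqrt(mAs_ref / mAs_sim - 1.0)
    rng = np.random.default_rng(seed)
    return raw + rng.normal(0.0, sigma_add, size=raw.shape)

# e.g. emulate a 50 mAs scan from a 200 mAs acquisition:
# noisy = simulate_lower_dose(raw, sigma_ref=1.0, mAs_ref=200, mAs_sim=50)
```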

  10. How accurate are adolescents in portion-size estimation using the computer tool Young Adolescents' Nutrition Assessment on Computer (YANA-C)?

    Science.gov (United States)

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-06-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amounts of ten commonly consumed foods (breakfast cereals, French fries, pasta, rice, apple sauce, carrots and peas, crisps, creamy velouté, red cabbage, and peas). Two procedures were followed: (1) short-term recall: adolescents (n = 73) self-served their usual portions of the ten foods and estimated the amounts later the same day; (2) real-time perception: adolescents (n = 128) estimated two sets (different portions) of pre-weighed portions displayed near the computer. Self-served portions were, on average, 8 % underestimated; significant underestimates were found for breakfast cereals, French fries, peas, and carrots and peas. Spearman's correlations between the self-served and estimated weights varied between 0.51 and 0.84, with an average of 0.72. The kappa statistics were moderate (>0.4) for all but one item. Pre-weighed portions were, on average, 15 % underestimated, with significant underestimates for fourteen of the twenty portions. Photographs of food items can serve as a good aid in ranking subjects; however, to assess actual intake at a group level, underestimation must be considered.

  11. Design and validation of new genotypic tools for easy and reliable estimation of HIV tropism before using CCR5 antagonists.

    Science.gov (United States)

    Poveda, Eva; Seclén, Eduardo; González, María del Mar; García, Federico; Chueca, Natalia; Aguilera, Antonio; Rodríguez, Jose Javier; González-Lahoz, Juan; Soriano, Vincent

    2009-05-01

    Genotypic tools may allow easier and less expensive estimation of HIV tropism before prescription of CCR5 antagonists compared with the Trofile assay (Monogram Biosciences, South San Francisco, CA, USA). Paired genotypic and Trofile results were compared in plasma samples derived from the maraviroc expanded access programme (EAP) in Europe. A new genotypic approach was built to improve the sensitivity to detect X4 variants, based on an optimization of the webPSSM algorithm. The new tool was then validated on specimens from patients included in the ALLEGRO trial, a multicentre study conducted in Spain to assess the prevalence of R5 variants in treatment-experienced HIV patients. A total of 266 specimens from the maraviroc EAP were tested. Overall geno/pheno concordance was above 72%. A high specificity was generally seen for the detection of X4 variants using genotypic tools (ranging from 58% to 95%), while sensitivity was low (ranging from 31% to 76%). The PSSM score was then optimized to enhance the sensitivity to detect X4 variants by changing the original threshold for R5 categorization. The new PSSM algorithms, PSSM(X4R5-8) and PSSM(SINSI-6.4), considered as X4 all V3 scores above -8 or -6.4, respectively, increasing the sensitivity to detect X4 variants up to 80%. The new algorithms were then validated on 148 specimens derived from patients included in the ALLEGRO trial. The sensitivity/specificity to detect X4 variants was 93%/69% for PSSM(X4R5-8) and 93%/70% for PSSM(SINSI-6.4). PSSM(X4R5-8) and PSSM(SINSI-6.4) may confidently assist therapeutic decisions on using CCR5 antagonists in HIV patients, providing an easier and more rapid estimation of tropism in clinical samples.
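
    The reported decision rule reduces to a simple threshold on the V3-loop PSSM score. In the sketch below, the scoring function itself (webPSSM) is assumed to be available elsewhere; only the cut-off logic described in the abstract is shown.

```python
# Thresholding a precomputed PSSM score into an R5/X4 tropism call.
def predict_tropism(pssm_score: float, cutoff: float = -8.0) -> str:
    """PSSM(X4R5-8) rule: score > -8 => X4, otherwise R5.
    Use cutoff=-6.4 for the PSSM(SINSI-6.4) variant."""
    return "X4" if pssm_score > cutoff else "R5"

print(predict_tropism(-3.2))   # -> X4
print(predict_tropism(-11.0))  # -> R5
```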

  12. APPROACHES AND TOOLS FOR QUALITY EXAMINATION OF E-LEARNING TOOLS

    Directory of Open Access Journals (Sweden)

    Galina P. Lavrentieva

    2011-02-01

    Full Text Available The article highlights the scientific and methodological approaches to the quality examination of e-learning tools for general education. The terms used in the research are defined, and the essence of the main components and stages of the examination is described. A methodology for elaborating quality estimation tools is described, based on identifying evaluation criteria and parameters. A complex of psycho-pedagogical and ergonomic requirements that should be used in organizing the expertise is justified, and the most expedient ways of implementing them are examined.

  13. PyCoTools: A Python Toolbox for COPASI.

    Science.gov (United States)

    Welsh, Ciaran M; Fullard, Nicola; Proctor, Carole J; Martinez-Guimera, Alvaro; Isfort, Robert J; Bascom, Charles C; Tasseff, Ryan; Przyborski, Stefan A; Shanley, Daryl P

    2018-05-22

    COPASI is an open source software package for constructing, simulating and analysing dynamic models of biochemical networks. COPASI is primarily intended to be used with a graphical user interface, but it is often desirable to access COPASI features programmatically through a high-level interface. PyCoTools is a Python package aimed at providing a high-level interface to COPASI tasks with an emphasis on model calibration. PyCoTools enables the construction of COPASI models and the execution of a subset of COPASI tasks, including time courses, parameter scans and parameter estimations. Additional 'composite' tasks which use COPASI tasks as building blocks are available for increasing parameter estimation throughput, performing identifiability analysis and performing model selection. PyCoTools supports exploratory data analysis on parameter estimation data to assist with troubleshooting model calibrations. We demonstrate PyCoTools by posing a model selection problem designed to showcase PyCoTools within a realistic scenario. The aim of the model selection problem is to test the feasibility of three alternative hypotheses in explaining experimental data derived from neonatal dermal fibroblasts in response to TGF-β over time. PyCoTools is used to critically analyse the parameter estimations and propose strategies for model improvement. PyCoTools can be downloaded from the Python Package Index (PyPI) using the command 'pip install pycotools' or directly from GitHub (https://github.com/CiaranWelsh/pycotools). Documentation is available at http://pycotools.readthedocs.io. Supplementary data are available at Bioinformatics online.

  14. A tool for safety evaluations of road improvements.

    Science.gov (United States)

    Peltola, Harri; Rajamäki, Riikka; Luoma, Juha

    2013-11-01

    Road safety impact assessments are requested in general, and the directive on road infrastructure safety management makes them compulsory for Member States of the European Union. However, no widely used, science-based safety evaluation tool is available. We demonstrate a safety evaluation tool called TARVA. It uses empirical Bayes (EB) safety predictions as the basis for selecting locations for implementing road-safety improvements and provides estimates of the safety benefits of the selected improvements. Comparing different road accident prediction methods, we demonstrate that the most accurate estimates are produced by EB models, followed by simple accident prediction models, then by assuming the same average number of accidents for every entity, and finally by the accident record alone. Consequently, advanced model-based estimates should be used. Furthermore, we demonstrate regional comparisons that benefit substantially from such tools. Comparisons between districts have revealed significant differences; however, comparisons like these produce useful improvement ideas only after taking into account the differences in road characteristics between areas. Estimates of crash modification factors can be transferred from other countries, but their benefit is greatly limited if the number of target accidents is not properly predicted. Our experience suggests that making predictions and evaluations using the same principles and tools will remarkably improve the quality and comparability of safety estimations. Copyright © 2013 Elsevier Ltd. All rights reserved.
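
    For readers unfamiliar with EB estimation, the core calculation combines a model prediction with the site's observed count. The sketch below uses the standard negative-binomial weighting; the numbers are illustrative, not TARVA's fitted models.

```python
# Empirical Bayes expected-accident estimate for one site.
def eb_expected_accidents(mu: float, x: int, k: float) -> float:
    """Negative-binomial safety performance function:
    weight w = k / (k + mu); EB = w*mu + (1-w)*x."""
    w = k / (k + mu)
    return w * mu + (1.0 - w) * x

# A site where the model predicts 2.0 accidents but 5 were recorded:
print(eb_expected_accidents(mu=2.0, x=5, k=1.5))  # falls between 2.0 and 5.0
```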

  15. How many holes is too many? A prototype tool for estimating mosquito entry risk into damaged bed nets.

    Science.gov (United States)

    Sutcliffe, James; Ji, Xin; Yin, Shaoman

    2017-08-01

    Insecticide-treated bed nets (ITNs) have played an integral role in malaria reduction, but how insecticide depletion and accumulating physical damage affect ITN performance is poorly understood. More accurate methods are needed to assess damage to bed nets so that they can be designed, deployed and replaced optimally. Video recordings of female Anopheles gambiae in near approach (1-½ cm) to occupied untreated rectangular bed nets in a laboratory study were used to quantify the amount of mosquito activity (appearances over time) around different parts of the net, the per-appearance probability of a mosquito coming close to holes of different sizes (hole encounter), and the per-encounter probability of mosquitoes passing through holes of different sizes (hole passage). Appearance frequency on different parts of the net reflected previously reported patterns: the area of the net under greatest mosquito pressure was the roof, followed by the bottom 30 cm of the sides, followed by the 30 cm area immediately above this, followed by the upper two-thirds of the sides; the ratio of activity in these areas was, respectively, 250:33:5:1. The per-appearance probability of hole encounter on all parts of the net was strongly predicted by a factor combining hole perimeter and area. The per-encounter probability of hole passage, in turn, was strongly predicted by hole width. For a given width, there was a 20% greater risk of passage through holes on the roof than through holes on the sides. The appearance, encounter and passage predictors correspond to various previously described mosquito behaviours and are combined into a prototype mosquito entry risk tool that predicts mosquito entry rates for nets with various amounts of damage. Scenarios that use the entry risk tool to test the recommendations of the WHOPES proportionate hole index (pHI) suggest that the pHI hole size categories, and its failure to account for hole location, likely sometimes lead to incorrect conclusions about net
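
    A hedged sketch of how the three fitted ingredients might be combined into an entry-risk score: expected entries ~ sum over holes of (region activity) x P(encounter | hole size) x P(passage | hole width). The predictor functions and coefficients below are invented stand-ins for the relationships fitted from the video data; only the 250:33:5:1 activity ratio and the ~20% roof excess come from the abstract.

```python
# Combining region activity, encounter and passage into an entry-risk score.
REGION_ACTIVITY = {"roof": 250, "bottom": 33, "middle": 5, "upper": 1}

def p_encounter(perimeter_cm, area_cm2):
    return min(1.0, 0.002 * (perimeter_cm + area_cm2))   # hypothetical fit

def p_passage(width_cm, region):
    base = min(1.0, 0.1 * width_cm)                      # hypothetical fit
    return base * (1.2 if region == "roof" else 1.0)     # ~20% roof excess

def entry_risk(holes):
    """holes: list of (region, perimeter_cm, area_cm2, width_cm)."""
    return sum(REGION_ACTIVITY[r] * p_encounter(p, a) * p_passage(w, r)
               for r, p, a, w in holes)

print(entry_risk([("roof", 10, 6, 2), ("bottom", 20, 25, 5)]))
```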

  16. Data Reduction with Quantization Constraints for Decentralized Estimation in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yang Weng

    2014-01-01

    Full Text Available The problem of estimating an unknown vector with a bandwidth-constrained wireless sensor network is considered. In such networks, sensor nodes make distributed observations of the unknown vector and collaborate with a fusion center to generate a final estimate. Due to power and communication bandwidth limitations, each sensor node must compress its data before transmitting to the fusion center. In this paper, both centralized and decentralized estimation frameworks are developed. A closed-form solution for the centralized estimation framework is proposed. The decentralized estimation problem is proven to be NP-hard, and a Gauss-Seidel algorithm to search for an optimal solution is proposed. Simulation results show the good performance of the proposed algorithms.

  17. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions, providing a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, can estimate the probability of failure of components under varying loading and environmental conditions. It performs sensitivity analysis of all input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, together with a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS and the interface, and illustrates the stepwise process the interface uses by means of an example.

  18. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions, providing a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, can estimate the probability of failure of components under varying loading and environmental conditions. It performs sensitivity analysis of all input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, together with a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS and the interface, and illustrates the stepwise process the interface uses by means of an example.

  19. LDRD report nonlinear model reduction

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, D.; Heinstein, M.

    1997-09-01

    The very general problem of model reduction of nonlinear systems was made tractable by focusing on the very large subclass consisting of linear subsystems connected by nonlinear interfaces. Such problems constitute a large part of the nonlinear structural problems encountered in addressing the Sandia missions. A synthesis approach to this class of problems was developed consisting of: detailed modeling of the interface mechanics; collapsing the interface simulation results into simple nonlinear interface models; constructing system models by assembling model approximations of the linear subsystems and the nonlinear interface models. These system models, though nonlinear, would have very few degrees of freedom. A paradigm problem, that of machine tool vibration, was selected for application of the reduction approach outlined above. Research results achieved along the way as well as the overall modeling of a specific machine tool have been very encouraging. In order to confirm the interface models resulting from simulation, it was necessary to develop techniques to deduce interface mechanics from experimental data collected from the overall nonlinear structure. A program to develop such techniques was also pursued with good success.

  20. Reducing uncertainty of estimated nitrogen load reductions to aquatic systems through spatially targeting agricultural mitigation measures using groundwater nitrogen reduction

    DEFF Research Database (Denmark)

    Hashemi, Fatemeh; Olesen, Jørgen Eivind; Jabloun, Mohamed

    2018-01-01

    The need to further abate agricultural nitrate (N) loadings to coastal waters in Denmark represents the main driver for development of a new spatially targeted regulation that focuses on locating N-mitigation measures in agricultural areas with high N-load. This targeting makes use of the spatial variation across the landscape in natural N-reduction (denitrification) of leached nitrate in the groundwater and surface water systems. A critical basis for including spatial targeting in the regulation of N-load in Denmark is the uncertainty associated with the effect of spatially targeted measures, since the effect will be critically affected by uncertainty in the quantification of the spatial variation in N-reduction. In this study, we used 30 equally plausible N-reduction maps, at 100 m grid and sub-catchment resolutions, for the 85-km2 groundwater-dominated Norsminde catchment in Denmark, applying set…

  1. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used to validate the tool's correctness. A very close correspondence between the performance of the field trial and that predicted by the modeling tool has been observed.
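
    A generic relation often used in such pre-FEC BER estimators (not necessarily this tool's exact model) maps a Q-factor to BER via the complementary error function:

```python
# Pre-FEC BER from a Q-factor: BER = 0.5 * erfc(Q / sqrt(2)).
import math

def ber_from_q(q_linear: float) -> float:
    return 0.5 * math.erfc(q_linear / math.sqrt(2.0))

q_db = 8.5                          # Q-factor in dB
q_lin = 10 ** (q_db / 20.0)         # Q_dB = 20*log10(Q)
print(f"estimated pre-FEC BER = {ber_from_q(q_lin):.2e}")
```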

  2. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia; Harmandaris, Vagelis; Katsoulakis, Markos A.; Plechac, Petr

    2015-01-01

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse-grained dynamics.

  3. REDUCTIVE DEHALOGENATION OF HALOMETHANES IN NATURAL AND MODEL SYSTEMS: QSAR ANALYSIS

    Science.gov (United States)

    Reductive dehalogenation is a dominant reaction pathway for halogenated organics in anoxic environments. Towards the goal of developing predictive tools for this reaction process, the reduction kinetics for a series of halomethanes were measured in batch studies with both natural...

  4. EnergiTools(R) - a power plant performance monitoring and diagnosis tool

    International Nuclear Information System (INIS)

    Ancion, P.V.; Bastien, R.; Ringdahl, K.

    2000-01-01

    Westinghouse EnergiTools(R) is a performance diagnostic tool for power generation plants that combines the power of on-line process data acquisition with advanced diagnostic methodologies. The system uses analytical models based on thermodynamic principles combined with the knowledge of component diagnostic experts. An issue in modeling expert knowledge is having a framework that can represent and process uncertainty in complex systems, where it is nearly impossible to build deterministic models of the effects of faults on symptoms. A methodology based on causal probabilistic graphs, more specifically on Bayesian belief networks, has been implemented in EnergiTools(R) to capture the fault-symptom relationships. The methodology estimates the likelihood of the various component failures using the fault-symptom relationships. The system also has the ability to use neural networks for processes that are difficult to model analytically; an application is the estimation of reactor power in a nuclear power plant by interpreting several plant indicators. EnergiTools(R) is used for on-line performance monitoring and diagnostics at Vattenfall's Ringhals nuclear power plants in Sweden and has led to the diagnosis of various performance issues with plant components. Two case studies are presented. In the first case, an overestimate of the thermal power due to a faulty instrument was found, which had led to plant operation below its optimal power; the paper shows how the problem was discovered using the analytical thermodynamic calculations. The second case shows an application of EnergiTools(R) to the diagnosis of a condenser failure using causal probabilistic graphs.
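
    The fault-symptom reasoning can be illustrated with a single-symptom discrete Bayes update; the product uses full Bayesian belief networks, and the priors and likelihoods below are invented for illustration.

```python
# Posterior fault probabilities given one observed symptom, via Bayes' rule.
priors = {"fouled_condenser": 0.05, "faulty_sensor": 0.02, "ok": 0.93}
# P(symptom "low vacuum" observed | fault state):
likelihood = {"fouled_condenser": 0.90, "faulty_sensor": 0.40, "ok": 0.01}

evidence = sum(priors[f] * likelihood[f] for f in priors)
posterior = {f: priors[f] * likelihood[f] / evidence for f in priors}
for fault, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"P({fault} | low vacuum) = {p:.3f}")
```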

  5. Towards a Novel Integrated Approach for Estimating Greenhouse Gas Emissions in Support of International Agreements

    Science.gov (United States)

    Reimann, S.; Vollmer, M. K.; Henne, S.; Brunner, D.; Emmenegger, L.; Manning, A.; Fraser, P. J.; Krummel, P. B.; Dunse, B. L.; DeCola, P.; Tarasova, O. A.

    2016-12-01

    In the recently adopted Paris Agreement the community of signatory states has agreed to limit the future global temperature increase between +1.5 °C and +2.0 °C, compared to pre-industrial times. To achieve this goal, emission reduction targets have been submitted by individual nations (called Intended Nationally Determined Contributions, INDCs). Inventories will be used for checking progress towards these envisaged goals. These inventories are calculated by combining information on specific activities (e.g. passenger cars, agriculture) with activity-related, typically IPCC-sanctioned, emission factors - the so-called bottom-up method. These calculated emissions are reported on an annual basis and are checked by external bodies by using the same method. A second independent method estimates emissions by translating greenhouse gas measurements made at regionally representative stations into regional/global emissions using meteorologically-based transport models. In recent years this so-called top-down approach has been substantially advanced into a powerful tool and emission estimates at the national/regional level have become possible. This method is already used in Switzerland, in the United Kingdom and in Australia to estimate greenhouse gas emissions and independently support the national bottom-up emission inventories within the UNFCCC framework. Examples of the comparison of the two independent methods will be presented and the added-value will be discussed. The World Meteorological Organization (WMO) and partner organizations are currently developing a plan to expand this top-down approach and to expand the globally representative GAW network of ground-based stations and remote-sensing platforms and integrate their information with atmospheric transport models. This Integrated Global Greenhouse Gas Information System (IG3IS) initiative will help nations to improve the accuracy of their country-based emissions inventories and their ability to evaluate the

  6. A Useful Tool for Atmospheric Correction and Surface Temperature Estimation of Landsat Infrared Thermal Data

    Science.gov (United States)

    Rivalland, Vincent; Tardy, Benjamin; Huc, Mireille; Hagolle, Olivier; Marcq, Sébastien; Boulet, Gilles

    2016-04-01

    Land surface temperature (LST) is a critical variable for studying the energy and water budgets at the Earth's surface, and is a key component of many aspects of climate research and services. The Landsat program, jointly carried out by NASA and USGS, has been providing thermal infrared data for 40 years, but no associated LST product has yet been routinely provided to the community. To derive LST values, radiances measured at sensor level need to be corrected for atmospheric absorption, atmospheric emission and the surface emissivity effect. Until now, existing LST products have been generated with multi-channel methods such as the Temperature/Emissivity Separation (TES) algorithm adapted to ASTER data or the generalized split-window algorithm adapted to MODIS multispectral data. Those approaches are ill-adapted to the single thermal channel of Landsat. The atmospheric correction methodology usually used for Landsat data requires detailed information about the state of the atmosphere. This information may be obtained from radio-sounding or atmospheric reanalysis models and is supplied to a radiative transfer model in order to estimate atmospheric parameters for a given coordinate. In this work, we present a new automatic tool dedicated to Landsat thermal data correction, which improves the common atmospheric correction methodology by introducing the spatial dimension into the process. The Python tool developed during this study, named LANDARTs (LANDsat Automatic Retrieval of surface Temperature), is fully automatic and provides atmospheric corrections for a whole Landsat tile. Vertical atmospheric conditions are downloaded from the ERA Interim dataset of the ECMWF, which provides them at 0.125-degree resolution, at a global scale and with a 6-hour time step. The atmospheric correction parameters are estimated on the atmospheric grid using the commercial software MODTRAN and then interpolated to 30 m resolution. We detail the processing steps.
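
    The core retrieval step that such a tool automates can be sketched as a single-channel inversion of the radiative transfer equation. K1/K2 below are the published Landsat 8 TIRS band 10 calibration constants; the atmospheric terms are placeholders for values LANDARTs would interpolate from MODTRAN/ERA Interim output.

```python
# Single-channel LST retrieval: atmospheric correction + inverse Planck.
import numpy as np

K1, K2 = 774.8853, 1321.0789        # W/(m^2 sr um) and K, TIRS band 10

def surface_temperature(L_sensor, tau, L_up, L_down, emissivity):
    """Invert L_sensor = tau*(eps*B(Ts) + (1-eps)*L_down) + L_up."""
    B_Ts = (L_sensor - L_up - tau * (1 - emissivity) * L_down) / (tau * emissivity)
    return K2 / np.log(K1 / B_Ts + 1.0)     # inverse Planck, in kelvin

# Placeholder atmospheric parameters (normally from MODTRAN + ERA Interim):
print(surface_temperature(L_sensor=10.5, tau=0.85, L_up=1.2,
                          L_down=2.0, emissivity=0.98))   # ~310 K
```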

  7. Geospatial tools effectively estimate nonexceedance probabilities of daily streamflow at ungauged and intermittently gauged locations in Ohio

    Directory of Open Access Journals (Sweden)

    William H. Farmer

    2017-10-01

    New hydrological insights for the region: Several methods for estimating nonexceedance probabilities of daily mean streamflows are explored, including single-index methodologies (nearest-neighbor index) and geospatial tools (kriging and topological kriging). These methods were evaluated by conducting leave-one-out cross-validations based on analyses of nearly 7 years of daily streamflow data from 79 unregulated streamgages in Ohio and neighboring states. The pooled, ordinary kriging model, with a median Nash–Sutcliffe performance of 0.87, was superior to the single-site index methods, though there was some bias in the tails of the probability distribution. Incorporating network structure through topological kriging did not improve performance. The pooled, ordinary kriging model was applied to 118 locations without systematic streamgaging across Ohio where instantaneous streamflow measurements had been made concurrent with water-quality sampling on at least 3 separate days. Spearman rank correlations between estimated nonexceedance probabilities and measured streamflows were high, with a median value of 0.76. In consideration of application, the degree of regulation in a set of sample sites helped to specify the streamgages required to implement kriging approaches successfully.

  8. Optimal Wavelength Selection in Ultraviolet Spectroscopy for the Estimation of Toxin Reduction Ratio during Hemodialysis

    Directory of Open Access Journals (Sweden)

    Amir Ghanifar

    2016-06-01

    Full Text Available Introduction: The concentrations of substances, including urea, creatinine, and uric acid, can be used as an index to measure toxic uremic solutes in the blood during dialysis and interdialytic intervals. The on-line monitoring of toxin concentration allows for the clearance measurement of some low-molecular-weight solutes at any time during hemodialysis. The aim of this study was to determine the optimal wavelengths for estimating the changes in urea, creatinine, and uric acid in dialysate, using ultraviolet (UV) spectroscopy. Materials and Methods: In this study, nine uremic patients were investigated using on-line spectrophotometry. The on-line absorption measurements (UV radiation) were performed with a spectrophotometer module connected to the fluid outlet of the dialysis machine. Dialysate samples were obtained and analyzed using standard biochemical methods. Optimal wavelengths for both creatinine and uric acid were selected using a combination of genetic algorithms (GAs), i.e., GA-partial least squares (GA-PLS) and interval partial least squares (iPLS). Results: An artificial neural network (ANN) sensitivity analysis determined the wavelengths of the UV band most suitable for estimating the concentrations of creatinine and uric acid. The two optimal wavelengths were 242 and 252 nm for creatinine and 295 and 298 nm for uric acid. Conclusion: It can be concluded that the reduction ratios of creatinine and uric acid (dialysis efficiency) can be continuously monitored during hemodialysis by UV spectroscopy. Compared to the conventional method, which is particularly sensitive to the sampling technique and involves post-dialysis blood sampling, iterative measurements throughout the dialysis session can yield more reliable data.
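
    The monitored quantity is the standard reduction ratio, RR = (C_pre - C_post) / C_pre x 100 %; with calibrated UV absorbance serving as a concentration surrogate, it can be computed continuously, e.g.:

```python
# Reduction ratio from pre- and post-dialysis concentrations.
def reduction_ratio(c_pre: float, c_post: float) -> float:
    return (c_pre - c_post) / c_pre * 100.0

# e.g. creatinine falling from 8.0 to 2.4 mg/dL over a session:
print(f"RR = {reduction_ratio(8.0, 2.4):.0f}%")   # -> 70%
```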

  9. Estimated medical cost reductions for paliperidone palmitate vs placebo in a randomized, double-blind relapse-prevention trial of patients with schizoaffective disorder.

    Science.gov (United States)

    Joshi, K; Lin, J; Lingohr-Smith, M; Fu, D J

    2015-01-01

    The objective of this economic model was to estimate the difference in medical costs among patients treated with paliperidone palmitate once-monthly injectable antipsychotic (PP1M) vs placebo, based on clinical event rates reported in the 15-month randomized, double-blind, placebo-controlled, parallel-group study of paliperidone palmitate evaluating time to relapse in subjects with schizoaffective disorder. Rates of psychotic, depressive, and/or manic relapses and of serious and non-serious treatment-emergent adverse events (TEAEs) were obtained from the long-term paliperidone palmitate vs placebo relapse-prevention study. The total annual medical cost for a relapse, from a US payer perspective, was obtained from published literature, and the costs for serious and non-serious TEAEs were based on Common Procedure Terminology codes. Total annual medical cost differences for patients treated with PP1M vs placebo were then estimated. Additionally, one-way and Monte Carlo sensitivity analyses were conducted. Lower rates of relapse (-18.3%) and serious TEAEs (-3.9%) were associated with use of PP1M vs placebo, as reported in the long-term relapse-prevention study. As a result of the reduction in these clinical event rates, the total annual medical cost was reduced by $7140 per patient treated with PP1M vs placebo. One-way sensitivity analysis showed that variations in relapse rates had the greatest impact on the estimated medical cost differences (range: -$9786, -$4670), and 100% of the 10,000 random cycles of Monte Carlo simulations showed a medical cost difference favoring PP1M. Use of PP1M in patients with schizoaffective disorder was associated with a significantly lower rate of relapse and a reduction in medical costs compared to placebo. Further evaluation in the real-world setting is warranted.

  10. Effects of gamma irradiation, pH-Reduction and AW-reduction on the shelf-life of chilled 'tenderloin rolls'

    International Nuclear Information System (INIS)

    Farkas, J.; Andrassy, E.

    1993-01-01

    Experimental batches of a refrigerated, vacuum-packaged, ready-to-fry, minced meat product, 'tenderloin rolls', were preserved by combinations of a reduction of pH from 6.1 to 5.6 by ascorbic acid, a reduction of the water activity from aw = 0.975 to 0.962 by sodium lactate, and/or a radiation dose of 2 kGy. Storage of the untreated and irradiated samples at +2 °C for 4 weeks was followed by one week of incubation at +10 °C. Total plate counts and counts of presumptive lactobacilli, Enterobacteriaceae and sulphite-reducing clostridia were estimated at weekly intervals. pH changes during storage were also followed, and comparative estimations of sensory quality, thiamine content, and TBA values were performed. The results demonstrated the possibility of significantly extending the shelf-life of the chilled product - without compromising microbiological safety - by the sensorially acceptable radiation dose in combination with slight reductions of the pH and the water activity. (orig.)

  11. Parameter Estimation in Stochastic Grey-Box Models

    DEFF Research Database (Denmark)

    Kristensen, Niels Rode; Madsen, Henrik; Jørgensen, Sten Bay

    2004-01-01

    An efficient and flexible parameter estimation scheme for grey-box models, in the sense of discretely, partially observed Itô stochastic differential equations with measurement noise, is presented along with a corresponding software implementation. The estimation scheme is based on the extended Kalman filter and features maximum likelihood as well as maximum a posteriori estimation on multiple independent data sets, including irregularly sampled data sets and data sets with occasional outliers and missing observations. The software implementation is compared to an existing software tool and proves to have better performance both in terms of quality of estimates for nonlinear systems with significant diffusion and in terms of reproducibility. In particular, the new tool provides more accurate and more consistent estimates of the parameters of the diffusion term.

  12. Drinking Water Consequences Tools. A Literature Review

    Energy Technology Data Exchange (ETDEWEB)

    Pasqualini, Donatella [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    In support of the goals of the Department of Homeland Security's (DHS) National Protection and Programs Directorate and the Federal Emergency Management Agency, the DHS Office of Science and Technology is seeking to develop and/or modify consequence assessment tools to enable drinking water system owners/operators to estimate the societal and economic consequences of drinking water disruption due to threats and hazards. This work will expand the breadth of consequence estimation methods and tools using the best available data describing water distribution infrastructure, owner/asset-level economic losses, regional-scale economic activity, and health. In addition, this project will deploy the consequence methodology and capability within a Web-based platform. This report supports the DHS effort by providing a literature review of existing tools for assessing the consequences of disruptions to water and wastewater systems. The review includes tools that assess water system resilience, vulnerability, and risk. This will help in understanding the gaps and limitations of these tools in order to plan the development of the next-generation consequence tool for water and wastewater system disruptions.

  13. Tool Wear Monitoring Using Time Series Analysis

    Science.gov (United States)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach that considers the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the signal estimated from a time series model corresponding to the dynamic model of cutting is introduced as the diagnostic feature. Consequently, it is found that the early tool wear state (i.e., flank wear under 40 µm) can be monitored, and that the optimal tool exchange time and the tool wear state in actual turning machining can be judged from the change in this residual error. Moreover, variations of the surface roughness Pz in the range of 3 to 8 µm can be estimated by monitoring the residual error.
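
    A minimal sketch of the residual-based idea, assuming a vibration signal sampled with a sharp tool is available as a baseline; the AR order and the synthetic signals are illustrative, not the paper's identified model.

```python
# Fit an AR model to a "healthy" vibration signal, then monitor the
# one-step-ahead residual error, which grows as wear changes the dynamics.
import numpy as np

def fit_ar(x, order=8):
    """Least-squares AR coefficients for one-step-ahead prediction."""
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    return np.linalg.lstsq(X, x[order:], rcond=None)[0]

def residual_rms(x, coeffs):
    order = len(coeffs)
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    return float(np.sqrt(np.mean((x[order:] - X @ coeffs) ** 2)))

rng = np.random.default_rng(1)
healthy = np.sin(0.3 * np.arange(4000)) + 0.1 * rng.standard_normal(4000)
coeffs = fit_ar(healthy)
print("baseline residual:", residual_rms(healthy, coeffs))
# A worn tool's signal would be fed to residual_rms() and compared against
# a threshold calibrated on the sharp-tool baseline.
```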

  14. sTools - a data reduction pipeline for the GREGOR Fabry-Pérot Interferometer and the High-resolution Fast Imager at the GREGOR solar telescope

    Science.gov (United States)

    Kuckein, C.; Denker, C.; Verma, M.; Balthasar, H.; González Manrique, S. J.; Louis, R. E.; Diercke, A.

    2017-10-01

    A huge amount of data has been acquired with the GREGOR Fabry-Pérot Interferometer (GFPI), large-format facility cameras, and, since 2016, the High-resolution Fast Imager (HiFI). These data are processed with standardized procedures with the aim of providing science-ready data to the solar physics community. For this purpose, we have developed a user-friendly data reduction pipeline called "sTools", based on the Interactive Data Language (IDL) and licensed under a Creative Commons license. The pipeline delivers reduced and image-reconstructed data with a minimum of user interaction. Furthermore, quick-look data are generated, as well as a webpage with an overview of the observations and their statistics. All the processed data are stored online at the GREGOR GFPI and HiFI data archive of the Leibniz Institute for Astrophysics Potsdam (AIP). The principles of the pipeline are presented together with selected high-resolution spectral scans and images processed with sTools.

  15. Effects of exposure estimation errors on estimated exposure-response relations for PM2.5.

    Science.gov (United States)

    Cox, Louis Anthony Tony

    2018-07-01

    Associations between fine particulate matter (PM2.5) exposure concentrations and a wide variety of undesirable outcomes, from autism and auto theft to elderly mortality, suicide, and violent crime, have been widely reported. Influential articles have argued that reducing National Ambient Air Quality Standards for PM2.5 is desirable to reduce these outcomes. Yet, other studies have found that reducing black smoke and other particulate matter by as much as 70% and dozens of micrograms per cubic meter has not detectably affected all-cause mortality rates even after decades, despite strong, statistically significant positive exposure concentration-response (C-R) associations between them. This paper examines whether this disconnect between association and causation might be explained in part by ignored estimation errors in estimated exposure concentrations. We use EPA air quality monitor data from the Los Angeles area of California to examine the shapes of estimated C-R functions for PM2.5 when the true C-R functions are assumed to be step functions with well-defined response thresholds. The estimated C-R functions mistakenly show risk as smoothly increasing with concentrations even well below the response thresholds, thus incorrectly predicting substantial risk reductions from reductions in concentrations that do not affect health risks. We conclude that ignored estimation errors obscure the shapes of true C-R functions, including possible thresholds, possibly leading to unrealistic predictions of the changes in risk caused by changing exposures. Instead of estimating improvements in public health per unit reduction (e.g., per 10 µg/m 3 decrease) in average PM2.5 concentrations, it may be essential to consider how interventions change the distributions of exposure concentrations. Copyright © 2018 Elsevier Inc. All rights reserved.
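
    The smoothing artefact the author describes is easy to reproduce. The sketch below assumes a step-function true C-R relation and normally distributed exposure estimation errors (all numbers illustrative) and shows apparent risk rising well below the true threshold.

```python
# Monte Carlo: a sharp response threshold in *true* concentration looks like
# a smooth C-R curve when risk is binned by *estimated* concentration.
import numpy as np

rng = np.random.default_rng(42)
true_c = rng.uniform(0, 30, 100_000)             # true PM2.5, ug/m^3
response = (true_c > 12.0).astype(float)         # step C-R: threshold at 12
est_c = true_c + rng.normal(0, 5, true_c.size)   # error-laden estimates

bins = np.arange(0, 31, 2)
idx = np.digitize(est_c, bins)
for b in range(1, len(bins)):
    sel = idx == b
    if sel.any():
        print(f"{bins[b-1]:>2}-{bins[b]:>2} ug/m^3: risk {response[sel].mean():.2f}")
# Apparent risk rises smoothly below 12 ug/m^3 although the true function
# is flat there -- exactly the artefact described above.
```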

  16. EFFECTS OF REMITTANCES ON POVERTY REDUCTION: THE CASE OF INDONESIA

    Directory of Open Access Journals (Sweden)

    Faiza Husnayeni Nahar

    2017-09-01

    Full Text Available Remittances have been reported as a tool for fighting poverty in selected countries, such as Indonesia. An increase in income through remittances tends to improve the economic status of the migrant's household. Once migrants earn a higher salary, they remit money to their household in Indonesia via formal institutions, such as banks. The migrant's household can then fulfil its basic needs and use the remittance for educational investment and productive activities. The educational investment serves the migrants' children or grandchildren, benefiting future generations of the family by giving them the chance of a more prosperous life. The poverty rate would thus be reduced gradually, and economic welfare achieved. The main objectives of this paper are, first, to estimate the effects of remittances on poverty in Indonesia from 1983 to 2015 and, second, to propose several strategic policies related to remittances and poverty reduction. Other variables considered include inflation, exchange rates, income, income inequality and the labor force participation rate. An Ordinary Least Squares (OLS) method was used to obtain the econometric estimates. The study found that an increase in remittances led to a reduction in poverty of 2.56%. Inflation and the exchange rate have positive and negative effects on poverty, respectively. The small effect of remittances on poverty reduction could possibly be explained by the migrants' low educational background, low-wage jobs, expensive remittance costs, and unfamiliarity with remitting money through formal financial institutions. Hence, to reduce the poverty level, the government needs, first, to facilitate skills training for the workers so that they can get better jobs and earn more; second, to lower the transaction costs of remittances; and lastly, to provide agents of Indonesian banks overseas to offer better facilities to Indonesian migrant workers.

  17. Coronal pulp biomarker: A lesser known age estimation modality

    Directory of Open Access Journals (Sweden)

    Smrithi D Veera

    2014-01-01

    Full Text Available Introduction: The evolving state-of-the-art digital technology currently available is opening new avenues in forensic odontology for age estimation methods, which are subject to debate in terms of accuracy and precision. A study was carried out to analyze the efficacy and practical application of age estimation using digital panoramic radiographs in a South Indian population. Aims and Objectives: 1. To study the reduction of the coronal pulp chamber using the Tooth Coronal Index (TCI) on panoramic radiographs and correlate it with chronological age. 2. To establish the accuracy of digital panoramic radiographs as a simple, non-invasive tool. Materials and Methods: The study illustrates the potential value of a little-known aging method. The study groups comprised a total of 100 subjects of both sexes, in the age range of 20 to 60 years, who were subjected to panoramic radiography using a digital panoramic machine. The TCI was calibrated using AGFA computer software for accuracy and precision. In the present study, a population of known age was examined: the correlation between the reduction of the coronal pulp cavity and chronological age was investigated, and TCI was computed for each tooth and regressed on real age. Statistical Analysis Used: The Pearson correlation coefficient was used to assess the significance of the relationship between age and TCI. Regression analysis was used to predict age from TCI for the premolar and molar. Inaccuracy and bias were determined to assess the precision of the prediction equations. Results and Conclusion: The predictive potential of TCI declines for ages above 50 years but is comfortably good below 50 years, without much difference between premolars and molars. This study demonstrates the potential value of TCI for age estimation.
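
    The index itself is commonly computed as TCI = (coronal pulp cavity height / crown height) × 100, with age then read from a fitted regression. The sketch below uses hypothetical regression coefficients, not the study's fitted values.

```python
# TCI computation and a hypothetical age-from-TCI regression.
def tooth_coronal_index(pulp_height_mm: float, crown_height_mm: float) -> float:
    return pulp_height_mm / crown_height_mm * 100.0

def estimate_age(tci: float, a: float = 70.0, b: float = -1.2) -> float:
    """Hypothetical regression age = a + b * TCI (TCI falls with age)."""
    return a + b * tci

tci = tooth_coronal_index(3.1, 7.8)
print(f"TCI = {tci:.1f}, estimated age ~ {estimate_age(tci):.0f} years")
```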

  18. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  19. Dose Reduction and Dose Management in Computed Tomography - State of the Art.

    Science.gov (United States)

    Zinsser, Dominik; Marcus, Roy; Othman, Ahmed E; Bamberg, Fabian; Nikolaou, Konstantin; Flohr, Thomas; Notohamiprodjo, Mike

    2018-03-13

    For years, the number of CT examinations performed has been rising, while computed tomography has at the same time become more dose-efficient. The aim of this article is to give an overview of the state of the art in dose reduction in CT and to highlight currently available tools in dose management. By performing a literature search on PubMed regarding dose reduction in CT, relevant articles were identified and analyzed. Technical innovations with individual adaptation of tube current and voltage, as well as iterative image reconstruction, enable considerable dose reduction with preserved image quality. At the same time, dedicated software tools are able to handle huge amounts of data and allow existing examination protocols to be optimized. Key points: · CT examinations are increasingly performed and contribute considerably to non-natural radiation exposure. · A correct indication is crucial for each CT examination. · The examination protocol has to be tailored to the medical question and the patient. · Multiple technical innovations enable considerable dose reduction with constant image quality. · Dose management with dedicated software tools is gaining importance. Citation: Zinsser D, Marcus R, Othman AE et al. Dose reduction and dose management in computed tomography - State of the art. Fortschr Röntgenstr 2018; DOI: 10.1055/s-0044-101261. © Georg Thieme Verlag KG Stuttgart · New York.

  20. A novel tool for user-friendly estimation of natural, diagnostic and professional radiation risk: Radio-Risk software

    International Nuclear Information System (INIS)

    Carpeggiani, Clara; Paterni, Marco; Caramella, Davide; Vano, Eliseo; Semelka, Richard C.; Picano, Eugenio

    2012-01-01

    Background: Awareness of radiological risk is low among doctors and patients. An educational/decision tool that considers each patient’s cumulative lifetime radiation exposure would facilitate provider–patient communication. Aim: The purpose of this work was to develop user-friendly software for simple estimation and communication of radiological risk to patients and doctors as a part of the SUIT-Heart (Stop Useless Imaging Testing in Heart disease) Project of the Tuscany Region. Methods: We developed a novel software program (PC platform, Windows OS, fully downloadable at http://suit-heart.ifc.cnr.it) considering reference dose estimates from American Heart Association Radiological Imaging 2009 guidelines and UK Royal College of Radiology 2007 guidelines. Age- and gender-weighted cancer risk was derived from the Biological Effects of Ionising Radiation VII Committee, 2006. Results: With simple input functions (demographics, age, gender) the user selects from a predetermined menu variables relating to natural (e.g., airplane flights and geo-tracked background exposure), professional (e.g., cath lab workers) and medical (e.g., CT, cardiac scintigraphy, coronary stenting) sources. The program provides a simple numeric (cumulative effective dose in millisievert, mSv, and equivalent number of chest X-rays) and graphic (cumulative temporal trends of exposure, cancer cases out of 100 exposed persons) display. Conclusions: A simple software program allows straightforward estimation of cumulative dose (in multiples of chest X-rays) and risk (in extra % lifetime cancer risk), with simple numbers quantifying lifetime extra cancer risk. Pictorial display of radiation risk may be valuable for increasing radiological awareness in cardiologists.

  1. A novel tool for user-friendly estimation of natural, diagnostic and professional radiation risk: Radio-Risk software

    Energy Technology Data Exchange (ETDEWEB)

    Carpeggiani, Clara; Paterni, Marco [CNR, Institute of Clinical Physiology (Italy); Caramella, Davide [Radiology Department, Pisa University, Pisa (Italy); Vano, Eliseo [San Carlos Hospital, Radiology Department, Complutense University, Madrid (Spain); Semelka, Richard C. [University of North Carolina, Chapel Hill, NC (United States); Picano, Eugenio, E-mail: picano@ifc.cnr.it [CNR, Institute of Clinical Physiology (Italy)

    2012-11-15

    Background: Awareness of radiological risk is low among doctors and patients. An educational/decision tool that considers each patient's cumulative lifetime radiation exposure would facilitate provider-patient communication. Aim: The purpose of this work was to develop user-friendly software for simple estimation and communication of radiological risk to patients and doctors as a part of the SUIT-Heart (Stop Useless Imaging Testing in Heart disease) Project of the Tuscany Region. Methods: We developed a novel software program (PC platform, Windows OS, fully downloadable at http://suit-heart.ifc.cnr.it) considering reference dose estimates from American Heart Association Radiological Imaging 2009 guidelines and UK Royal College of Radiology 2007 guidelines. Age- and gender-weighted cancer risk was derived from the Biological Effects of Ionising Radiation VII Committee, 2006. Results: With simple input functions (demographics, age, gender) the user selects from a predetermined menu variables relating to natural (e.g., airplane flights and geo-tracked background exposure), professional (e.g., cath lab workers) and medical (e.g., CT, cardiac scintigraphy, coronary stenting) sources. The program provides a simple numeric (cumulative effective dose in millisievert, mSv, and equivalent number of chest X-rays) and graphic (cumulative temporal trends of exposure, cancer cases out of 100 exposed persons) display. Conclusions: A simple software program allows straightforward estimation of cumulative dose (in multiples of chest X-rays) and risk (in extra % lifetime cancer risk), with simple numbers quantifying lifetime extra cancer risk. Pictorial display of radiation risk may be valuable for increasing radiological awareness in cardiologists.
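
    The bookkeeping underlying such a tool is simple to sketch: sum reference effective doses over a person's exposure history, express the total in chest X-ray equivalents, and convert it to extra lifetime cancer risk with a linear no-threshold coefficient. The reference doses and the roughly 5%-per-sievert risk coefficient below are round illustrative numbers (BEIR VII order of magnitude), not values taken from the Radio-Risk database.

        # Illustrative reference effective doses in mSv (assumed values).
        EXAM_DOSE_MSV = {"chest_xray": 0.02, "cardiac_ct": 12.0,
                         "cardiac_scintigraphy": 9.0, "coronary_stenting": 15.0}
        CHEST_XRAY_MSV = 0.02
        RISK_PER_SV = 0.05   # ~5% extra lifetime cancer risk per sievert (LNT assumption)

        def cumulative_exposure(history):
            """history: list of exam names -> (mSv, chest X-ray equivalents, extra risk %)."""
            msv = sum(EXAM_DOSE_MSV[e] for e in history)
            return msv, msv / CHEST_XRAY_MSV, 100.0 * RISK_PER_SV * msv / 1000.0

        msv, cxr_eq, risk_pct = cumulative_exposure(
            ["cardiac_ct", "cardiac_scintigraphy", "coronary_stenting"])
        print(f"{msv:.1f} mSv = {cxr_eq:.0f} chest X-rays, ~{risk_pct:.2f}% extra lifetime risk")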

  2. Decommissioning Cost Estimating - The "PRICE" Approach

    International Nuclear Information System (INIS)

    Manning, R.; Gilmour, J.

    2002-01-01

    Over the past 9 years UKAEA has developed a formalized approach to decommissioning cost estimating. The estimating methodology and computer-based application are known collectively as the PRICE system. At the heart of the system is a database (the knowledge base) which holds resource demand data on a comprehensive range of decommissioning activities. This data is used in conjunction with project-specific information (the quantities of specific components) to produce decommissioning cost estimates. PRICE is a dynamic cost-estimating tool, which can satisfy both strategic planning and project management needs. With a relatively limited analysis a basic PRICE estimate can be produced and used for the purposes of strategic planning. This same estimate can be enhanced and improved, primarily by adding detail, to support sanction expenditure proposals, and also as a tender assessment and project management tool. The paper will: describe the principles of the PRICE estimating system; report on the experiences of applying the system to a wide range of projects, from contaminated car parks to nuclear reactors; and provide information on the performance of the system in relation to historic estimates, tender bids, and outturn costs.
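
    At its core the approach multiplies a knowledge base of unit resource demands by project-specific quantities. A minimal sketch of that bookkeeping with hypothetical activities and unit rates follows (the real PRICE knowledge base is far richer):

        # Hypothetical knowledge base: person-hours and cost per unit of work.
        KNOWLEDGE_BASE = {
            "glovebox_removal":  {"person_hours": 120.0, "cost_per_unit": 18_000.0},
            "pipework_m":        {"person_hours":   2.5, "cost_per_unit":    250.0},
            "concrete_decon_m2": {"person_hours":   1.2, "cost_per_unit":     90.0},
        }

        def estimate(quantities):
            """quantities: {activity: amount} -> (total person-hours, total cost)."""
            hours = sum(KNOWLEDGE_BASE[a]["person_hours"] * q for a, q in quantities.items())
            cost = sum(KNOWLEDGE_BASE[a]["cost_per_unit"] * q for a, q in quantities.items())
            return hours, cost

        hours, cost = estimate({"glovebox_removal": 4, "pipework_m": 300,
                                "concrete_decon_m2": 150})
        print(f"{hours:.0f} person-hours, {cost:,.0f} currency units")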

  3. The Uncertainty estimation of Alanine/ESR dosimetry

    International Nuclear Information System (INIS)

    Kim, Bo Rum; An, Jin Hee; Choi, Hoon; Kim, Young Ki

    2008-01-01

    Machinery, tools, cables, etc. in a nuclear power plant operate in a very severe environment. Measurements of the actual dose are needed to extend the life expectancy of this machinery, tooling and cabling. Therefore, we estimated the gamma-ray dose at Wolsong nuclear power division 1 over three years using a dose estimation technique. The dose estimation technique was based on ESR (Electron Spin Resonance) dosimetry using regression analysis. We estimated the uncertainty in order to secure the reliability of the results; the uncertainty estimate makes it possible to judge the reliability of the measurement results. The estimation of uncertainty followed the internationally unified guide, the GUM (Guide to the Expression of Uncertainty in Measurement), published by the International Organization for Standardization (ISO) in 1993. In this study the uncertainties of e-scan and EMX, which are ESR instruments, were evaluated and compared. Based on these results, the reliability of the measurements can be improved.
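
    The GUM procedure referred to combines the individual standard uncertainty components in quadrature and scales the result by a coverage factor. A generic sketch follows; the component values are invented, not those of the e-scan/EMX comparison.

        import math

        # Hypothetical relative standard-uncertainty components of an ESR dose estimate (%).
        components = {"calibration": 2.1, "repeatability": 1.4, "regression_fit": 1.8}

        u_combined = math.sqrt(sum(u**2 for u in components.values()))  # quadrature sum
        k = 2.0                                          # coverage factor, ~95 % level
        U_expanded = k * u_combined
        print(f"u_c = {u_combined:.2f} %, U (k=2) = {U_expanded:.2f} %")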

  4. Assessing the Effect of Potential Reductions in Non-Hepatic Mortality on the Estimated Cost-Effectiveness of Hepatitis C Treatment in Early Stages of Liver Disease

    Science.gov (United States)

    Chesson, Harrell W.; Spradling, Philip R.; Holmberg, Scott D.

    2018-01-01

    Background Most cost-effectiveness analyses of hepatitis C (HCV) therapy focus on the benefits of reducing liver-related morbidity and mortality. Objectives Our objective was to assess how cost-effectiveness estimates of HCV therapy can vary depending on assumptions regarding the potential impact of HCV therapy on non-hepatic mortality. Methods We adapted a state-transition model to include potential effects of HCV therapy on non-hepatic mortality. We assumed successful treatment could reduce non-hepatic mortality by as little as 0 % to as much as 100 %. Incremental cost-effectiveness ratios were computed comparing immediate treatment versus delayed treatment and comparing immediate treatment versus non-treatment. Results Comparing immediate treatment versus delayed treatment, when we included a 44 % reduction in non-hepatic mortality following successful HCV treatment, the incremental cost per quality-adjusted life year (QALY) gained by HCV treatment fell by 76 % (from US$314,100 to US$76,900) for patients with no fibrosis and by 43 % (from US$62,500 to US$35,800) for patients with moderate fibrosis. Comparing immediate treatment versus non-treatment, assuming a 44 % reduction in non-hepatic mortality following successful HCV treatment, the incremental cost per QALY gained by HCV treatment fell by 64 % (from US$186,700 to US$67,300) for patients with no fibrosis and by 27 % (from US$35,000 to US$25,500) for patients with moderate fibrosis. Conclusion Including reductions in non-hepatic mortality from HCV treatment can have substantial effects on the estimated cost-effectiveness of treatment. PMID:27480538
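
    The incremental cost-effectiveness ratios quoted follow directly from differences in costs and QALYs between strategies. A minimal sketch of the arithmetic with invented totals (not the article's model outputs):

        def icer(cost_a, qaly_a, cost_b, qaly_b):
            """Incremental cost per QALY gained by strategy A over comparator B."""
            return (cost_a - cost_b) / (qaly_a - qaly_b)

        # Hypothetical discounted lifetime totals per patient.
        print(f"${icer(95_000, 14.6, 71_000, 14.2):,.0f} per QALY gained")  # -> $60,000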

  5. Energy Information Augmented Community-Based Energy Reduction

    Directory of Open Access Journals (Sweden)

    Mark Rembert

    2012-06-01

    More than one-half of all U.S. states have instituted energy efficiency mandates requiring utilities to reduce energy use. To achieve these goals, utilities have been permitted rate structures to help them incentivize energy reduction projects. This strategy is proving only modestly successful in stemming energy consumption growth. By the same token, community energy reduction programs have achieved moderate to very significant energy reductions. The research described here offers an important tool to strengthen community energy reduction efforts: providing each building occupant with energy information tailored to their energy use patterns. Most importantly, the information helps each individual energy customer understand their potential for energy savings and which reduction measures are most important to them. This information can be leveraged by the leading community organization to prompt greater action in its community. A number of case studies of this model are shown. Early results are promising.

  6. Predictive Engineering Tools for Injection-Molded Long-Carbon-Thermoplastic Composites: Weight and Cost Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fifield, Leonard S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gandhi, Umesh N. [Toyota Research Inst. North America, Ann Arbor, MI (United States); Mori, Steven [MAGNA Exteriors and Interiors Corporation, Aurora, ON (Canada); Wollan, Eric J. [PlastiComp, Inc., Winona, MN (United States)

    2016-08-01

    This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, in collaboration with Toyota and Magna, PNNL developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and similar parts in steel were then performed for this purpose, and the team then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimated weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis an estimate of the manufacturing cost, including the material cost, for making the equivalent part in steel was determined and compared to the costs for making the LCF/PA66 part to determine the cost per “saved” pound.

  7. Structural correlation method for model reduction and practical estimation of patient specific parameters illustrated on heart rate regulation

    DEFF Research Database (Denmark)

    Ottesen, Johnny T.; Mehlsen, Jesper; Olufsen, Mette

    2014-01-01

    We consider the inverse and patient specific problem of short term (seconds to minutes) heart rate regulation specified by a system of nonlinear ODEs and corresponding data. We show how a recent method termed the structural correlation method (SCM) can be used for model reduction and for obtaining a set of practically identifiable parameters. The structural correlation method includes two steps: sensitivity and correlation analysis. When combined with an optimization step, it is possible to estimate model parameters, enabling the model to fit dynamics observed in data. This method is illustrated in detail on a model predicting baroreflex regulation of heart rate and applied to analysis of data from a rat and healthy humans. Numerous mathematical models have been proposed for prediction of baroreflex regulation of heart rate, yet most of these have been designed to provide qualitative predictions...
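
    The two SCM steps can be outlined generically: build a sensitivity matrix of the model output with respect to each parameter, discard parameters the output is insensitive to, and then drop one parameter of each strongly correlated pair, since such pairs cannot be identified together from the data. The sketch below uses finite-difference sensitivities and pairwise correlation screening under simplifying assumptions; it is not the authors' implementation, and all names and thresholds are placeholders.

        import numpy as np

        def scm_identifiable(model, p0, t, sens_tol=1e-3, corr_tol=0.95, h=1e-4):
            """Sketch of SCM: sensitivity analysis, then correlation screening.

            model(p, t) returns the model output; p0 is the nominal parameter vector.
            Returns indices of a practically identifiable parameter subset.
            """
            y0 = model(p0, t)
            # Finite-difference sensitivity matrix, one column per parameter.
            cols = []
            for i in range(len(p0)):
                dp = np.zeros_like(p0)
                dp[i] = h * p0[i]
                cols.append((model(p0 + dp, t) - y0) / h)
            S = np.column_stack(cols)

            keep = [i for i in range(len(p0)) if np.linalg.norm(S[:, i]) > sens_tol]
            C = np.corrcoef(S[:, keep], rowvar=False)
            for a in range(len(keep)):
                if keep[a] is None:
                    continue
                for b in range(a + 1, len(keep)):
                    # Strongly correlated sensitivities: the pair is not identifiable
                    # together, so drop the second parameter of the pair.
                    if keep[b] is not None and abs(C[a, b]) > corr_tol:
                        keep[b] = None
            return [i for i in keep if i is not None]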

  8. Beyond Self-Report: Tools to Compare Estimated and Real-World Smartphone Use

    Science.gov (United States)

    Andrews, Sally; Ellis, David A.; Shaw, Heather; Piwek, Lukasz

    2015-01-01

    Psychologists typically rely on self-report data when quantifying mobile phone usage, despite little evidence of its validity. In this paper we explore the accuracy of using self-reported estimates when compared with actual smartphone use. We also include source code to process and visualise these data. We compared 23 participants’ actual smartphone use over a two-week period with self-reported estimates and the Mobile Phone Problem Use Scale. Our results indicate that estimated time spent using a smartphone may be an adequate measure of use, unless a greater resolution of data is required. Estimates concerning the number of times an individual used their phone across a typical day did not correlate with actual smartphone use. Neither estimated duration nor number of uses correlated with the Mobile Phone Problem Use Scale. We conclude that estimated smartphone use should be interpreted with caution in psychological research. PMID:26509895

  9. Comparative analysis of old-age mortality estimations in Africa.

    Directory of Open Access Journals (Sweden)

    Eran Bendavid

    Survival to old ages is increasing in many African countries. While demographic tools for estimating mortality up to age 60 have improved greatly, mortality patterns above age 60 rely on models based on little or no demographic data. These estimates are important for social planning and demographic projections. We provide direct estimations of older-age mortality using survey data. Since 2005, nationally representative household surveys in ten sub-Saharan countries record counts of living and recently deceased household members: Burkina Faso, Côte d'Ivoire, Ethiopia, Namibia, Nigeria, Swaziland, Tanzania, Uganda, Zambia, and Zimbabwe. After accounting for age heaping using multiple imputation, we use this information to estimate the probability of death in 5-year intervals (5qx). We then compare our 5qx estimates to those provided by the World Health Organization (WHO) and the United Nations Population Division (UNPD) to estimate the differences in mortality estimates, especially among individuals older than 60 years old. We obtained information on 505,827 individuals (18.4% over age 60, 1.64% deceased). WHO and UNPD mortality models match our estimates closely up to age 60 (mean difference in probability of death -1.1%). However, mortality probabilities above age 60 are lower using our estimations than either WHO or UNPD. The mean difference between our sample and the WHO is 5.9% (95% CI 3.8-7.9%) and between our sample and the UNPD is 13.5% (95% CI 11.6-15.5%). Regardless of the comparator, the difference in mortality estimations rises monotonically above age 60. Mortality estimations above age 60 in ten African countries exhibit large variations depending on the method of estimation. The observed patterns suggest the possibility that survival in some African countries among adults older than age 60 is better than previously thought. Improving the quality and coverage of vital information in developing countries will become increasingly important with
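
    A rough sketch of turning household counts of living and recently deceased members into interval death probabilities is shown below. It ignores the multiple-imputation treatment of age heaping, uses a crude constant-rate assumption within each interval, and all counts are invented.

        # Hypothetical counts per 5-year age interval over a 12-month recall window.
        living = {60: 4210, 65: 3105, 70: 2102, 75: 1240}   # alive at the survey
        deaths = {60:  118, 65:  121, 70:  128, 75:  118}   # died in the past year

        def q5(age):
            """Approximate 5qx: probability of dying within the 5-year interval."""
            m = deaths[age] / (living[age] + 0.5 * deaths[age])  # annual death rate
            return 1.0 - (1.0 - m) ** 5          # constant rate assumed over 5 years

        for a in living:
            print(f"5q{a} = {q5(a):.3f}")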

  10. U.S. Virgin Islands Transportation Petroleum Reduction Plan

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2011-09-01

    This NREL technical report determines a way for USVI to meet its petroleum reduction goal in the transportation sector. It does so first by estimating current petroleum use and key statistics and characteristics of USVI transportation. It then breaks the goal down into subordinate goals and estimates the petroleum impacts of these goals with a wedge analysis. These goals focus on reducing vehicle miles, improving fuel economy, improving traffic flow, using electric vehicles, using biodiesel and renewable diesel, and using 10% ethanol in gasoline. The final section of the report suggests specific projects to achieve the goals, and ranks the projects according to cost, petroleum reduction, time frame, and popularity.
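
    A wedge analysis of this kind subtracts the savings attributed to each measure from a business-as-usual baseline. A toy sketch with invented figures (not the report's estimates):

        baseline_gal = 40_000_000          # hypothetical annual fuel use, gallons
        wedges = {                         # assumed fractional reductions per measure
            "reduce vehicle miles": 0.06,
            "improve fuel economy": 0.08,
            "improve traffic flow": 0.02,
            "electric vehicles":    0.04,
            "bio/renewable diesel": 0.05,
            "E10 ethanol blend":    0.03,
        }

        remaining = baseline_gal
        for measure, frac in wedges.items():
            saved = baseline_gal * frac    # each wedge measured against the baseline
            remaining -= saved
            print(f"{measure:<24} -{saved:,.0f} gal")
        print(f"remaining: {remaining:,.0f} gal ({remaining / baseline_gal:.0%} of baseline)")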

  11. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Sandra Brown; Patrick Gonzalez; Brent Sohngen; Neil Sampson; Mark Anderson; Miguel Calmon; Sean Grimland; Ellen Hawes; Zoe Kant; Dan Morse; Sarah Woodhouse Murdock; Arlene Olivero; Tim Pearson; Sarah Walker; Jon Winsten; Chris Zganjar

    2006-09-30

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st and July 30th, 2006. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  12. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Patrick Gonzalez; Sandra Brown; Jenny Henman; Zoe Kant; Sarah Woodhouse Murdock; Neil Sampson; Gilberto Tiepolo; Tim Pearson; Sarah Walker; Miguel Calmon

    2006-01-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st, 2005 and June 30th, 2005. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  13. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Patrick Gonzalez; Sandra Brown; Jenny Henman; Sarah Woodhouse Murdock; Neil Sampson; Tim Pearson; Sarah Walker; Zoe Kant; Miguel Calmon

    2006-04-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between January 1st and March 31st, 2006. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  14. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Sandra Brown; Patrick Gonzalez; Brent Sohngen; Neil Sampson; Mark Anderson; Miguel Calmon; Sean Grimland; Zoe Kant; Dan Morse; Sarah Woodhouse Murdock; Arlene Olivero; Tim Pearson; Sarah Walker; Jon Winsten; Chris Zganjar

    2007-03-31

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between January 1st and March 31st, 2007. The specific tasks discussed include: Task 1--carbon inventory advancements; Task 2--emerging technologies for remote sensing of terrestrial carbon; Task 3--baseline method development; Task 4--third-party technical advisory panel meetings; Task 5--new project feasibility studies; and Task 6--development of new project software screening tool.

  15. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Patrick Gonzalez; Sandra Brown; Sarah Woodhouse Murdock; Jenny Henman; Zoe Kant; Gilberto Tiepolo; Tim Pearson; Neil Sampson; Miguel Calmon

    2005-10-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st, 2005 and June 30th, 2005. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  16. Low Cost Lithography Tool for High Brightness LED Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Andrew Hawryluk; Emily True

    2012-06-30

    The objective of this activity was to address the need for improved manufacturing tools for LEDs. Improvements include lower cost (both capital equipment cost reductions and cost-of-ownership reductions), better automation and better yields. To meet the DOE objective of $1-2/kilolumen, it will be necessary to develop these highly automated manufacturing tools. Lithography is used extensively in the fabrication of high-brightness LEDs, but the tools used to date are not scalable to high-volume manufacturing. This activity addressed the LED lithography process. During R&D and low-volume manufacturing, most LED companies use contact printers. However, several industries have shown that these printers are incompatible with high-volume manufacturing, and the LED industry needs to evolve to projection steppers. The need for projection lithography tools for LED manufacturing is identified in the Solid State Lighting Manufacturing Roadmap Draft, June 2009. The Roadmap states that projection tools are needed by 2011. This work will modify a stepper, originally designed for semiconductor manufacturing, for use in LED manufacturing. This work addresses improvements to yield, material handling, automation and throughput for LED manufacturing while reducing the capital equipment cost.

  17. Metaproteomics Identifies the Protein Machinery Involved in Metal and Radionuclide Reduction in Subsurface Microbiomes and Elucidates Mechanisms of U(VI) Reduction and Immobilization

    Energy Technology Data Exchange (ETDEWEB)

    Pfiffner, Susan M. [Univ. of Tennessee, Knoxville, TN (United States); Löffler, Frank [Univ. of Tennessee, Knoxville, TN (United States); Ritalahti, Kirsti [Univ. of Tennessee, Knoxville, TN (United States); Sayler, Gary [Univ. of Tennessee, Knoxville, TN (United States); Layton, Alice [Univ. of Tennessee, Knoxville, TN (United States); Hettich, Robert [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-08-31

    The overall goal for this funded project was to develop and exploit environmental metaproteomics tools to identify biomarkers for monitoring microbial activity affecting U speciation at U-contaminated sites, correlate metaproteomics profiles with geochemical parameters and U(VI) reduction activity (or lack thereof), elucidate mechanisms contributing to U(VI) reduction, and provide remediation project managers with additional information to make science-based site management decisions for achieving cleanup goals more efficiently. Although significant progress has been made in elucidating the microbiological contribution to metal and radionuclide reduction, the cellular components, pathway(s), and mechanisms involved in U transformation remain poorly understood. Recent advances in (meta)proteomics technology enable detailed studies of complex samples, including environmental samples, which differ between sites and even show considerable variability within the same site (e.g., the Oak Ridge IFRC site). Additionally, site-specific geochemical conditions affect microbial activity and function, suggesting that generalized assessments and interpretations may not suffice. This research effort integrated current understanding of the microbiology and biochemistry of U(VI) reduction and capitalized on advances in proteomics technology made over the past few years. Field-related analyses used Oak Ridge IFRC groundwater samples from locations where slow-release substrate biostimulation has been implemented to accelerate in situ U(VI) reduction rates. Our overarching hypothesis was that the metabolic signature in environmental samples, as deciphered by the metaproteome measurements, would show a relationship with U(VI) reduction activity. Since metaproteomic and metagenomic characterizations were computationally challenging and time-consuming, we used a tiered approach that combines database mining, controlled laboratory studies, U(VI) reduction activity measurements, phylogenetic

  18. Improvements in Spectrum's fit to program data tool.

    Science.gov (United States)

    Mahiane, Severin G; Marsh, Kimberly; Grantham, Kelsey; Crichlow, Shawna; Caceres, Karen; Stover, John

    2017-04-01

    The Joint United Nations Program on HIV/AIDS-supported Spectrum software package (Glastonbury, Connecticut, USA) is used by most countries worldwide to monitor the HIV epidemic. In Spectrum, HIV incidence trends among adults (aged 15-49 years) are derived by either fitting to seroprevalence surveillance and survey data or generating curves consistent with program and vital registration data, such as historical trends in the number of newly diagnosed infections or people living with HIV and AIDS related deaths. This article describes development and application of the fit to program data (FPD) tool in Joint United Nations Program on HIV/AIDS' 2016 estimates round. In the FPD tool, HIV incidence trends are described as a simple or double logistic function. Function parameters are estimated from historical program data on newly reported HIV cases, people living with HIV or AIDS-related deaths. Inputs can be adjusted for proportions undiagnosed or misclassified deaths. Maximum likelihood estimation or minimum chi-squared distance methods are used to identify the best fitting curve. Asymptotic properties of the estimators from these fits are used to estimate uncertainty. The FPD tool was used to fit incidence for 62 countries in 2016. Maximum likelihood and minimum chi-squared distance methods gave similar results. A double logistic curve adequately described observed trends in all but four countries where a simple logistic curve performed better. Robust HIV-related program and vital registration data are routinely available in many middle-income and high-income countries, whereas HIV seroprevalence surveillance and survey data may be scarce. In these countries, the FPD tool offers a simpler, improved approach to estimating HIV incidence trends.
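
    The fitting step lends itself to a compact illustration: describe incidence as a simple logistic function of time and estimate its parameters by least squares from yearly counts of newly reported cases. The sketch below uses scipy's curve_fit on invented data; the real FPD tool additionally adjusts for undiagnosed fractions and misclassified deaths and derives uncertainty from the asymptotic properties of the estimators.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, a, b, t0):
            """Simple logistic incidence curve: a / (1 + exp(-b * (t - t0)))."""
            return a / (1.0 + np.exp(-b * (t - t0)))

        years = np.arange(2000, 2016)
        cases = np.array([120, 180, 260, 390, 560, 740, 930, 1080, 1190, 1270,
                          1330, 1360, 1390, 1400, 1410, 1415], dtype=float)

        popt, pcov = curve_fit(logistic, years, cases, p0=[1400.0, 0.5, 2005.0])
        perr = np.sqrt(np.diag(pcov))   # asymptotic standard errors -> uncertainty
        print("a, b, t0 =", popt.round(2), "+/-", perr.round(2))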

  19. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

    of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. The ground-shaking maps on a Google map are produced, for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, the automatic discrimination between local earthquakes occurring within the network and regional/teleseismic events occurring outside the network is performed. Finally, for the largest events, if a sufficient number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with event location and stations, as well as a table listing all the events, with the associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded in the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level seismic data analysis, and it may represent a relevant tool not only for seismologists, but also for non
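
    As one concrete step in that chain, once the seismic moment M0 has been obtained from the inversion of the displacement spectra, the moment magnitude follows from the standard Hanks-Kanamori relation. A one-function sketch (the M0 value is illustrative):

        import math

        def moment_magnitude(m0_newton_metre):
            """Mw from seismic moment: Mw = (2/3) * (log10(M0) - 9.1), M0 in N·m."""
            return (2.0 / 3.0) * (math.log10(m0_newton_metre) - 9.1)

        print(f"Mw = {moment_magnitude(3.2e16):.1f}")   # ~4.9 for M0 = 3.2e16 N·m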

  20. Screening the performance of lubricants for ironing of stainless steel with a strip reduction test

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; Bay, Niels; Andersen, Mette Merete

    1997-01-01

    A laboratory strip reduction test simulating the tribological conditions of an ironing process is proposed. The test is capable of simulating varying process conditions such as reduction, drawing speed, tool temperature and sliding length. The test makes it possible to quantify the onset of breakdown of the lubricant film and subsequent galling. Experimental investigations of stainless steel show the influence of varying process conditions and the performance of different lubricants.

  1. Problems in repair-welding of duplex-treated tool steels

    OpenAIRE

    T. Muhič; J. Tušek; M. Pleterski; D. Bombač

    2009-01-01

    The present paper addresses problems in laser welding of die-casting tools used for aluminum pressure die-castings and plastic moulds. To extend the life cycle of tools, various surface improvements are used. These surface improvements significantly reduce the weldability of the material. This paper presents the development of defects in repair welding of duplex-treated tool steel. The procedure is aimed at the reduction of defects by the newly developed repair laser welding techniques. Effects of different repa...

  2. Los Alamos transuranic waste size reduction facility

    International Nuclear Information System (INIS)

    Briesmeister, A.; Harper, J.; Reich, B.; Warren, J.L.

    1982-01-01

    To facilitate disposal of transuranic (TRU) waste, Los Alamos National Laboratory designed and constructed the Size Reduction Facility (SRF) during the period 1977 to 1981. This report summarizes the engineering development, installation, and early test operations of the SRF. The facility incorporates a large stainless steel enclosure fitted with remote handling and cutting equipment to obtain an estimated 4:1 volume reduction of gloveboxes and other bulky metallic wastes

  3. Los Alamos transuranic waste size reduction facility

    International Nuclear Information System (INIS)

    Briesmeister, A.; Harper, J.; Reich, B.; Warren, J.L.

    1982-01-01

    A transuranic (TRU) Waste Size Reduction Facility (SRF) was designed and constructed at the Los Alamos National Laboratory during the period of 1977 to 1981. This paper summarizes the engineering development, installation, and early test operations of the SRF. The facility incorporates a large stainless steel enclosure fitted with remote handling and cutting equipment to obtain an estimated 4:1 volume reduction of gloveboxes and other bulky metallic wastes

  4. Error reduction techniques for Monte Carlo neutron transport calculations

    International Nuclear Information System (INIS)

    Ju, J.H.W.

    1981-01-01

    Monte Carlo methods have been widely applied to problems in nuclear physics, mathematical reliability, communication theory, and other areas. The work in this thesis is developed mainly with neutron transport applications in mind. For nuclear reactor and many other applications, random walk processes have been used to estimate multi-dimensional integrals and obtain information about the solution of integral equations. When the analysis is statistically based, such calculations are often costly, and the development of efficient estimation techniques plays a critical role in these applications. All of the error reduction techniques developed in this work are applied to model problems. It is found that the nearly optimal parameters selected by the analytic method for use with the GWAN estimator are nearly identical to the parameters selected by the multistage method. Modified path length estimation (based on the path length importance measure) leads to excellent error reduction in all model problems examined. Finally, it should be pointed out that techniques used for neutron transport problems may be transferred easily to other application areas which are based on random walk processes. The transport problems studied in this dissertation provide exceptionally severe tests of the error reduction potential of any sampling procedure. It is therefore expected that the methods of this dissertation will prove useful in many other application areas.
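
    As a generic illustration of the error-reduction idea (not the thesis's GWAN or modified path length estimators), the sketch below applies importance sampling to a one-dimensional integral: drawing samples from a density proportional to the integrand drives the variance of the estimate toward zero.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000

        # Estimate I = integral of exp(-5x) over [0, 1]; exact: (1 - e**-5) / 5.
        x = rng.random(N)
        naive = np.exp(-5 * x)                      # plain uniform sampling

        lam = 5.0                                   # truncated-exponential sampling
        u = rng.random(N)
        y = -np.log(1 - u * (1 - np.exp(-lam))) / lam
        pdf = lam * np.exp(-lam * y) / (1 - np.exp(-lam))
        weighted = np.exp(-5 * y) / pdf             # zero variance: pdf ~ integrand

        print(f"naive:    {naive.mean():.5f} +/- {naive.std(ddof=1) / np.sqrt(N):.5f}")
        print(f"weighted: {weighted.mean():.5f} +/- {weighted.std(ddof=1) / np.sqrt(N):.5f}")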

  5. Development of a Cost Estimation Process for Human Systems Integration Practitioners During the Analysis of Alternatives

    Science.gov (United States)

    2010-12-01

    processes. Novice estimators must often use these complicated cost estimation tools (e.g., ACEIT, SEER-H, SEER-S, PRICE-H, PRICE-S, etc.) until...However, the thesis will leverage the processes embedded in cost estimation tools such as the Automated Cost Estimating Integration Tool (ACEIT) and the

  6. Estimation of lung volume and pressure from electrocardiogram

    KAUST Repository

    Elsayed, Gamal Eldin Fathy Amin

    2011-05-01

    Electrocardiography (ECG) is a tool for measuring the electrical excitation of the heart that is extensively used for diagnosis and monitoring of heart diseases. The ECG signal reflects not only the heart activity but also many other physiological processes. Respiratory activity is a prominent process that affects the ECG signal due to the close proximity of the heart and the lungs and, on the other hand, due to neural regulatory processes. In this paper, several means for the estimation of the respiratory process from the ECG signal are presented. The results show a strong correlation between the voltage difference of the R and S peaks of the ECG and the lung's volume and pressure. Correlation was also found for some features of the vector ECG, which is a two-dimensional graph of two different ECG signals. The potential benefit of the multiparametric evaluation of the ECG signal is a reduction of the number of sensors connected to patients, which will increase the patients' comfort and reduce the costs associated with healthcare. In particular, it is relevant for sleep monitoring, where a reduction of the number of different sensors would facilitate a more natural sleeping environment and hence a higher sensitivity of the diagnosis. © 2011 IEEE.
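
    A sketch of the core signal-processing idea, extracting the beat-by-beat R-S voltage difference as a respiratory surrogate, is shown below on a synthetic, ECG-like spike train whose amplitude is modulated at a breathing rate. The signal model, thresholds and window lengths are all invented; real recordings would need a proper QRS detector.

        import numpy as np
        from scipy.signal import find_peaks

        fs = 250                                   # sampling rate, Hz
        t = np.arange(0, 30, 1 / fs)
        # Synthetic stand-in: spike train (~72 bpm) amplitude-modulated by breathing.
        resp = 1 + 0.2 * np.sin(2 * np.pi * 0.25 * t)           # 15 breaths/min
        ecg = resp * np.abs(np.sin(2 * np.pi * 0.6 * t)) ** 40  # narrow "QRS" spikes

        r_idx, _ = find_peaks(ecg, distance=int(0.4 * fs), height=0.3)
        rs_amplitude = []
        for i in r_idx:
            window = ecg[i:i + int(0.1 * fs)]                   # ~100 ms after the peak
            rs_amplitude.append(ecg[i] - window.min())          # R-to-S voltage difference

        # rs_amplitude tracks the respiratory modulation (an ECG-derived respiration proxy).
        print(len(r_idx), "beats; first 5 R-S amplitudes:", np.round(rs_amplitude[:5], 3))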

  7. Energy-Saving Melting and Revert Reduction Technology (E-SMARRT): Final Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    White, Thornton C [SCRA Applied R&D]

    2014-03-31

    Energy-Saving Melting and Revert Reduction Technology (E-SMARRT) is a balanced portfolio of R&D tasks that address energy-saving opportunities in the metalcasting industry. E-SMARRT was created to: improve important capabilities of castings; reduce the carbon footprint of the foundry industry; develop new job opportunities in manufacturing; and significantly reduce metalcasting process energy consumption. It includes R&D in the areas of: improvements in melting efficiency; innovative casting processes for yield improvement/revert reduction; instrumentation and control improvement; and material properties for casting or tooling design improvement. The energy savings and process improvements developed under E-SMARRT have been made possible through the unique collaborative structure of the E-SMARRT partnership. The E-SMARRT team consisted of DOE’s Office of Industrial Technology, the three leading metalcasting technical associations in the U.S: the American Foundry Society; the North American Die Casting Association; and the Steel Founders’ Society of America; and SCRA Applied R&D, doing business as the Advanced Technology Institute (ATI), a recognized leader in distributed technology management. This team provided collaborative leadership to a complex industry composed of approximately 2,000 companies, 80% of which employ less than 100 people, and only 4% of which employ more than 250 people. Without collaboration, these new processes and technologies that enable energy efficiencies and environment-friendly improvements would have been slow to develop and had trouble obtaining a broad application. The E-SMARRT R&D tasks featured low-threshold energy efficiency improvements that are attractive to the domestic industry because they do not require major capital investment. The results of this portfolio of projects are significantly reducing metalcasting process energy consumption while improving the important capabilities of metalcastings. Through June

  8. Numerical tools to estimate the flux of a gas across the air–water interface and assess the heterogeneity of its forcing functions

    Directory of Open Access Journals (Sweden)

    V. M. N. C. S. Vieira

    2013-03-01

    A numerical tool was developed for the estimation of gas fluxes across the air–water interface. The primary objective is to use it to estimate CO2 fluxes. Nevertheless, application to other gases is easily accomplished by changing the values of the parameters related to the physical properties of the gases. User-friendly software was developed that allows a custom-made gas flux model with the preferred parameterizations to be built upon a standard kernel. These include single or double layer models; several numerical schemes for the effects of wind on the air-side and water-side transfer velocities; the effects of atmospheric stability, surface roughness and turbulence from current drag with the bottom; and the effects on solubility of water temperature, salinity, air temperature and pressure. An analysis was also developed which decomposes the difference between the fluxes in a reference situation and in alternative situations into its several forcing functions. This analysis relies on the Taylor expansion of the gas flux model, requiring the numerical estimation of partial derivatives by a multivariate version of the collocation polynomial. Both the flux model and the difference decomposition analysis were tested with data taken from surveys done in the lagoon system of Ria Formosa, south Portugal, in which the CO2 fluxes were estimated using the infrared gas analyzer (IRGA) and floating chamber method, whereas the CO2 concentrations were estimated using the IRGA and degasification chamber. Observations and estimations show a remarkable fit.
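
    At the heart of any such kernel is a bulk flux formula: flux = transfer velocity × solubility × partial-pressure difference. A stripped-down sketch using one common wind-speed parameterization of the transfer velocity (a Wanninkhof-1992-type scheme, one of several such choices); the solubility constant is a rough assumed value, not the software's:

        def co2_flux(u10_ms, sst_c, pco2_water_uatm, pco2_air_uatm):
            """Air-water CO2 flux, mol m-2 d-1 (positive = outgassing). Illustrative."""
            # Schmidt number polynomial for CO2 in seawater (Wanninkhof 1992).
            sc = 2073.1 - 125.62 * sst_c + 3.6276 * sst_c**2 - 0.043219 * sst_c**3
            k_cm_h = 0.31 * u10_ms**2 * (sc / 660.0) ** -0.5   # transfer velocity
            k_m_d = k_cm_h * 24.0 / 100.0
            k0 = 3.4e-5   # CO2 solubility, mol m-3 uatm-1 (~20 degC; assumed constant)
            return k_m_d * k0 * (pco2_water_uatm - pco2_air_uatm)

        # Supersaturated lagoon water on a moderately windy day (invented values).
        print(f"{co2_flux(7.0, 20.0, 460.0, 400.0):.4f} mol m-2 d-1")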

  9. Spica MRI after closed reduction for developmental dysplasia of the hip

    Energy Technology Data Exchange (ETDEWEB)

    Desai, Aditi A. [Vanderbilt University School of Medicine, Nashville, TN (United States); Martus, Jeffrey E.; Schoenecker, Jon [Vanderbilt University School of Medicine, Department of Orthopaedics and Rehabilitation, Monroe Carroll Jr. Children's Hospital at Vanderbilt, Nashville, TN (United States); Kan, J.H. [Vanderbilt University School of Medicine, Department of Radiology and Radiological Sciences, Monroe Carroll Jr. Children's Hospital at Vanderbilt, Nashville, TN (United States)

    2011-04-15

    Spica MRI is a fast and effective tool to assess morphology after closed reduction for developmental dysplasia of the hip (DDH) without the need for sedation. The multiplanar capabilities allow depiction of coronal and axial reduction of the hips. Due to MRI's inherent ability to delineate soft tissue structures, both intrinsic and extrinsic obstacles to failed reduction may be identified. Technical and interpretative challenges of spica MRI are discussed. (orig.)

  10. Progress report on the development of remotely operated tools

    International Nuclear Information System (INIS)

    Shenton, J.R.

    1985-02-01

    Various tools will be required during the size reduction of contaminated plant and equipment, necessitating the removal and replacement of tool modules on the remotely operated pantograph arm, and it is envisaged that there will be a carrier holding a range of tool modules which may be selected for use. This report covers the trials work carried out to date using the single module tool change station, which was manufactured in order to assess the problems likely to occur when disconnecting the existing interchangeable modules from the end effector. (author)

  11. Changes in Effective Thermal Conductivity During the Carbothermic Reduction of Magnetite Using Graphite

    Science.gov (United States)

    Kiamehr, Saeed; Ahmed, Hesham; Viswanathan, Nurni; Seetharaman, Seshadri

    2017-06-01

    Knowledge of the effective thermal diffusivity changes of systems undergoing reactions in which heat transfer plays an important role in the reaction kinetics is essential for process understanding and control. The carbothermic reduction process of magnetite-containing composites is a typical example of such systems. The reduction process in this case is highly endothermic and hence the overall rate of the reaction is greatly influenced by the heat transfer through the composite compact. Using the laser-flash method, the change of effective thermal diffusivity of a magnetite-graphite composite pellet was monitored in dynamic mode over a pre-defined thermal cycle (heating at the rate of 7 K/min to 1423 K (1150 °C), holding the sample for 270 minutes at this temperature and then cooling it down to room temperature at the same rate as heating). These measurements were supplemented by thermogravimetric analysis under comparable experimental conditions, as well as quenching tests of the samples, in order to combine the impact of various factors such as sample dilatations and changes in apparent density on the progress of the reaction. The present results show that monitoring thermal diffusivity changes during the course of reduction is a very useful tool for a full understanding of the underlying physicochemical phenomena. Finally, an effort is made to estimate the apparent thermal conductivity values based on the measured thermal diffusivity and dilatations.
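
    The final estimation step rests on the standard relation between conductivity and diffusivity, k = alpha * rho * cp, with density corrected for the measured dilatation. A sketch with invented property values:

        def thermal_conductivity(alpha_m2_s, rho_kg_m3, cp_j_kg_k, linear_strain=0.0):
            """k = alpha * rho * cp, density corrected for isotropic dilatation."""
            rho_corrected = rho_kg_m3 / (1.0 + linear_strain) ** 3
            return alpha_m2_s * rho_corrected * cp_j_kg_k

        # Hypothetical values for a partially reduced magnetite-graphite compact.
        print(f"k = {thermal_conductivity(4.0e-7, 3500.0, 850.0, 0.02):.3f} W/(m·K)")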

  12. Sulfate reduction in freshwater peatlands

    Energy Technology Data Exchange (ETDEWEB)

    Oequist, M.

    1996-12-31

    This text consists of two parts: Part A is a literature review on microbial sulfate reduction with emphasis on freshwater peatlands, and part B presents the results from a study of the relative importance of sulfate reduction and methane formation for the anaerobic decomposition in a boreal peatland. The relative importance of sulfate reduction and methane production for the anaerobic decomposition was studied in a small raised bog situated in the boreal zone of southern Sweden. Depth distributions of sulfate reduction and methane production rates were measured in peat sampled from three sites (A, B, and C) forming a minerotrophic-ombrotrophic gradient. SO₄²⁻ concentrations in the three profiles were of equal magnitude and ranged from 50 to 150 μM. In contrast, rates of sulfate reduction were vastly different: Maximum rates in the three profiles were obtained at a depth of ca. 20 cm below the water table. In A it was 8 μM h⁻¹ while in B and C they were 1 and 0.05 μM h⁻¹, respectively. Methane production rates, however, were more uniform across the three nutrient regimes. Maximum rates in A (ca. 1.5 μg d⁻¹ g⁻¹) were found 10 cm below the water table, in B (ca. 1.0 μg d⁻¹ g⁻¹) in the vicinity of the water table, and in C (0.75 μg d⁻¹ g⁻¹) 20 cm below the water table. In all profiles both sulfate reduction and methane production rates were negligible above the water table. The areal estimates of methane production for the profiles were 22.4, 9.0 and 6.4 mmol m⁻² d⁻¹, while the estimates for sulfate reduction were 26.4, 2.5, and 0.1 mmol m⁻² d⁻¹, respectively. The calculated turnover times at the sites were 1.2, 14.2, and 198.7 days, respectively. The study shows that sulfate reducing bacteria are important for the anaerobic degradation in the studied peatland, especially in the minerotrophic sites, while methanogenic bacteria dominate in ombrotrophic sites. Examination

  13. Sulfate reduction in freshwater peatlands

    International Nuclear Information System (INIS)

    Oequist, M.

    1996-01-01

    This text consists of two parts: Part A is a literature review on microbial sulfate reduction with emphasis on freshwater peatlands, and part B presents the results from a study of the relative importance of sulfate reduction and methane formation for the anaerobic decomposition in a boreal peatland. The relative importance of sulfate reduction and methane production for the anaerobic decomposition was studied in a small raised bog situated in the boreal zone of southern Sweden. Depth distributions of sulfate reduction and methane production rates were measured in peat sampled from three sites (A, B, and C) forming a minerotrophic-ombrotrophic gradient. SO₄²⁻ concentrations in the three profiles were of equal magnitude and ranged from 50 to 150 μM. In contrast, rates of sulfate reduction were vastly different: Maximum rates in the three profiles were obtained at a depth of ca. 20 cm below the water table. In A it was 8 μM h⁻¹ while in B and C they were 1 and 0.05 μM h⁻¹, respectively. Methane production rates, however, were more uniform across the three nutrient regimes. Maximum rates in A (ca. 1.5 μg d⁻¹ g⁻¹) were found 10 cm below the water table, in B (ca. 1.0 μg d⁻¹ g⁻¹) in the vicinity of the water table, and in C (0.75 μg d⁻¹ g⁻¹) 20 cm below the water table. In all profiles both sulfate reduction and methane production rates were negligible above the water table. The areal estimates of methane production for the profiles were 22.4, 9.0 and 6.4 mmol m⁻² d⁻¹, while the estimates for sulfate reduction were 26.4, 2.5, and 0.1 mmol m⁻² d⁻¹, respectively. The calculated turnover times at the sites were 1.2, 14.2, and 198.7 days, respectively. The study shows that sulfate reducing bacteria are important for the anaerobic degradation in the studied peatland, especially in the minerotrophic sites, while methanogenic bacteria dominate in ombrotrophic sites. Examination paper. 67 refs, 6 figs, 3 tabs

  14. Statistical tools applied for the reduction of the defect rate of coffee degassing valves

    Directory of Open Access Journals (Sweden)

    Giorgio Olmi

    2015-04-01

    Coffee is a very common beverage exported all over the world: just after roasting, coffee beans are packed in plastic or paper bags, which then experience long transfers with long storage times. Fresh roasted coffee emits large amounts of CO2 for several weeks. This gas must be gradually released to prevent package over-inflation and to preserve aroma; moreover, the beans must be protected from oxygen coming from outside. Therefore, one-way degassing valves are applied to each package: their correct functionality is strictly related to the interference coupling between their bodies and covers and to the correct assembly of the other involved parts. This work takes inspiration from an industrial problem: a company that assembles valve components, supplied by different manufacturers, observed a high defect rate affecting its valve production. An integrated approach, consisting of the adoption of quality charts, an experimental campaign for the dimensional analysis of the mating parts, and statistical processing of the data, was necessary to tackle the question. In particular, a simple statistical tool was made available to predict the defect rate and to identify the best strategy for its reduction. The outcome was that requiring a strict protocol regarding the combinations of parts from different manufacturers for assembly would have been almost ineffective. Conversely, this study led to the identification of the weak point in the manufacturing process of the mating components and to the suggestion of a slight improvement to be performed, with the final result of a significant (one order of magnitude) decrease of the defect rate.
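
    The "simple statistical tool" amounts to propagating the measured dimensional distributions of the mating parts into a predicted fraction of out-of-tolerance interference fits. A sketch under an independence-and-normality assumption (all dimensions below are invented):

        import math

        def defect_rate(mu_body, sd_body, mu_cover, sd_cover, min_interference):
            """P(body - cover < min_interference) for independent normal diameters (mm)."""
            mu = mu_body - mu_cover
            sd = math.hypot(sd_body, sd_cover)
            z = (min_interference - mu) / sd
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

        # Hypothetical batches from two different manufacturers.
        print(f"predicted defect rate: {defect_rate(20.10, 0.015, 20.00, 0.020, 0.06):.2%}")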

  15. Stated Preference Survey Estimating the Willingness to Pay ...

    Science.gov (United States)

    A national stated preference survey designed to elicit household willingness to pay for reductions in impinged and entrained fish at cooling water intake structures, with the aim of improving the estimation of environmental benefits.

  16. GEOMETRICAL CHARACTERIZATION OF MICRO END MILLING TOOLS

    DEFF Research Database (Denmark)

    Borsetto, Francesca; Bariani, Paolo

    The milling process is one of the most common metal removal operations used in industry. This machining process has been well known since the beginning of the last century and has experienced, over the years, many improvements of the basic technology, as concerns tools, machine tools, coolants/lubricants, milling strategies and controls. Moreover, the accuracy of tool geometry directly affects the performance of the milling process, influencing the dimensional tolerances of the machined part, the surface topography, the chip formation, the cutting forces and the tool life. The dimensions of certain geometrical details, as for instance the cutting edge radius, are determined by characteristics of the manufacturing process, tool material, coating etc. While for conventional size end mills the basic tool manufacturing process is well established, the reduction of the size of the tools required...

  17. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Hahmann, Andrea N.; Nielsen, T. S.

    This poster describes the status as of April 2012 of the Public Service Obligation (PSO) funded project PSO 10464, "Integrated Wind Power Planning Tool". The project goal is to integrate a mesoscale numerical weather prediction (NWP) model with a statistical tool in order to better predict short-term power variation from offshore wind farms, as well as to conduct forecast error assessment studies in preparation for later implementation of such a feature in an existing simulation model. The addition of a forecast error estimation feature will further increase the value of this tool, as it...

  18. Cost assessment and ecological effectiveness of nutrient reduction options for mitigating Phaeocystis colony blooms in the Southern North Sea: an integrated modeling approach.

    Science.gov (United States)

    Lancelot, Christiane; Thieu, Vincent; Polard, Audrey; Garnier, Josette; Billen, Gilles; Hecq, Walter; Gypens, Nathalie

    2011-05-01

    Nutrient reduction measures have already been taken by wealthier countries to decrease nutrient loads to coastal waters, in most cases, however, prior to properly assessing their ecological effectiveness and their economic costs. In this paper we describe an original integrated impact assessment methodology to estimate the direct cost and the ecological performance of realistic nutrient reduction options to be applied in the Southern North Sea watershed to decrease eutrophication, visible as Phaeocystis blooms and foam deposits on the beaches. The mathematical tool couples the idealized biogeochemical GIS-based model of the river system (SENEQUE-RIVERSTRAHLER), implemented in the Eastern Channel/Southern North Sea watershed, to the biogeochemical MIRO model describing Phaeocystis blooms in the marine domain. Model simulations explore how nutrient reduction options regarding diffuse and/or point sources in the watershed would affect the spreading of Phaeocystis colonies in the coastal area. The reference and prospective simulations are performed for the year 2000, characterized by mean meteorological conditions, and nutrient reduction scenarios include and compare upgrading of wastewater treatment plants and changes in agricultural practices, including an idealized shift towards organic farming. A direct cost assessment is performed for each realistic nutrient reduction scenario. Further, the reduction obtained for Phaeocystis blooms is assessed by comparison with ecological indicators (bloom magnitude and duration), and the cost of reducing foam events on the beaches is estimated. Uncertainty brought by the added effect of meteorological conditions (rainfall) on coastal eutrophication is discussed. It is concluded that the reduction obtained by implementing realistic environmental measures in the short term is costly and insufficient to restore well-balanced nutrient conditions in the coastal area, while the replacement of conventional agriculture by organic farming...

  19. Aerial Survey as a Tool to Estimate Abundance and Describe Distribution of a Carcharhinid Species, the Lemon Shark, Negaprion brevirostris

    Directory of Open Access Journals (Sweden)

    S. T. Kessel

    2013-01-01

    Aerial survey provides an important tool to assess the abundance of both terrestrial and marine vertebrates. To date, limited work has tested the effectiveness of this technique to estimate the abundance of smaller shark species. In Bimini, Bahamas, the lemon shark (Negaprion brevirostris) shows high site fidelity to a shallow sandy lagoon, providing an ideal test species to determine the effectiveness of localised aerial survey techniques for a Carcharhinid species in shallow subtropical waters. Between September 2007 and September 2008, visual surveys were conducted from light aircraft following defined transects ranging in length between 4.4 and 8.8 km. Count results were corrected for "availability", "perception", and "survey intensity" to provide unbiased abundance estimates. The abundance of lemon sharks was greatest in the central area of the lagoon during high tide, with a change in abundance distribution to the east and western regions of the lagoon with low tide. Mean abundance of sharks was estimated at 49 (±8.6) individuals, and monthly abundance was significantly positively correlated with mean water temperature. The successful implementation of the aerial survey technique highlighted the potential of further employment for shark abundance assessments in shallow coastal marine environments.
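
    The correction logic amounts to dividing the raw count by the product of the three probabilities. A sketch with illustrative values (not those of the study):

    ```python
    # Correct a raw aerial count for availability (shark near enough to the
    # surface to be seen), perception (observers detect a visible shark) and
    # survey intensity (fraction of the lagoon covered by the transects).
    raw_count = 18          # sharks counted on one survey (illustrative)
    availability = 0.70     # assumed
    perception = 0.85       # assumed
    coverage = 0.60         # assumed fraction of habitat surveyed

    abundance = raw_count / (availability * perception * coverage)
    print(f"corrected abundance estimate: {abundance:.0f} sharks")
    ```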

  20. Effect of large weight reductions on measured and estimated kidney function

    DEFF Research Database (Denmark)

    von Scholten, Bernt Johan; Persson, Frederik; Svane, Maria S

    2017-01-01

    ...GFR (creatinine-based equations), whereas measured GFR (mGFR) and cystatin C-based eGFR would be unaffected if adjusted for body surface area. METHODS: Prospective intervention study including 19 patients. All attended a baseline visit before gastric bypass surgery, followed by a visit six months post-surgery. ... mGFR adjusted for body surface area was unchanged. Estimates of GFR based on creatinine overestimate renal function, likely due to changes in muscle mass, whereas cystatin C-based estimates are unaffected. TRIAL REGISTRATION: ClinicalTrials.gov, NCT02138565. Date of registration: March 24, 2014.

  1. Uncertainty quantification of CO2 emission reduction for maritime shipping

    International Nuclear Information System (INIS)

    Yuan, Jun; Ng, Szu Hui; Sou, Weng Sut

    2016-01-01

    The International Maritime Organization (IMO) has recently proposed several operational and technical measures to improve shipping efficiency and reduce greenhouse gas (GHG) emissions. The abatement potentials estimated for these measures have been further used by many organizations to project future GHG emission reductions and plot Marginal Abatement Cost Curves (MACC). However, the abatement potentials estimated for many of these measures can be highly uncertain, as many of the measures are new, with limited sea trial information. Furthermore, the abatements obtained are highly dependent on ocean conditions, trading routes and sailing patterns. When the estimated abatement potentials are used for projections, these ‘input’ uncertainties are often not clearly displayed or accounted for, which can lead to overly optimistic or pessimistic outlooks. In this paper, we propose a methodology to systematically quantify and account for these input uncertainties in the overall abatement potential forecasts. We further propose improvements to MACCs to better reflect the uncertainties in marginal abatement costs and total emissions. This approach provides a fuller and more accurate picture of abatement forecasts and potential reductions achievable, and will be useful to policy makers and decision makers in the shipping industry to better assess cost-effective measures for CO2 emission reduction. - Highlights: • We propose a systematic method to quantify uncertainty in emission reduction. • Marginal abatement cost curves are improved to better reflect the uncertainties. • Percentage reduction probability is given to determine emission reduction target. • The methodology is applied to a case study on maritime shipping.
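
    One way to realize the proposed uncertainty propagation is to sample each measure's abatement potential from a distribution reflecting the sparse sea-trial evidence and accumulate the totals. The measures, distributions and numbers below are illustrative assumptions, not the paper's inputs, and simple additivity of potentials is assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000
    baseline = 1000.0   # fleet emissions, kt CO2/yr (assumed)

    # Abatement potential of each measure as a fraction of baseline emissions,
    # modelled with triangular (min, mode, max) distributions.
    measures = {
        "speed reduction":  (0.05, 0.10, 0.20),
        "hull coating":     (0.01, 0.03, 0.05),
        "weather routeing": (0.00, 0.02, 0.04),
    }
    total_frac = np.zeros(n)
    for lo, mode, hi in measures.values():
        total_frac += rng.triangular(lo, mode, hi, n)   # assumes additivity

    reduction = baseline * total_frac
    print(f"mean reduction: {reduction.mean():.0f} kt CO2/yr")
    print(f"90% interval:   {np.percentile(reduction, [5, 95]).round(0)}")
    # A probability of hitting a target, rather than a single point value:
    print(f"P(reduction >= 120 kt) = {(reduction >= 120).mean():.2f}")
    ```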

  2. Advanced pressure tube sampling tools

    International Nuclear Information System (INIS)

    Wittich, K.C.; King, J.M.

    2002-01-01

    Deuterium concentration is an important parameter that must be assessed to evaluate the fitness for service of CANDU pressure tubes. In-reactor pressure tube sampling allows accurate deuterium concentration assessment to be made without the expenses associated with fuel channel removal. This technology, which AECL has developed over the past fifteen years, has become the standard method for deuterium concentration assessment. AECL is developing a multi-head tool that would reduce in-reactor handling overhead by allowing one tool to sequentially sample at all four axial pressure tube locations before removal from the reactor. Four sets of independent cutting heads, like those on the existing sampling tools, facilitate this, incorporating proven technology demonstrated in over 1400 in-reactor samples taken to date. The multi-head tool is delivered by AECL's Advanced Delivery Machine or other similar delivery machines. Further, AECL has developed an automated sample handling system that receives and processes the tool once out of the reactor. This system retrieves samples from the tool, then dries, weighs and places them in labelled vials, which are then directed into shielded shipping flasks. The multi-head wet sampling tool and the automated sample handling system are based on proven technology and offer continued savings and dose reduction to utilities in a competitive electricity market. (author)

  3. Uncertainty estimation and multi sensor fusion for kinematic laser tracker measurements

    Science.gov (United States)

    Ulrich, Thomas

    2013-08-01

    Laser trackers are widely used to measure kinematic tasks such as tracking robot movements. Common methods to evaluate the uncertainty in the kinematic measurement include approximations specified by the manufacturers, various analytical adjustment methods and the Kalman filter. In this paper a new, real-time technique is proposed, which estimates the 4D-path (3D-position + time) uncertainty of an arbitrary path in space. Here a hybrid system estimator is applied in conjunction with the kinematic measurement model. This method can be applied to processes which include various types of kinematic behaviour: constant velocity, variable acceleration or variable turn rates. The new approach is compared with the Kalman filter and a manufacturer's approximations. The comparison was made using data obtained by tracking an industrial robot's tool centre point with a Leica laser tracker AT901 and a Leica laser tracker LTD500. It shows that the new approach is more appropriate for analysing kinematic processes than the Kalman filter, as it reduces overshoots and decreases the estimated variance. In comparison with the manufacturer's approximations, the new approach takes account of kinematic behaviour with an improved description of the real measurement process and a reduction in estimated variance. This approach is therefore well suited to the analysis of kinematic processes with unknown changes in kinematic behaviour as well as to fusion among laser trackers.
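
    For reference, the conventional baseline the hybrid estimator is compared against can be sketched as a constant-velocity Kalman filter on one coordinate of the tracked point; the noise levels and synthetic trajectory below are illustrative.

    ```python
    import numpy as np

    dt, q, r = 0.01, 1e-3, 0.02**2        # time step [s], process/measurement noise
    F = np.array([[1.0, dt], [0.0, 1.0]]) # state transition for [position, velocity]
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    H = np.array([[1.0, 0.0]])            # only position is measured

    x, P = np.zeros(2), np.eye(2)
    rng = np.random.default_rng(0)
    for k in range(500):
        z = np.sin(0.5 * k * dt) + rng.normal(0.0, np.sqrt(r))  # synthetic measurement
        x = F @ x                          # predict
        P = F @ P @ F.T + Q
        S = (H @ P @ H.T)[0, 0] + r        # innovation variance
        K = (P @ H.T / S).ravel()          # Kalman gain
        x = x + K * (z - x[0])             # update with the innovation
        P = (np.eye(2) - np.outer(K, H[0])) @ P

    print("state estimate:", x.round(4), " position variance:", P[0, 0])
    ```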

  4. The Asset Drivers, Well-being Interaction Matrix (ADWIM: A participatory tool for estimating future impacts on ecosystem services and livelihoods

    Directory of Open Access Journals (Sweden)

    T.D. Skewes

    2016-01-01

    Building an effective response for communities to climate change requires decision-support tools that deliver information which stakeholders find relevant for exploring potential short- and long-term impacts on livelihoods. Established principles suggest that to successfully communicate scientific information, such tools must be transparent, replicable, relevant, credible, flexible, affordable and unbiased. In data-poor contexts typical of developing countries, they should also be able to integrate stakeholders’ knowledge and values, empowering them in the process. We present a participatory tool, the Asset Drivers Well-being Interaction Matrix (ADWIM), which estimates future impacts on ecosystem goods and services (EGS) and communities’ well-being through the cumulative effects of system stressors. ADWIM consists of two modelling steps: an expert-informed, cumulative impact assessment for EGS, which is then integrated with a stakeholder-informed EGS valuation process carried out during adaptation planning workshops. We demonstrate the ADWIM process using examples from Nusa Tenggara Barat Province (NTB) in eastern Indonesia. The semi-quantitative results provide an assessment of the relative impacts on EGS and human well-being under the ‘Business as Usual’ scenario of climate change and human population growth at different scales in NTB, information that is subsequently used for designing adaptation strategies. Based on these experiences, we discuss the relative strengths and weaknesses of ADWIM relative to principles of effective science communication and ecosystem services modelling. ADWIM’s apparent attributes as an analysis, decision-support and communication tool promote its utility for participatory adaptation planning. We also highlight its relevance as a ‘boundary object’ to provide learning and reflection about the current and likely future importance of EGS to livelihoods in NTB.

  5. Tools for Managing Repository Objects

    OpenAIRE

    Banker, Rajiv D.; Isakowitz, Tomas; Kauffman, Robert J.; Kumar, Rachna; Zweig, Dani

    1993-01-01

    Working Paper Series: STERN IS-93-46. The past few years have seen the introduction of repository-based computer aided software engineering (CASE) tools which may finally enable us to develop software which is reliable and affordable. With the new tools come new challenges for management: repository-based CASE changes software development to such an extent that traditional approaches to estimation, performance, and productivity assessment may no longer suffice - if they ever...

  6. Coal-Fired Power Plant Heat Rate Reductions

    Science.gov (United States)

    This report identifies systems and equipment in coal-fired power plants where efficiency improvements can be realized, and provides estimates of the resulting net plant heat rate reductions and costs for implementation.

  7. Improving multisensor estimation of heavy-to-extreme precipitation via conditional bias-penalized optimal estimation

    Science.gov (United States)

    Kim, Beomgeun; Seo, Dong-Jun; Noh, Seong Jin; Prat, Olivier P.; Nelson, Brian R.

    2018-01-01

    A new technique for merging radar precipitation estimates and rain gauge data is developed and evaluated to improve multisensor quantitative precipitation estimation (QPE), in particular, of heavy-to-extreme precipitation. Unlike the conventional cokriging methods which are susceptible to conditional bias (CB), the proposed technique, referred to herein as conditional bias-penalized cokriging (CBPCK), explicitly minimizes Type-II CB for improved quantitative estimation of heavy-to-extreme precipitation. CBPCK is a bivariate version of extended conditional bias-penalized kriging (ECBPK) developed for gauge-only analysis. To evaluate CBPCK, cross validation and visual examination are carried out using multi-year hourly radar and gauge data in the North Central Texas region in which CBPCK is compared with the variant of the ordinary cokriging (OCK) algorithm used operationally in the National Weather Service Multisensor Precipitation Estimator. The results show that CBPCK significantly reduces Type-II CB for estimation of heavy-to-extreme precipitation, and that the margin of improvement over OCK is larger in areas of higher fractional coverage (FC) of precipitation. When FC > 0.9 and hourly gauge precipitation is > 60 mm, the reduction in root mean squared error (RMSE) by CBPCK over radar-only (RO) is about 12 mm while the reduction in RMSE by OCK over RO is about 7 mm. CBPCK may be used in real-time analysis or in reanalysis of multisensor precipitation for which accurate estimation of heavy-to-extreme precipitation is of particular importance.
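
    The distinguishing ingredient is the penalty on Type-II conditional bias added to the usual error-variance criterion. The exact weighting used in CBPCK is not given in this summary, so the δ-weighted form below is only a sketch of the general conditional bias-penalized idea (δ = 0 recovers the ordinary cokriging criterion):

    ```latex
    \hat{Z} \;=\; \arg\min_{\tilde{Z}} \;
      \underbrace{\mathrm{E}\!\left[(\tilde{Z}-Z)^{2}\right]}_{\text{error variance}}
      \;+\; \delta \,
      \underbrace{\mathrm{E}\!\left[\left(\mathrm{E}[\tilde{Z}\mid Z]-Z\right)^{2}\right]}_{\text{Type-II conditional bias}}
    ```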

  8. A generic tool for cost estimating in aircraft design

    NARCIS (Netherlands)

    Castagne, S.; Curran, R.; Rothwell, A.; Price, M.; Benard, E.; Raghunathan, S.

    2008-01-01

    A methodology to estimate the cost implications of design decisions by integrating cost as a design parameter at an early design stage is presented. The model is developed on a hierarchical basis, the manufacturing cost of aircraft fuselage panels being analysed in this paper. The manufacturing cost

  9. Net Pay Estimator | Alaska Division of Retirement and Benefits

    Science.gov (United States)

    The net pay estimator is a useful tool to estimate your net pay under different salaries, federal withholding tax exemptions, and...

  10. Trajectory-Based Operations (TBO) Cost Estimation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The Innovation Laboratory, Inc., proposes to build a tool to estimate airline costs under TBO. This tool includes a cost model that explicitly reasons about traffic...

  11. Model reduction of parametrized systems

    CERN Document Server

    Ohlberger, Mario; Patera, Anthony; Rozza, Gianluigi; Urban, Karsten

    2017-01-01

    The special volume offers a global guide to new concepts and approaches concerning the following topics: reduced basis methods, proper orthogonal decomposition, proper generalized decomposition, approximation theory related to model reduction, learning theory and compressed sensing, stochastic and high-dimensional problems, system-theoretic methods, nonlinear model reduction, reduction of coupled problems/multiphysics, optimization and optimal control, state estimation and control, reduced order models and domain decomposition methods, Krylov-subspace and interpolatory methods, and applications to real industrial and complex problems. The book represents the state of the art in the development of reduced order methods. It contains contributions from internationally respected experts, guaranteeing a wide range of expertise and topics. Further, it reflects an important effort, carried out over the last 12 years, to build a growing research community in this field. Though not a textbook, some of the chapters ca...

  12. Estimated effects on radiation doses from alternatives in a spent fuel transportation system

    International Nuclear Information System (INIS)

    Schneider, K.J.; Ross, W.A.; Smith, R.I.

    1988-07-01

    This paper contains the results of a study of estimated radiation doses to the public and workers from the transport of spent fuel from commercial nuclear power reactors to a geologic repository. A postulated reference rail/legal-weight truck transportation system is defined that would use current transportation technology, and provide a breakdown of activities and time/distance/dose-rate estimates for each activity within the system. Collective doses are estimated for each of the major activities at the reactor site, in transit, and at the repository receiving facility. Annual individual doses to the maximally exposed individuals or groups of individuals are also estimated. The dose-reduction potentials and costs are estimated for a total of 17 conceptual alternatives and subalternatives to the postulated reference system. Most of the alternatives evaluated are estimated to provide both cost and dose reductions. The major conclusion is that the potential exists for significant future reductions in radiation doses to the public and workers and for reductions in costs compared to those based on a continuation of past practices in the US

  14. Model Checking Markov Chains: Techniques and Tools

    NARCIS (Netherlands)

    Zapreev, I.S.

    2008-01-01

    This dissertation deals with four important aspects of model checking Markov chains: the development of efficient model-checking tools, the improvement of model-checking algorithms, the efficiency of the state-space reduction techniques, and the development of simulation-based model-checking

  15. Development of a customised design flood estimation tool to ...

    African Journals Online (AJOL)

    The estimation of design flood events, i.e., floods characterised by a specific magnitude-frequency relationship, at a particular site in a specific region is necessary for the planning, design and operation of hydraulic structures. Both the occurrence and frequency of flood events, along with the uncertainty involved in the ...

  16. Generating Sub-nanometer Displacement Using Reduction Mechanism Consisting of Torsional Leaf Spring Hinges

    Directory of Open Access Journals (Sweden)

    Fukuda Makoto

    2014-02-01

    Recent demands on measurement resolution for precise positioning reach tens of picometers. Some distinguished research has been performed to measure displacement on the picometer order; however, few of these methods can be verified as available tools in industry. This is not only because picometer displacement is not yet required for industrial use, but also due to the lack of standard tools to verify such precise displacement. We proposed a displacement reduction mechanism for generating precise displacement using torsional leaf spring hinges (TLSHs) that consist of four leaf springs arranged radially. It has been demonstrated that a prototype of the reduction mechanism was able to provide one-nanometer displacement with a 1/1000 reduction rate driven by a piezoelectric actuator. In order to clarify the potential of the reduction mechanism, a displacement reduction table that can be mounted on an AFM stage was newly developed using TLSHs. This paper describes the design of the reduction mechanism and the sub-nanometer displacement performance of the table, obtained from its dynamic and static characteristics measured by displacement sensors and from the AFM images.

  17. Estimating 3D Object Parameters from 2D Grey-Level Images

    NARCIS (Netherlands)

    Houkes, Z.

    2000-01-01

    This thesis describes a general framework for parameter estimation, which is suitable for computer vision applications. The approach described combines 3D modelling, animation and estimation tools to determine parameters of objects in a scene from 2D grey-level images. The animation tool predicts

  18. A holistic approach to age estimation in refugee children.

    Science.gov (United States)

    Sypek, Scott A; Benson, Jill; Spanner, Kate A; Williams, Jan L

    2016-06-01

    Many refugee children arriving in Australia have an inaccurately documented date of birth (DOB). A medical assessment of a child's age is often requested when there is a concern that their documented DOB is incorrect. This study's aim was to assess the accuracy of a holistic age assessment tool (AAT) in estimating the age of refugee children newly settled in Australia. A holistic AAT that combines medical and non-medical approaches was used to estimate the ages of 60 refugee children with a known DOB. The tool used four components to assess age: an oral narrative, a developmental assessment, anthropometric measures and a pubertal assessment. Assessors were blinded to the true age of the child. Correlation coefficients for the actual and estimated age were calculated for the tool overall and for individual components. The correlation coefficient between the actual and estimated age from the AAT was very strong at 0.9802 (boys 0.9748, girls 0.9876). The oral narrative component of the tool performed best (R = 0.9603). Overall, 86.7% of age estimates were within 1 year of the true age. The range of differences was -1.43 to 3.92 years, with a standard deviation of 0.77 years (9.24 months). The AAT is a holistic, simple and safe instrument that can be used to estimate age in refugee children, with results comparable with radiological methods currently used. © 2016 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).
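
    The two headline statistics are straightforward to reproduce; the ages below are invented for illustration, not the study's data.

    ```python
    import numpy as np

    true_age  = np.array([6.2, 8.0, 9.5, 11.1, 12.4, 14.0, 15.3])  # known DOB
    estimated = np.array([6.0, 8.4, 9.1, 11.9, 12.2, 13.6, 15.8])  # AAT estimates

    r = np.corrcoef(true_age, estimated)[0, 1]            # Pearson correlation
    within_1yr = np.mean(np.abs(estimated - true_age) <= 1.0)
    print(f"r = {r:.4f}, estimates within 1 year: {within_1yr:.1%}")
    ```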

  19. Estimating incidence of problem drug use using the Horwitz-Thompson estimator - A new approach applied to people who inject drugs in Oslo 1985-2008.

    Science.gov (United States)

    Amundsen, Ellen J; Bretteville-Jensen, Anne L; Kraus, Ludwig

    2016-01-01

    The trend in the number of new problem drug users per year (incidence) is the most important measure for studying the diffusion of problem drug use. Due to sparse data sources and complicated statistical models, estimation of the incidence of problem drug use is challenging. The aim of this study is to widen the palette of available methods and data types for estimating the incidence of problem drug use over time, and for identifying the trends. This study presents a new method of incidence estimation, applied to people who inject drugs (PWID) in Oslo. The method took into account the transitions between different phases of drug use progression: active use, temporary cessation, and permanent cessation. The Horwitz-Thompson estimator was applied. Data included 16 cross-sectional samples of problem drug users who reported their onset of injecting drug use. We explored the variation in results for selected probable scenarios of parameter variation for disease progression, as well as the stability of the results based on fewer years of cross-sectional samples. The method yielded incidence estimates of problem drug use over time. When applied to people in Oslo who inject drugs, we found a significant reduction in incidence of 63% from 1985 to 2008. This downward trend was also present when the estimates were based on fewer surveys (five) and in the results of sensitivity analysis for likely scenarios of disease progression. This new method, which incorporates temporarily inactive problem drug users, may become a useful tool for estimating the incidence of problem drug use over time. The method may be less data intensive than other methods based on first entry to treatment and may be generalized to other groups of substance users. Further studies on drug use progression would improve the validity of the results. Copyright © 2015 Elsevier B.V. All rights reserved.
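
    The estimator step can be sketched as follows: each sampled injector contributes 1/pi to the estimated number of people whose injecting began in their onset year, where pi is that person's probability of appearing in the pooled samples (in the paper, a function of survey coverage and of the active/temporary-cessation/permanent-cessation progression). The pi values below are placeholders.

    ```python
    from collections import Counter

    sample = [  # (onset year, inclusion probability pi) -- illustrative only
        (1995, 0.20), (1995, 0.25),
        (2004, 0.10), (2004, 0.12), (2004, 0.08),
    ]
    incidence = Counter()
    for onset_year, pi in sample:
        incidence[onset_year] += 1.0 / pi   # Horwitz-Thompson weight

    for year, n_hat in sorted(incidence.items()):
        print(year, f"estimated new injectors: {n_hat:.1f}")
    ```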

  20. Predicting tool life in turning operations using neural networks and image processing

    Science.gov (United States)

    Mikołajczyk, T.; Nowicki, K.; Bustillo, A.; Yu Pimenov, D.

    2018-05-01

    A two-step method is presented for the automatic prediction of tool life in turning operations. First, experimental data are collected for three cutting edges under the same constant processing conditions. In these experiments, the tool wear parameter, VB, is measured with conventional methods, and the same parameter is estimated using Neural Wear, a customized software package that combines flank wear image recognition and Artificial Neural Networks (ANNs). Second, an ANN model of tool life is trained with the data collected from the first two cutting edges, and the resulting model is evaluated on two different subsets for the third cutting edge: the first subset is obtained from the direct measurement of tool wear and the second is obtained from the Neural Wear software that estimates tool wear using edge images. Although the completely automated solution, Neural Wear software for tool wear recognition plus the ANN model of tool life prediction, presented a slightly higher error than the direct measurements, it was within the same range and can meet all industrial requirements. These results confirm that the combination of image recognition software and ANN modelling could potentially be developed into a useful industrial tool for low-cost estimation of tool life in turning operations.
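
    The second step, an ANN mapping cutting conditions to wear, can be sketched with a small regressor. The data below are synthetic; in the study, the inputs also include wear values estimated from edge images by the Neural Wear software.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    t_train = np.linspace(1.0, 30.0, 40).reshape(-1, 1)        # cutting time, min
    vb_train = (0.01 * t_train.ravel() + 0.05 * np.sqrt(t_train.ravel())
                + rng.normal(0.0, 0.005, 40))                  # flank wear VB, mm

    model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
    model.fit(t_train, vb_train)

    print("predicted VB [mm]:", model.predict([[12.0], [25.0]]).round(3))
    # Tool life follows as the time at which predicted VB reaches the wear
    # criterion (e.g. 0.3 mm).
    ```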

  1. SBAT: A Tool for Estimating Metal Bioaccessibility in Soils

    Energy Technology Data Exchange (ETDEWEB)

    Heuscher, S.A.

    2004-04-21

    Heavy metals such as chromium and arsenic are widespread in the environment due to their usage in many industrial processes. These metals may pose significant health risks to humans, especially children, due to their mutagenic and carcinogenic properties. Typically, the health risks associated with the ingestion of soil-bound metals are estimated by assuming that the metals are completely absorbed through the human intestinal tract (100% bioavailable). This assumption potentially overestimates the risk since soils are known to strongly sequester metals thereby potentially lowering their bioavailability. Beginning in 2000, researchers at Oak Ridge National Laboratory, with funding from the Strategic Environmental Research and Development Program (SERDP), studied the effect of soil properties on the bioaccessibility of soil-bound arsenic and chromium. Representative A and upper-B horizons from seven major U.S. soil orders were obtained from the U.S. Department of Agriculture's National Resources Conservation Service and the U.S. Department of Energy's Oak Ridge Reservation. The soils were spiked with known concentrations of arsenic (As(III) and As(V)) and chromium (Cr(III) and Cr(VI)), and the bioaccessibility was measured using a physiologically based extraction test that mimics the gastric activity of children. Linear regression models were then developed to relate the bioaccessibility measurements to the soil properties (Yang et al. 2002; Stewart et al. 2003a). Important results from these publications and other studies include: (1) Cr(VI) and As(III) are more toxic and bioavailable than Cr(III) and As(V) respectively. (2) Several favorable processes can occur in soils that promote the oxidation of As(III) to As(V) and the reduction of Cr(VI) to Cr(III), thereby lowering bioaccessibility. Iron and manganese oxides are capable of oxidizing As(III) to As(V), whereas organic matter and Fe(II)-bearing minerals are capable of reducing Cr(VI) to Cr(III). (3
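
    The regression step can be sketched with ordinary least squares; the predictor columns and all numbers below are hypothetical, whereas the published models use the properties actually measured for the seven soil orders.

    ```python
    import numpy as np

    props = np.array([          # pH, Fe-oxide (g/kg), organic C (%) -- invented
        [5.1, 12.0, 0.8],
        [6.3,  4.5, 2.1],
        [7.2,  8.0, 1.0],
        [4.8, 15.5, 0.5],
        [6.9,  6.2, 3.0],
    ])
    bioacc = np.array([42.0, 61.0, 55.0, 35.0, 66.0])   # % bioaccessible, invented

    X = np.column_stack([np.ones(len(props)), props])   # add intercept column
    coef, *_ = np.linalg.lstsq(X, bioacc, rcond=None)
    print("intercept and slopes:", coef.round(2))
    print("fitted values:       ", (X @ coef).round(1))
    ```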

  2. Gender and poverty reduction strategy processes in Latin America

    OpenAIRE

    Dijkstra, Geske

    2007-01-01

    In 1999, countries that wished to qualify for the Enhanced Initiative for the Heavily Indebted Poor Countries (HIPC initiative) had to elaborate Poverty Reduction Strategy Papers (PRSP) and had to do so with participation of civil society. Since then, the elaboration and subsequent implementation of PRSs (Poverty Reduction Strategies) have been seen as a tool for the international donor community to guarantee that not only debt relief, but also aid in general, would be spent well. T...

  3. Twitter as a Potential Disaster Risk Reduction Tool. Part II: Descriptive Analysis of Identified Twitter Activity during the 2013 Hattiesburg F4 Tornado.

    Science.gov (United States)

    Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M; Subbarao, Italo

    2015-06-29

    This article describes a novel triangulation methodological approach for identifying the Twitter activity of regional active Twitter users during the 2013 Hattiesburg EF-4 Tornado. A data extraction and geographically centered filtration approach was utilized to generate Twitter data for 48 hours pre- and post-tornado. The data were further validated using a six-sigma approach utilizing GPS data. The regional analysis revealed a total of 81,441 tweets, 10,646 Twitter users, 27,309 retweets and 2,637 tweets with GPS coordinates. Twitter tweet activity increased 5-fold during the response to the Hattiesburg Tornado. Retweeting activity increased 2.2-fold. Tweets with a hashtag increased 1.4-fold. Twitter was an effective disaster risk reduction tool for the 2013 Hattiesburg EF-4 Tornado.

  4. Power fluctuation reduction methodology for the grid-connected renewable power systems

    Science.gov (United States)

    Aula, Fadhil T.; Lee, Samuel C.

    2013-04-01

    This paper presents a new methodology for reducing the influence of power fluctuations from renewable power systems. Renewable energy, being an uncertain and uncontrollable resource, can only provide irregular electrical power to the power grid. This irregularity creates fluctuations in the power generated by renewable power systems. These fluctuations cause instability in the power system and influence the operation of conventional power plants. Overall, the power system is vulnerable to collapse if necessary actions are not taken to reduce the impact of these fluctuations. This methodology aims at reducing these fluctuations and making the generated power capable of covering the power consumption. This requires a prediction tool for estimating the generated power in advance, to provide the range and the time of occurrence of the fluctuations. Since most renewable energies are weather based, a weather forecast technique is used for predicting the generated power. The reduction of the fluctuations also requires stabilizing facilities to maintain the output power at a desired level. In this study, a wind farm and a photovoltaic array are used as renewable power systems, and pumped storage and batteries as stabilizing facilities, since these are best suited to compensating the fluctuations of these types of power suppliers. As an illustrative example, a model of wind and photovoltaic power systems with battery energy and pumped hydro storage facilities for power fluctuation reduction is included, and its power fluctuation reduction is verified through simulation.
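
    The stabilizing role of the battery can be sketched as follows: the plant commits to a smoothed output (here a one-hour moving average) and the battery absorbs or supplies the difference within its energy limits. Sizes, noise levels and the smoothing window are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    # Renewable output at 10-min resolution over 24 h: trend plus fluctuations (MW)
    p_renew = 50 + 10 * np.sin(np.linspace(0, 6, 144)) + rng.normal(0, 4, 144)

    window = 6                                  # 6 x 10 min = 1 h moving average
    target = np.convolve(p_renew, np.ones(window) / window, mode="same")

    cap, soc, dt = 20.0, 10.0, 1 / 6            # battery MWh, initial MWh, step h
    p_grid = np.empty_like(p_renew)
    for k, (p, t) in enumerate(zip(p_renew, target)):
        # charge > 0 stores surplus, charge < 0 covers a deficit; both are
        # limited by the remaining headroom / stored energy of the battery
        charge = np.clip(p - t, -soc / dt, (cap - soc) / dt)
        soc += charge * dt
        p_grid[k] = p - charge

    print(f"output std before: {p_renew.std():.2f} MW, after: {p_grid.std():.2f} MW")
    ```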

  5. Experience Curves: A Tool for Energy Policy Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Neij, Lena; Helby, Peter [Lund Univ. (Sweden). Environmental and Energy Systems Studies; Dannemand Andersen, Per; Morthorst, Poul Erik [Riso National Laboratory, Roskilde (Denmark); Durstewitz, Michael; Hoppe-Kilpper, Martin [Inst. fuer Solare Energieversorgungstechnik e.V., Kassel (DE); and others

    2003-07-01

    The objective of the project, Experience curves: a tool for energy policy assessment (EXTOOL), was to analyse the experience curve as a tool for the assessment of energy policy measures. This is of special interest, since the use of experience curves for the assessment of energy policy measures requires the development of the established experience curve methodology. This development raises several questions, which have been addressed and analysed in this project. The analysis is based on case studies of wind power, an area with considerable experience in technology development, deployment and policy measures. Therefore, a case study based on wind power provides a good opportunity to study the usefulness of experience curves as a tool for the assessment of energy policy measures. However, the results are discussed in terms of using experience curves for the assessment of any energy technology. The project shows that experience curves can be used to assess the effect of combined policy measures in terms of cost reductions. Moreover, the results of the project show that experience curves could be used to analyse international 'learning systems', i.e. cost reductions brought about by the development of wind power and policy measures used in other countries. Nevertheless, the use of experience curves for the assessment of policy programmes has several limitations. First, the analysis and assessment of policy programmes cannot be achieved unless relevant experience curves based on good data can be developed. The authors are of the opinion that only studies that provide evidence of the validity, reliability and relevance of experience curves should be taken into account in policy making. Second, experience curves provide an aggregated picture of the situation, and more detailed analysis of various sources of cost reduction, and of cost reductions resulting from individual policy measures, requires additional data and analysis tools. Third, we do not recommend the use of
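
    The underlying relationship is the standard experience curve: unit cost falls by a fixed progress ratio PR for every doubling of cumulative production, C(x) = C0 * x^(-b) with b = -log2(PR). A worked sketch with illustrative numbers:

    ```python
    from math import log2

    c0, pr = 1000.0, 0.85     # cost at x = 1 and progress ratio (15% per doubling)
    b = -log2(pr)             # experience parameter

    def unit_cost(x):
        return c0 * x ** (-b)

    for x in (1, 2, 4, 8, 16):
        print(f"cumulative volume {x:>2}: unit cost {unit_cost(x):7.1f}")
    ```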

  6. Estimation of snow albedo reduction by light absorbing impurities using Monte Carlo radiative transfer model

    Science.gov (United States)

    Sengupta, D.; Gao, L.; Wilcox, E. M.; Beres, N. D.; Moosmüller, H.; Khlystov, A.

    2017-12-01

    Radiative forcing and climate change greatly depend on the earth's surface albedo and its temporal and spatial variation. The surface albedo varies greatly depending on the surface characteristics, ranging from 5-10% for calm ocean waters to 80% for some snow-covered areas. Clean and fresh snow surfaces have the highest albedo and are most sensitive to contamination with light-absorbing impurities that can greatly reduce surface albedo and change overall radiative forcing estimates. Accurate estimation of snow albedo, as well as understanding of feedbacks on climate from changes in snow-covered areas, is important for radiative forcing, snow energy balance, and predicting seasonal snowmelt and runoff rates. Such information is essential to inform timely decision making by stakeholders and policy makers. Light-absorbing particles deposited onto the snow surface can greatly alter snow albedo and have been identified as a major contributor to regional climate forcing where seasonal snow cover is involved. However, the uncertainty associated with quantification of albedo reduction by these light-absorbing particles is high. Here, we use Mie theory (under the assumption of spherical snow grains) to reconstruct the single scattering parameters of snow (i.e., single scattering albedo ῶ and asymmetry parameter g) from observation-based size distribution information and retrieved refractive index values. The single scattering parameters of impurities are extracted with the same approach from datasets obtained during laboratory combustion of biomass samples. Instead of using plane-parallel approximation methods to account for multiple scattering, we have used the simple "Monte Carlo ray/photon tracing approach" to calculate the snow albedo. This simple approach considers multiple scattering to be the "collection" of single scattering events. Using this approach, we vary the effective snow grain size and impurity concentrations to explore the evolution of snow albedo over a wide
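
    The photon-tracing idea can be sketched for a semi-infinite homogeneous snowpack: each photon survives a scattering event with probability equal to the single scattering albedo and is deflected with a Henyey-Greenstein phase function of asymmetry g; the albedo is the fraction of photons that escape upward. The parameter values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def hg_cos(g):
        """Sample a scattering-angle cosine from the Henyey-Greenstein phase function."""
        xi = rng.random()
        if abs(g) < 1e-6:
            return 2.0 * xi - 1.0
        return (1 + g**2 - ((1 - g**2) / (1 + g - 2 * g * xi))**2) / (2 * g)

    def mc_albedo(omega, g, n_photons=5_000):
        """Albedo of a semi-infinite layer by Monte Carlo photon tracing."""
        escaped = 0
        for _ in range(n_photons):
            mu, z = 1.0, 0.0                     # enter straight down; z = optical depth
            while True:
                z += mu * -np.log(rng.random())  # free path to next interaction
                if z < 0.0:                      # crossed the surface going up
                    escaped += 1
                    break
                if rng.random() > omega:         # absorbed with probability 1 - omega
                    break
                ct = hg_cos(g)                   # rotate direction cosine by HG angle
                st = np.sqrt(max(0.0, 1.0 - ct * ct))
                phi = 2 * np.pi * rng.random()
                mu = mu * ct + np.sqrt(max(0.0, 1 - mu * mu)) * st * np.cos(phi)
        return escaped / n_photons

    print("clean snow :", mc_albedo(omega=0.9999, g=0.89))
    print("sooty snow :", mc_albedo(omega=0.9900, g=0.89))
    ```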

  7. Fuzzy Relational Databases: Representational Issues and Reduction Using Similarity Measures.

    Science.gov (United States)

    Prade, Henri; Testemale, Claudette

    1987-01-01

    Compares and expands upon two approaches to dealing with fuzzy relational databases. The proposed similarity measure is based on a fuzzy Hausdorff distance and estimates the mismatch between two possibility distributions using a reduction process. The consequences of the reduction process on query evaluation are studied. (Author/EM)

  8. Automated data reduction workflows for astronomy. The ESO Reflex environment

    Science.gov (United States)

    Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; García-Dabló, C. E.; Moehler, S.; Neeser, M. J.

    2013-11-01

    Context. Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. Aims: The efficiency of data reduction can be improved by using automatic workflows to organise data and execute a sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection of and interaction with the data. Methods: The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Results: Automated workflows can greatly increase the efficiency of astronomical data reduction. In Reflex, workflows can be run non-interactively as a first step. Subsequent optimization can then be carried out while transparently re-using all unchanged intermediate products. We found that such workflows enable the reduction of complex data by non-expert users and minimize mistakes due to book-keeping errors. Conclusions: Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow

  9. The Acquisition Cost-Estimating Workforce. Census and Characteristics

    Science.gov (United States)

    2009-01-01

    [Abbreviation list and training-source table omitted; only fragments survive.] Respondents reported receiving cost-estimating training from sources including AFIT, ACEIT (Automated Cost Estimating Integrated Tools) courses, or the contracting agency that employed them; the remaining 29 percent reported having received no training.

  10. Rainfall estimation by inverting SMOS soil moisture estimates: A comparison of different methods over Australia

    Science.gov (United States)

    Brocca, Luca; Pellarin, Thierry; Crow, Wade T.; Ciabatta, Luca; Massari, Christian; Ryu, Dongryeol; Su, Chun-Hsu; Rüdiger, Christoph; Kerr, Yann

    2016-10-01

    Remote sensing of soil moisture has reached a level of maturity and accuracy for which the retrieved products can be used to improve hydrological and meteorological applications. In this study, the soil moisture product from the Soil Moisture and Ocean Salinity (SMOS) satellite is used for improving satellite rainfall estimates obtained from the Tropical Rainfall Measuring Mission multisatellite precipitation analysis product (TMPA) using three different "bottom up" techniques: SM2RAIN, the Soil Moisture Analysis Rainfall Tool, and the Antecedent Precipitation Index Modification. The implementation of these techniques aims at improving the well-known "top down" rainfall estimate derived from TMPA products (version 7) available in near real time. Ground observations provided by the Australian Water Availability Project are considered as a separate validation data set. The three algorithms are calibrated against the gauge-corrected TMPA reanalysis product, 3B42, and used for adjusting the TMPA real-time product, 3B42RT, using SMOS soil moisture data. The study area covers the entire Australian continent, and the analysis period ranges from January 2010 to November 2013. Results show that all the SMOS-based rainfall products improve the performance of 3B42RT, even at the daily time scale (unlike previous investigations). The major improvements are obtained in terms of estimation of accumulated rainfall, with a reduction of the root-mean-square error of more than 25%. Also, in terms of temporal dynamics (correlation) and rainfall detection (categorical scores), the SMOS-based products provide slightly better results with respect to 3B42RT, even though the relative performance between the methods is not always the same. The strengths and weaknesses of each algorithm and the spatial variability of their performances are identified in order to indicate the ways forward for this promising research activity. Results show that the integration of bottom up and top down approaches
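
    The SM2RAIN member of the "bottom up" family inverts the soil-water balance: rainfall is estimated from the soil moisture increase plus a drainage term, p(t) ≈ Z ds/dt + a s^b. A minimal sketch with invented parameter values (in practice these are calibrated against the gauge-corrected 3B42 product, as in the paper):

    ```python
    import numpy as np

    def sm2rain(s, dt, Z, a, b):
        """Rainfall from relative soil moisture s(t) in [0, 1] (SM2RAIN-style)."""
        ds_dt = np.gradient(s, dt)
        p = Z * ds_dt + a * s**b        # storage change + drainage; ET neglected
        return np.clip(p, 0.0, None)    # drying periods imply no rain

    t = np.arange(0.0, 48.0, 1.0)                     # hours
    s = 0.3 + 0.4 * np.exp(-0.15 * np.abs(t - 12.0))  # toy wetting event at hour 12
    rain = sm2rain(s, dt=1.0, Z=80.0, a=5.0, b=3.0)   # parameters: assumed
    print("event total:", rain.sum().round(1), "mm")
    ```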

  11. Analysis of Beamformer Directed Single-Channel Noise Reduction System for Hearing Aid Applications

    DEFF Research Database (Denmark)

    Jensen, Jesper; Pedersen, Michael Syskind

    2015-01-01

    We study multi-microphone noise reduction systems consisting of a beamformer and a single-channel (SC) noise reduction stage. In particular, we present and analyse a maximum likelihood (ML) method for jointly estimating the target and noise power spectral densities (psd's) entering the SC filter. We show that the estimators are minimum variance and unbiased, and provide closed-form expressions for their mean-square error (MSE). Furthermore, we show that the MSE of the noise psd estimator is particularly simple: it is independent of target signal characteristics, frequency, and microphone

  12. Joint carbon footprint assessment and data envelopment analysis for the reduction of greenhouse gas emissions in agriculture production.

    Science.gov (United States)

    Rebolledo-Leiva, Ricardo; Angulo-Meza, Lidia; Iriarte, Alfredo; González-Araya, Marcela C

    2017-09-01

    Operations management tools are critical in the process of evaluating and implementing action towards low-carbon production. Currently, sustainable production implies both efficient resource use and the obligation to meet targets for reducing greenhouse gas (GHG) emissions. The carbon footprint (CF) tool allows estimating the overall amount of GHG emissions associated with a product or activity throughout its life cycle. In this paper, we propose a four-step method for the joint use of CF assessment and Data Envelopment Analysis (DEA). Following the eco-efficiency definition, which is the delivery of goods using fewer resources and with decreasing environmental impact, we use an output-oriented DEA model to maximize production and reduce CF, taking into account the economic and ecological perspectives simultaneously. In a further step, we establish targets for the contributing CF factors in order to achieve the CF reduction. The proposed method was applied to assess the eco-efficiency of five organic blueberry orchards throughout three growing seasons. The results show that this method is a practical tool for determining eco-efficiency and reducing GHG emissions. Copyright © 2017 Elsevier B.V. All rights reserved.
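
    The DEA step can be sketched as the output-oriented CCR model solved as a linear program: maximize the proportional output expansion phi attainable by a nonnegative combination of peer units using no more inputs. Here the carbon footprint is treated as an input to be economized; all data below are invented.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def dea_output_oriented(X, Y, o):
        """Output-oriented CCR score of unit o; phi = 1 means efficient.
        X: (m inputs x n units), Y: (s outputs x n units)."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.zeros(n + 1)
        c[0] = -1.0                             # variables [phi, lambda]; max phi
        A_ub, b_ub = [], []
        for i in range(m):                      # sum_j lam_j x_ij <= x_io
            A_ub.append(np.r_[0.0, X[i]]); b_ub.append(X[i, o])
        for r in range(s):                      # phi y_ro <= sum_j lam_j y_rj
            A_ub.append(np.r_[Y[r, o], -Y[r]]); b_ub.append(0.0)
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(0, None)] * (n + 1), method="highs")
        return res.x[0]

    X = np.array([[10.0, 12.0, 9.0, 14.0, 11.0],   # production cost (invented)
                  [4.0, 5.5, 3.8, 6.0, 4.2]])      # carbon footprint, t CO2-eq
    Y = np.array([[20.0, 22.0, 18.0, 21.0, 23.0]]) # yield, t (invented)
    for o in range(X.shape[1]):
        print(f"orchard {o}: phi = {dea_output_oriented(X, Y, o):.3f}")
    ```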

  13. Results Evaluation in Reduction Rhinoplasty

    Directory of Open Access Journals (Sweden)

    Arima, Lisandra Megumi

    2011-01-01

    Introduction: Evaluation of final results after rhinoplasty is not a topic widely studied from the patient's viewpoint. Objective: Evaluate the satisfaction of patients submitted to reduction rhinoplasty, using the Rhinoplasty Outcomes Evaluation (ROE) questionnaire. Method: Longitudinal, retrospective study of preoperative and postoperative satisfaction. The sample was composed of 28 patients who underwent rhinoplasty and answered the ROE questionnaire. Three variables were obtained: the satisfaction score the patient had with his/her image before the surgery; the satisfaction score with the current appearance; and the difference between the average postoperative and preoperative satisfaction scores. Results: The postoperative score was higher than the preoperative score in all patients. We observed a difference between the postoperative and preoperative averages of 48.3, with scores above 75 considered an excellent outcome (67.9% of patients). Conclusions: The ROE questionnaire is a helpful tool to demonstrate the satisfaction of patients submitted to reduction rhinoplasty. About 92% of the patients who underwent reduction rhinoplasty consider the postoperative result to be good or excellent.

  14. Can genetic estimators provide robust estimates of the effective number of breeders in small populations?

    Directory of Open Access Journals (Sweden)

    Marion Hoehn

    The effective population size (Ne) is proportional to the loss of genetic diversity and the rate of inbreeding, and its accurate estimation is crucial for the monitoring of small populations. Here, we integrate temporal studies of the gecko Oedura reticulata to compare genetic and demographic estimators of Ne. Because geckos have overlapping generations, our goal was to demographically estimate NbI, the inbreeding effective number of breeders, and to calculate the NbI/Na ratio (Na = number of adults) for four populations. Demographically estimated NbI ranged from 1 to 65 individuals. The mean reduction in the effective number of breeders relative to census size (NbI/Na) was 0.1 to 1.1. We identified the variance in reproductive success as the most important variable contributing to the reduction of this ratio. We used four methods to estimate the genetic-based inbreeding effective number of breeders, NbI(gen), and the variance effective population size, NeV(gen), from the genotype data. Two of these methods, a temporal moment-based approach (MBT) and a likelihood-based approach (TM3), require at least two samples in time, while the other two are single-sample estimators: the linkage disequilibrium method with bias correction (LDNe) and the program ONeSAMP. The genetic-based estimates were fairly similar across methods and also similar to the demographic estimates, excluding those estimates in which upper confidence interval boundaries were uninformative. For example, LDNe and ONeSAMP estimates ranged from 14-55 and 24-48 individuals, respectively. However, temporal methods suffered from a large variation in confidence intervals and concerns about the prior information. We conclude that the single-sample estimators are an acceptable short-cut to estimate NbI for species such as geckos and will be of great importance for the monitoring of species in fragmented landscapes.

  15. Specifying residential retrofit packages for 30 % reductions in energy consumption in hot-humid climate zones

    Energy Technology Data Exchange (ETDEWEB)

    Burgett, J.M.; Chini, A.R.; Oppenheim, P. [University of Florida, 573 Rinker Hall, Newell Drive, Gainesville, FL 32611 (United States)

    2013-08-15

    The purpose of this research was to demonstrate the application of energy simulation as an effective tool for specifying cost-effective residential retrofit packages that will reduce energy consumption by 30 %. Single-family homes in the hot-humid climate type of the Southeastern USA were used to demonstrate the application. US census data from both state and federal studies were used to create 12 computer simulation homes representing the most common characteristics of single-family houses specific to this area. Well-recognized energy efficiency measures (EEMs) were simulated to determine their cumulative energy reduction potential. Detailed cost estimates were created for cost-to-benefit analysis. For each of the 12 simulated homes, 4 packages of EEMs were created. The four packages provide home owners with options for reducing their energy use by 30 %, along with the estimated up-front cost and simple payback periods. The simple payback period was used to determine how cost-effective a measure was. The packages are specific to a geographic area to provide a higher degree of confidence in the projected cost and energy savings. The study provides a generic methodology to create similar 30 % energy reduction packages for other locations and a detailed description of a case study to serve as an example. The study also highlights the value that computer simulation models can have in developing energy efficiency packages cost-effectively and specific to the home owner's location and housing type.
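
    Package assembly can be sketched as ranking energy efficiency measures by simple payback and accumulating them until the 30 % target is met; the measure names, savings and costs below are invented, and interactions between measures (which the simulations capture) are ignored here.

    ```python
    eems = [  # (measure, annual kWh saved, installed cost in $) -- invented
        ("duct sealing",      1800,  450),
        ("attic insulation",  3000, 1500),
        ("SEER-16 heat pump", 4200, 4800),
        ("LED lighting",       900,  250),
        ("low-e windows",     1700, 5200),
    ]
    annual_use, price = 18000, 0.13        # kWh/yr and $/kWh, assumed

    package, saved = [], 0
    for name, kwh, cost in sorted(eems, key=lambda e: e[2] / (e[1] * price)):
        if saved >= 0.30 * annual_use:
            break
        package.append(name)
        saved += kwh
        print(f"{name:18s} simple payback {cost / (kwh * price):4.1f} yr")
    print(f"package reaches a {saved / annual_use:.0%} reduction")
    ```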

  16. Effects of gamma irradiation, pH-Reduction and A[sub W]-reduction on the shelf-life of chilled 'tenderloin rolls'

    Energy Technology Data Exchange (ETDEWEB)

    Farkas, J. (Dept. of Refrigeration and Livestock Products Technology, Univ. of Horticulture and Food Industry, Budapest (Hungary)); Andrassy, E. (Dept. of Refrigeration and Livestock Products Technology, Univ. of Horticulture and Food Industry, Budapest (Hungary))

    1993-01-01

    Experimental batches of a refrigerated, vacuum-packaged, ready-to-fry, minced meat product, 'tenderloin rolls', were preserved by combinations of reduction of the pH from 6.1 to 5.6 by ascorbic acid, reduction of the water activity from a_w = 0.975 to 0.962 by sodium lactate, and/or a radiation dose of 2 kGy. Storage of the untreated and irradiated samples at +2 °C for 4 weeks was followed by one-week incubation at +10 °C. Total plate counts and counts of presumptive lactobacilli, Enterobacteriaceae and sulphite-reducing clostridia were estimated at weekly intervals. pH changes during storage were also followed. Comparative estimations of sensory qualities, thiamine contents and TBA values were also performed. The results demonstrated the possibility of significantly extending the shelf-life of the chilled product, without compromising microbiological safety, by the sensorially acceptable radiation dose in combination with slight reductions of the pH and the water activity. (orig.)

  17. Cost Based Value Stream Mapping as a Sustainable Construction Tool for Underground Pipeline Construction Projects

    Directory of Open Access Journals (Sweden)

    Murat Gunduz

    2017-11-01

    This paper deals with the application of Value Stream Mapping (VSM) as a sustainable construction tool on a real construction project involving the installation of underground pipelines. VSM was adapted to reduce the high percentage of non-value-added activities and time wastes during each construction stage, and the paper searched for an effective way to consider the cost of the studied underground pipeline construction. This paper is unique in that it adopts a cost implementation of VSM to improve productivity in underground pipeline projects. The data were observed and collected on site during construction, indicating the cycle time and the value-added and non-value-added portions of each construction stage. The current state was built based on these details. This was an eye-opening exercise and a process management tool serving as a trigger for improvement. After the current-state assessment, a future state is attempted with the Value Stream Mapping tool, balancing the resources using a Line of Balance (LOB) technique. Moreover, a sustainable cost estimation model was developed for the current state and future state to calculate the cost of underground pipeline construction. The result shows a cost reduction of 20.8% between the current and future states. This reflects the importance of cost-based Value Stream Mapping in construction as a sustainable measurement tool. This new tool could be utilized in the construction industry to add sustainability and effective cost management.

  18. An Evaluation Tool for CONUS-Scale Estimates of Components of the Water Balance

    Science.gov (United States)

    Saxe, S.; Hay, L.; Farmer, W. H.; Markstrom, S. L.; Kiang, J. E.

    2016-12-01

    Numerous research groups are independently developing data products to represent various components of the water balance (e.g. runoff, evapotranspiration, recharge, snow water equivalent, soil moisture, and climate) at the scale of the conterminous United States. These data products are derived from a range of sources, including direct measurement, remotely-sensed measurement, and statistical and deterministic model simulations. An evaluation tool is needed to compare these data products and the components of the water balance they contain in order to identify the gaps in the understanding and representation of continental-scale hydrologic processes. An ideal tool will be an objective, universally agreed-upon framework to address questions related to closing the water balance. This type of generic, model-agnostic evaluation tool would facilitate collaboration amongst different hydrologic research groups and improve modeling capabilities with respect to continental-scale water resources. By adopting a comprehensive framework to consider hydrologic modeling in the context of a complete water balance, it is possible to identify weaknesses in process modeling, data product representation and regional hydrologic variation. As part of its National Water Census initiative, the U.S. Geological Survey is facilitating this dialogue by developing prototype evaluation tools.

  19. Stochastic differential equations as a tool to regularize the parameter estimation problem for continuous time dynamical systems given discrete time measurements.

    Science.gov (United States)

    Leander, Jacob; Lundh, Torbjörn; Jirstrand, Mats

    2014-05-01

    In this paper we consider the problem of estimating parameters in ordinary differential equations given discrete-time experimental data. The impact of going from an ordinary to a stochastic differential equation setting is investigated as a tool to overcome the problem of local minima in the objective function. Using two different models, it is demonstrated that by allowing noise in the underlying model itself, the objective functions to be minimized in the parameter estimation procedures are regularized in the sense that the number of local minima is reduced and better convergence is achieved. The advantage of using stochastic differential equations is that the actual states in the model are predicted from data, which allows the prediction to stay close to the data even when the parameters in the model are incorrect. The extended Kalman filter is used as a state estimator, and sensitivity equations are provided to give an accurate calculation of the gradient of the objective function. The method is illustrated using in silico data from the FitzHugh-Nagumo model for excitable media and the Lotka-Volterra predator-prey system. The proposed method performs well on the models considered and is able to regularize the objective function in both models, leading to parameter estimation problems with fewer local minima which can be solved by efficient gradient-based methods. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
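
    The move from ODE to SDE can be illustrated with a minimal simulation sketch (not the authors' implementation): the FitzHugh-Nagumo model is integrated with an Euler-Maruyama scheme, where the diffusion parameter sigma plays the role of the extra model noise the paper exploits; sigma = 0 recovers the ordinary ODE.

    ```python
    import numpy as np

    def fitzhugh_nagumo_em(theta, x0, t, sigma=0.2, seed=0):
        """Euler-Maruyama simulation of a stochastic FitzHugh-Nagumo model.
        theta = (a, b, eps, I); sigma=0 recovers the deterministic ODE."""
        a, b, eps, I = theta
        rng = np.random.default_rng(seed)
        dt = t[1] - t[0]
        x = np.zeros((len(t), 2))
        x[0] = x0
        for k in range(len(t) - 1):
            v, w = x[k]
            drift = np.array([v - v**3 / 3.0 - w + I,
                              eps * (v + a - b * w)])
            noise = sigma * np.sqrt(dt) * rng.standard_normal(2)
            x[k + 1] = x[k] + drift * dt + noise
        return x

    t = np.linspace(0.0, 50.0, 2001)
    traj = fitzhugh_nagumo_em((0.7, 0.8, 0.08, 0.5), np.array([-1.0, 1.0]), t)
    ```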

  20. Measuring the Value of Mortality Risk Reductions in Turkey

    Science.gov (United States)

    Tekeşin, Cem; Ara, Shihomi

    2014-01-01

    The willingness to pay (WTP) for mortality risk reduction from four causes (lung cancer, other types of cancer, respiratory disease, traffic accidents) is estimated using a random parameter logit model with data from a choice experiment in three regions of Turkey. The value of statistical life (VSL) estimated for Afsin-Elbistan, Kutahya-Tavsanli, Ankara and the pooled case is found to be 0.56, 0.35, 0.46 and 0.49 million Purchasing Power Parity (PPP) adjusted 2012 US dollars (USD), respectively. Different types of risk cause different VSL estimates, and we find a lung cancer premium of 213% relative to traffic accidents. The effect of one-year-delayed provision of the risk-reduction service is a reduction of WTP by 482 TL ($318 in PPP adjusted USD) per person on average, and the disutility from the status quo (zero risk reduction) against the alternative is found to be 891 TL ($589 in PPP adjusted USD) per person on average. Senior discounts of VSL are partially determined by status-quo preference, and the amount of the discount decreases once the status-quo bias is removed. The peak VSL is found for the age group 30–39, for which the average VSL is 0.8 million PPP adjusted USD. Turkey's compliance with the European Union (EU) air quality standard would cause welfare gains totalling 373 million PPP adjusted USD for our study areas in terms of a reduced number of premature mortalities. PMID:25000150

  1. Measuring the Value of Mortality Risk Reductions in Turkey

    Directory of Open Access Journals (Sweden)

    Cem Tekeşin

    2014-07-01

    Full Text Available The willingness to pay (WTP) for mortality risk reduction from four causes (lung cancer, other types of cancer, respiratory disease, traffic accidents) is estimated using a random parameter logit model with data from a choice experiment in three regions of Turkey. The value of statistical life (VSL) estimated for Afsin-Elbistan, Kutahya-Tavsanli, Ankara and the pooled case is found to be 0.56, 0.35, 0.46 and 0.49 million Purchasing Power Parity (PPP) adjusted 2012 US dollars (USD), respectively. Different types of risk cause different VSL estimates, and we find a lung cancer premium of 213% relative to traffic accidents. The effect of one-year-delayed provision of the risk-reduction service is a reduction of WTP by 482 TL ($318 in PPP adjusted USD) per person on average, and the disutility from the status quo (zero risk reduction) against the alternative is found to be 891 TL ($589 in PPP adjusted USD) per person on average. Senior discounts of VSL are partially determined by status-quo preference, and the amount of the discount decreases once the status-quo bias is removed. The peak VSL is found for the age group 30–39, for which the average VSL is 0.8 million PPP adjusted USD. Turkey's compliance with the European Union (EU) air quality standard would cause welfare gains totalling 373 million PPP adjusted USD for our study areas in terms of a reduced number of premature mortalities.

  2. PROMAB-GIS: A GIS based Tool for Estimating Runoff and Sediment Yield in running Waters

    Science.gov (United States)

    Jenewein, S.; Rinderer, M.; Ploner, A.; Sönser, T.

    2003-04-01

    In recent times, settlements have expanded and traffic and tourist activities have increased in most alpine regions. As a consequence, humans and goods are affected by natural hazard processes more often, while the demand for protection through both technical constructions and planning measures carried out by public authorities is growing. This situation results in an ever stronger need for reproducibility, comparability and transparency of all methods applied in modern natural hazard management. As a contribution to a new way of coping with this situation, Promab-GIS Version 1.0 has been developed. Promab-GIS has been designed as a model for the time- and space-dependent determination of both runoff and bedload transport in rivers of small alpine catchment areas. The estimation of the unit hydrograph relies upon the "rational formula" and the time-area curves of the watershed. The time-area diagram is a graph of cumulative drainage area contributing to discharge at the watershed outlet within a specified time of travel. The sediment yield is estimated for each cell of the channel network by determining the actual process type (erosion, transport or accumulation). Two types of transport processes are considered: sediment transport and debris flows. All functions of Promab-GIS are integrated into the graphical user interface of ArcView as pull-up menus and tool buttons. Hence the application of Promab-GIS does not rely on sophisticated knowledge of GIS in general, or of the ArcView software in particular. However, despite the computer assistance, Promab-GIS is still an expert support system: in order to obtain plausible results, users must be familiar with all the relevant processes controlling runoff and sediment yield in torrent catchments.
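
    The time-area construction lends itself to a short sketch (an illustration of the general method, not the Promab-GIS code): the outlet hydrograph is the convolution of the rainfall-excess hyetograph with the incremental time-area curve, which is the discrete form of the rational-formula idea.

    ```python
    import numpy as np

    def time_area_hydrograph(rain_excess_mm, incr_area_km2, dt_h=1.0):
        """Discharge at the outlet (m^3/s) from the time-area method:
        convolve rainfall excess with the incremental time-area curve.
        rain_excess_mm: rainfall excess per time step [mm]
        incr_area_km2: area draining to the outlet in each travel-time band [km^2]
        """
        # depth in mm over an area in km^2 equals 1000 m^3 of water
        q = np.convolve(rain_excess_mm, incr_area_km2) * 1000.0 / (dt_h * 3600.0)
        return q

    rain = np.array([2.0, 5.0, 3.0, 0.0])   # mm per hour (hypothetical storm)
    areas = np.array([0.4, 0.9, 0.6, 0.2])  # km^2 per travel-time band (hypothetical)
    print(time_area_hydrograph(rain, areas))
    ```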

  3. The Influence of Tool Composite's Structure During Process of Diamond Grinding of Ceramic Materials

    Directory of Open Access Journals (Sweden)

    Gawlik Józef

    2014-12-01

    Full Text Available This paper presents the results of tests performed during the grinding of ceramic materials - polycrystalline ceramics (zirconia, ZrO2) and mono-crystalline ceramics (sapphire, α-Al2O3) - with diamond tools. The studies have shown that thickening of the tool composite with suitably wetted diamond additives changes the tool's pore structure. Such a modified composite has a positive impact on the tribological properties of the subsurface layer of the machined components, manifested by a reduction of the surface roughness and of the vibration amplitude of the coefficient of friction. The study results confirm the positive effects of using wetted additives in the tool composite during the pressing (briquetting) stage.

  4. Mixing the Green-Ampt model and Curve Number method as an empirical tool for rainfall excess estimation in small ungauged catchments.

    Science.gov (United States)

    Grimaldi, S.; Petroselli, A.; Romano, N.

    2012-04-01

    The Soil Conservation Service - Curve Number (SCS-CN) method is a popular rainfall-runoff model that is widely used to estimate direct runoff from small and ungauged basins. The SCS-CN method is a simple and valuable approach to estimate the total stream-flow volume generated by a storm rainfall, but it was developed to be used with daily rainfall data. To overcome this drawback, we propose to include the Green-Ampt (GA) infiltration model in a mixed procedure, referred to as CN4GA (Curve Number for Green-Ampt), which aims to distribute in time the information provided by the SCS-CN method so as to provide estimates of sub-daily incremental rainfall excess. For a given storm, the computed SCS-CN total net rainfall amount is used to calibrate the soil hydraulic conductivity parameter of the Green-Ampt model. The proposed procedure was evaluated by analyzing 100 rainfall-runoff events observed in four small catchments of varying size. CN4GA appears to be an encouraging tool for predicting the net rainfall peak and duration values and has shown, at least for the test cases considered in this study, better agreement with observed hydrographs than the classic SCS-CN method.
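
    For reference, the SCS-CN retention step that CN4GA distributes in time can be written in a few lines (a generic textbook sketch, not the authors' code); the resulting event total is what the Green-Ampt conductivity parameter is then calibrated to reproduce.

    ```python
    def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
        """Event runoff depth Q [mm] from the SCS-CN method.
        p_mm: storm rainfall depth [mm]; cn: curve number (0-100]."""
        s = 25400.0 / cn - 254.0   # potential maximum retention [mm]
        ia = ia_ratio * s          # initial abstraction [mm]
        if p_mm <= ia:
            return 0.0
        return (p_mm - ia) ** 2 / (p_mm - ia + s)

    print(scs_cn_runoff(60.0, 75))  # e.g. a 60 mm storm on a CN 75 basin
    ```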

  5. Success of commonly used operating room management tools in reducing tardiness of first case of the day starts: evidence from German hospitals.

    Science.gov (United States)

    Ernst, Christian; Szczesny, Andrea; Soderstrom, Naomi; Siegmund, Frank; Schleppers, Alexander

    2012-09-01

    One of the declared objectives of surgical suite management in Germany is to increase operating room (OR) efficiency by reducing tardiness of first case of the day starts. We analyzed whether the introduction of OR management tools by German hospitals in response to increasing economic pressure was successful in achieving this objective. The OR management tools we considered were the appointment of an OR manager and the development and adoption of a surgical suite governance document (OR charter). We hypothesized that tardiness of first case starts was less in ORs that had adopted one or both of these tools. Using representative 2005 survey data from 107 German anesthesiology departments, we used a Tobit model to estimate the effect of the introduction of an OR manager or OR charter on tardiness of first case starts, while controlling for hospital size and surgical suite complexity. Adoption reduced tardiness of first case starts by at least 7 minutes (mean reduction 15 minutes, 95% confidence interval (CI): 7-22 minutes). Reductions in tardiness of first case starts figure prominently among the objectives of surgical suite management in Germany, and our results suggest that the appointment of an OR manager or the adoption of an OR charter supports this objective. For short-term decision making on the day of surgery, this reduction in tardiness may have economic implications, because it reduces overutilized OR time.

  6. Transit boardings estimation and simulation tool (TBEST) calibration for guideway and BRT modes : [summary].

    Science.gov (United States)

    2013-06-01

    As demand for public transportation grows, planning tools used by Florida Department of Transportation (FDOT) and other transit agencies must evolve to effectively predict changing patterns of ridership. A key tool for this purpose is the Transit Boa...

  7. Multi-category micro-milling tool wear monitoring with continuous hidden Markov models

    Science.gov (United States)

    Zhu, Kunpeng; Wong, Yoke San; Hong, Geok Soon

    2009-02-01

    In-process monitoring of tool conditions is important in micro-machining due to the high precision requirement and the high tool wear rate. Tool condition monitoring in micro-machining poses new challenges compared to conventional machining. In this paper, a multi-category classification approach is proposed for tool flank wear state identification in micro-milling. Continuous hidden Markov models (HMMs) are adapted for modeling the tool wear process in micro-milling and for estimating the tool wear state from cutting force features. For a noise-robust approach, the HMM outputs are passed through a median filter to suppress spurious state transitions caused by the high noise level. A detailed study on the selection of HMM structures for tool condition monitoring (TCM) is presented. Case studies on tool state estimation in the micro-milling of pure copper and steel demonstrate the effectiveness and potential of these methods.
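
    A minimal sketch of this kind of pipeline using the third-party hmmlearn library (an assumed stand-in; the paper's own continuous-HMM implementation is not reproduced here): a Gaussian-emission HMM is fitted to cutting-force features, the state sequence is decoded, and a median filter suppresses isolated, noise-induced state flips.

    ```python
    import numpy as np
    from scipy.signal import medfilt
    from hmmlearn.hmm import GaussianHMM  # assumed library: pip install hmmlearn

    rng = np.random.default_rng(1)
    # synthetic 2-D cutting-force features drifting through 3 wear regimes
    X = np.vstack([rng.normal(m, 0.3, size=(200, 2)) for m in (0.0, 1.0, 2.0)])

    model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
    model.fit(X)                    # unsupervised Baum-Welch training
    raw_states = model.predict(X)   # Viterbi-decoded wear states

    # median filtering removes isolated misclassifications in the state track
    smoothed = medfilt(raw_states.astype(float), kernel_size=9).astype(int)
    ```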

  8. Problems in repair-welding of duplex-treated tool steels

    Directory of Open Access Journals (Sweden)

    T. Muhič

    2009-01-01

    Full Text Available The present paper addresses problems in the laser welding of die-casting tools used for aluminum pressure die-castings and plastic moulds. To extend the life cycle of tools, various surface improvements are used; these surface improvements significantly reduce the weldability of the material. This paper presents the development of defects in the repair welding of duplex-treated tool steel. The procedure is aimed at reducing defects by means of newly developed repair laser welding techniques. The effects of different repair welding process parameters and techniques are considered, and a microstructural analysis is conducted to detect defect formation and reveal the best laser welding method for duplex-treated tools.

  9. On distribution reduction and algorithm implementation in inconsistent ordered information systems.

    Science.gov (United States)

    Zhang, Yanqin

    2014-01-01

    As one part of our work on ordered information systems, distribution reduction is studied in inconsistent ordered information systems (OISs). Some important properties of distribution reduction are studied and discussed. The dominance matrix is restated for reduction acquisition in dominance-relation-based information systems. A matrix algorithm for distribution reduction acquisition is presented step by step, and a program is implemented from the algorithm. The approach provides an effective tool for theoretical research on, and practical applications of, ordered information systems. For more detailed and valid illustration, cases are employed to explain and verify the algorithm and the program, which shows the effectiveness of the algorithm in complicated information systems.
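
    As a generic illustration of the dominance-matrix idea (a textbook-style sketch, not the paper's algorithm): object i dominates object j when it is at least as good on every ordered criterion, and the boolean matrix of these relations is the starting point for reduct computation.

    ```python
    import numpy as np

    def dominance_matrix(values):
        """values[i, k]: evaluation of object i on ordered criterion k.
        Returns D with D[i, j] = True iff object i dominates object j
        (>= on every criterion)."""
        v = np.asarray(values)
        # broadcast the pairwise comparison over all criteria at once
        return np.all(v[:, None, :] >= v[None, :, :], axis=2)

    objs = [[3, 2, 2],
            [2, 2, 1],
            [3, 1, 2]]
    print(dominance_matrix(objs).astype(int))
    ```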

  10. Model reduction of nonlinear systems subject to input disturbances

    KAUST Repository

    Ndoye, Ibrahima; Laleg-Kirati, Taous-Meriem

    2017-01-01

    The method of convex optimization is used as a tool for model reduction of a class of nonlinear systems in the presence of disturbances. It is shown that under some conditions the nonlinear disturbed system can be approximated by a reduced order

  11. Policy strategies and paths to promote sustainable energy systems-The dynamic Invert simulation tool

    International Nuclear Information System (INIS)

    Stadler, Michael; Kranzl, Lukas; Huber, Claus; Haas, Reinhard; Tsioliaridou, Elena

    2007-01-01

    The European Union has established a number of targets regarding energy efficiency, Renewable Energy Sources (RES) and CO2 reductions, such as the 'Green Paper on Energy Efficiency', the Directive on the 'promotion of the use of biofuels or other renewable fuels for transport' and the 'Directive of the European Parliament and of the Council on the promotion of cogeneration based on a useful heat demand in the internal energy market'. Many of the corresponding RES and RUE measures are not attractive for investors from an economic point of view. Therefore, governments all over the world have to spend public money to promote these technologies and measures and bring them onto the market. These expenditures have to be adjusted to budget concerns and should be spent most efficiently, which means dedicating the money to the technologies and efficiency measures with the best yield in CO2 reduction. The core question, 'How can public money - for promoting sustainable energy systems - be spent most efficiently to reduce GHG emissions?', has been investigated by the European project Invert. In the course of this project, a simulation tool has been designed to answer this core question. This paper describes the modelling with the Invert simulation tool and shows the key features necessary for simulating the energy system. A definition of 'promotion scheme efficiency' is given, which allows estimating the most cost-effective technologies and/or efficiency measures to reduce CO2 emissions. Investigations performed with the Invert simulation tool deliver an optimum portfolio mix of technologies and efficiency measures for each selected region. Within Invert, seven European regions were simulated; for the Austrian case study, the detailed portfolio mix is shown and political conclusions are derived

  12. Uncertain Emission Reductions from Forest Conservation: REDD in the Bale Mountains, Ethiopia

    Directory of Open Access Journals (Sweden)

    Charlene Watson

    2013-09-01

    Full Text Available The environmental integrity of a mechanism rewarding Reduced Emissions from Deforestation and Degradation (REDD) depends on appropriate accounting for emission reductions. Largely stemming from a lack of forest data in developing countries, emission reductions accounting contains substantial uncertainty as a result of forest carbon stock estimates, where the application of biome-averaged data over large forest areas is commonplace. Using a case study in the Bale Mountains in Ethiopia, we exemplify the implications of primary and secondary forest carbon stock estimates for predicted REDD project emission reductions and revenues. Primary data give an area-weighted mean forest carbon stock of 195 ± 81 tC/ha; biome-averaged data reported by the Intergovernmental Panel on Climate Change underestimate forest carbon stock in the Bale Mountains by as much as 63% in moist forest and 58% in dry forest. Combining forest carbon stock estimates with uncertainty in voluntary carbon market prices demonstrates the financial impact of uncertainty: potential revenues over the 20-year project ranged between US$9 million and US$185 million. Estimated revenues will influence decisions on whether to implement a project and may have profound implications for the level of benefit sharing that can be supported. Strong financial incentives therefore exist to improve forest carbon stock estimates in tropical forests, as well as the environmental integrity of REDD projects.
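
    The financial sensitivity described here reduces to simple arithmetic, sketched below with hypothetical inputs (the paper's project area, deforestation rate and price series are not reproduced): projected revenue is avoided emissions times carbon price, so uncertainty in either factor multiplies through.

    ```python
    # Hypothetical REDD revenue range: avoided emissions x carbon price.
    # All inputs are illustrative placeholders, not the Bale Mountains data.
    area_ha = 50_000                        # forest area protected [ha]
    carbon_tC_ha = (195 - 81, 195 + 81)     # primary-data stock range [tC/ha]
    deforestation_frac = 0.10               # fraction of stock assumed avoided
    price_usd_tCO2 = (2.0, 12.0)            # voluntary-market price range [$/tCO2]

    C_TO_CO2 = 44.0 / 12.0                  # carbon mass -> CO2 mass

    low = area_ha * carbon_tC_ha[0] * deforestation_frac * C_TO_CO2 * price_usd_tCO2[0]
    high = area_ha * carbon_tC_ha[1] * deforestation_frac * C_TO_CO2 * price_usd_tCO2[1]
    print(f"revenue range: ${low / 1e6:.0f}M - ${high / 1e6:.0f}M")
    ```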

  13. Ash reduction system using electrically heated particulate matter filter

    Science.gov (United States)

    Gonze, Eugene V [Pinckney, MI; Paratore, Jr., Michael J; He, Yongsheng [Sterling Heights, MI

    2011-08-16

    A control system for reducing ash comprises a temperature estimator module that estimates a temperature of an electrically heated particulate matter (PM) filter. A temperature and position estimator module estimates a position and temperature of an oxidation wave within the electrically heated PM filter. An ash reduction control module adjusts at least one of exhaust flow, fuel and oxygen levels in the electrically heated PM filter to adjust a position of the oxidation wave within the electrically heated PM filter based on the oxidation wave temperature and position.

  14. Quantitative genetic tools for insecticide resistance risk assessment: estimating the heritability of resistance

    Science.gov (United States)

    Michael J. Firko; Jane Leslie Hayes

    1990-01-01

    Quantitative genetic studies of resistance can provide estimates of genetic parameters not available with other types of genetic analyses. Three methods are discussed for estimating the amount of additive genetic variation in resistance to individual insecticides and subsequent estimation of heritability (h²) of resistance. Sibling analysis and...

  15. Breast dose reduction for chest CT by modifying the scanning parameters based on the pre-scan size-specific dose estimate (SSDE)

    Energy Technology Data Exchange (ETDEWEB)

    Kidoh, Masafumi; Utsunomiya, Daisuke; Oda, Seitaro; Nakaura, Takeshi; Yuki, Hideaki; Hirata, Kenichiro; Namimoto, Tomohiro; Sakabe, Daisuke; Hatemura, Masahiro; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Faculty of Life Sciences, Honjo, Kumamoto (Japan); Funama, Yoshinori [Kumamoto University, Department of Medical Physics, Faculty of Life Sciences, Honjo, Kumamoto (Japan)

    2017-06-15

    To investigate the usefulness of modifying scanning parameters based on the size-specific dose estimate (SSDE) for breast-dose reduction in chest CT. We scanned 26 women with a fixed volume CT dose index (CTDIvol) protocol (15 mGy) and another 26 with a fixed SSDE protocol (15 mGy) (protocols 1 and 2, respectively). In protocol 2, tube current was calculated based on the patient habitus obtained from scout images. We compared the mean breast dose and the inter-patient breast dose variability, and performed linear regression analysis of the breast dose and the body mass index (BMI) for the two protocols. The mean breast dose was about 35 % lower under protocol 2 than protocol 1 (10.9 mGy vs. 16.8 mGy, p < 0.01). The inter-patient breast dose variability was significantly lower under protocol 2 than 1 (1.2 mGy vs. 2.5 mGy, p < 0.01). We observed a moderate negative correlation between the breast dose and the BMI under protocol 1 (r = 0.43, p < 0.01); there was no significant correlation (r = 0.06, p = 0.35) under protocol 2. The SSDE-based protocol achieved a reduction in breast dose and in inter-patient breast dose variability. (orig.)
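
    The idea of an SSDE-based protocol can be sketched as follows (a schematic with placeholder coefficients; in practice the AAPM Report 204 conversion tables should be used): SSDE is CTDIvol times a size-dependent conversion factor, and the tube current is scaled so the predicted SSDE hits the 15 mGy target.

    ```python
    import math

    def ssde(ctdi_vol_mGy, eff_diameter_cm, a=3.70, b=0.037):
        """Size-specific dose estimate: SSDE = f(size) * CTDIvol.
        a, b are placeholder fit coefficients standing in for the
        AAPM Report 204 lookup (32 cm phantom); do not use clinically."""
        f = a * math.exp(-b * eff_diameter_cm)
        return f * ctdi_vol_mGy

    def scaled_mAs(mAs_ref, ctdi_ref_mGy, eff_diameter_cm, target_ssde_mGy=15.0):
        """Scale tube current-time product so the predicted SSDE equals the
        target (CTDIvol is proportional to mAs at fixed kV and pitch)."""
        predicted = ssde(ctdi_ref_mGy, eff_diameter_cm)
        return mAs_ref * target_ssde_mGy / predicted

    print(scaled_mAs(mAs_ref=200, ctdi_ref_mGy=15.0, eff_diameter_cm=28.0))
    ```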

  16. Reduction of Topographic Effect for Curve Number Estimated from Remotely Sensed Imagery

    Science.gov (United States)

    Zhang, Wen-Yan; Lin, Chao-Yuan

    2016-04-01

    The Soil Conservation Service Curve Number (SCS-CN) method is commonly used in hydrology to estimate direct runoff volume. The CN is an empirical parameter corresponding to land use/land cover, hydrologic soil group and antecedent soil moisture condition. In large watersheds with complex topography, satellite remote sensing is an appropriate approach to acquire land use change information. However, topographic effects are usually present in remotely sensed imagery and degrade land use classification. This research selected summer and winter scenes of Landsat-5 TM from 2008 to classify land use in the Chen-You-Lan Watershed, Taiwan. The b-correction, an empirical topographic correction method, was applied to the Landsat-5 TM data. Land use was categorized using K-means classification into 4 groups, i.e. forest, grassland, agriculture and river. Accuracy assessment of the image classification was performed against the national land use map. The results showed that after topographic correction, the overall accuracy of the classification increased from 68.0% to 74.5%. The average CN estimated from remotely sensed imagery decreased from 48.69 to 45.35, where the average CN estimated from the national LULC map was 44.11. Therefore, the topographic correction method is recommended to normalize the topographic effect in satellite remote sensing data before estimating the CN.
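
    For orientation, the simplest member of this family of corrections is the cosine correction, sketched below (the paper uses the empirical b-correction, which adds a fitted offset; this is only the classic baseline): radiance is rescaled by the ratio of the cosine of the solar zenith angle to the cosine of the local illumination angle.

    ```python
    import numpy as np

    def cosine_topographic_correction(radiance, slope, aspect,
                                      sun_zenith, sun_azimuth):
        """Classic cosine topographic correction (all angles in radians).
        cos_i is the illumination angle on a tilted pixel."""
        cos_i = (np.cos(sun_zenith) * np.cos(slope) +
                 np.sin(sun_zenith) * np.sin(slope) * np.cos(sun_azimuth - aspect))
        cos_i = np.clip(cos_i, 0.1, None)   # guard weakly illuminated pixels
        return radiance * np.cos(sun_zenith) / cos_i
    ```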

  17. Fiscal 1997 research report. International energy use rationalization project (Analytical tool research project for energy consumption efficiency improvement in Asia); 1997 nendo kokusai energy shiyo gorika nado taisaku jigyo chosa hokokusho. Asia energy shohi koritsuka bunseki tool chosa jigyo (honpen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-05-01

    Efforts have been under way to prepare inter-industry relations tables and energy data for four Asian countries, namely China, Taiwan, Singapore and Malaysia, and a tool for energy consumption efficiency analysis has been developed and improved. In Chapter 1, energy supply and demand in the above-named four countries is reviewed on the basis of recent economic conditions in these countries. In Chapter 2, the bilateral inter-industry relations tables usable under the project are employed for the analysis of the economic status of each of the countries and the energy transactions between them, and a method is described for converting the tables into one-nation inter-industry relations tables which meet the needs of this project. In Chapter 3, national characteristics reflected in the respective energy input tables are described, and a method is shown for converting a nationally characterized unit energy table into a common unit energy input table for registration with a database. In Chapter 4, the constitution of the Asian energy consumption efficiency improvement analyzing tool and a system using the tool are explained. In Chapter 5, some examples of analyses conducted with the analyzing tool are shown, in which the energy saving and CO2 emission reduction effects are estimated for Indonesia. (NEDO)

  18. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    Science.gov (United States)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities for Earth-orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high-fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one at a time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems-level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques, with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities

  19. Variance Reduction Techniques in Monte Carlo Methods

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.; Ridder, A.A.N.; Rubinstein, R.Y.

    2010-01-01

    Monte Carlo methods are simulation algorithms to estimate a numerical quantity in a statistical model of a real system. These algorithms are executed by computer programs. Variance reduction techniques (VRT) are needed, even though computer speed has been increasing dramatically, ever since the
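
    As a concrete instance of a VRT (a generic textbook example, not one of the chapter's specific cases): antithetic variates estimate E[exp(U)] for U ~ Uniform(0,1) by averaging f(U) with f(1-U); the negative correlation between the pair lowers the estimator's variance at equal sampling cost.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    u = rng.random(n)

    f = np.exp  # estimate E[exp(U)] = e - 1 ~ 1.71828

    crude = f(u)                            # plain Monte Carlo
    antithetic = 0.5 * (f(u) + f(1.0 - u))  # antithetic pairs, same draws

    print(f"crude:      {crude.mean():.5f} (var {crude.var():.2e})")
    print(f"antithetic: {antithetic.mean():.5f} (var {antithetic.var():.2e})")
    ```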

  20. PROBLEMS OF ICT-BASED TOOLS ESTIMATION IN THE CONTEXT OF INFORMATION SOCIETY FORMATION

    Directory of Open Access Journals (Sweden)

    M. Shyshkina

    2012-03-01

    Full Text Available The article describes problems of improving the quality of implementation and use of e-learning tools, which arise in the context of increasing the quality and accessibility of education. It is determined that those issues are closely linked to specific scientific and methodological approaches to the evaluation of quality, selection and use of ICT-based tools, in view of the emergence of promising information technology platforms for the implementation and delivery of these resources.

  1. The Role of Zakah and Binary Economics in Poverty Reduction

    OpenAIRE

    Aisyah, Muniaty

    2014-01-01

    Poverty reduction remains the most important challenge for every country. Zakah, as an Islamic faith-based institution, is a strategic tool for combating poverty. This study aims to identify the role of zakah and compare its principles with an overview of the characteristics and practices of binary economics, which also provides a systemic solution for poverty. The study shows that zakah has an essential role in economic growth and poverty reduction in Muslim communities, as well as,...

  2. Estimation of portion size in children's dietary assessment: lessons learnt.

    Science.gov (United States)

    Foster, E; Adamson, A J; Anderson, A S; Barton, K L; Wrieden, W L

    2009-02-01

    Assessing the dietary intake of young children is challenging. In any 1 day, children may have several carers responsible for providing them with their dietary requirements, and once children reach school age, traditional methods such as weighing all items consumed become impractical. As an alternative to weighed records, food portion size assessment tools are available to assist subjects in estimating the amounts of foods consumed. Existing food photographs designed for use with adults and based on adult portion sizes have been found to be inappropriate for use with children. This article presents a review and summary of a body of work carried out to improve the estimation of portion sizes consumed by children. Feasibility work was undertaken to determine the accuracy and precision of three portion size assessment tools; food photographs, food models and a computer-based Interactive Portion Size Assessment System (IPSAS). These tools were based on portion sizes served to children during the National Diet and Nutrition Survey. As children often do not consume all of the food served to them, smaller portions were included in each tool for estimation of leftovers. The tools covered 22 foods, which children commonly consume. Children were served known amounts of each food and leftovers were recorded. They were then asked to estimate both the amount of food that they were served and the amount of any food leftover. Children were found to estimate food portion size with an accuracy approaching that of adults using both the food photographs and IPSAS. Further development is underway to increase the number of food photographs and to develop IPSAS to cover a much wider range of foods and to validate the use of these tools in a 'real life' setting.

  3. 3D Design Tools for Vacuum Electron Devices

    International Nuclear Information System (INIS)

    Levush, Baruch

    2003-01-01

    A reduction of development costs will have a significant impact on the total cost of the vacuum electron devices. Experimental testing cycles can be reduced or eliminated through the use of simulation-based design methodology, thereby reducing the time and cost of development. Moreover, by use of modern optimization tools for automating the process of seeking specific solution parameters and for studying dependencies of performance on parameters, new performance capabilities can be achieved, without resorting to expensive cycles of hardware fabrication and testing. Simulation-based-design will also provide the basis for sensitivity studies for determining the manufacturing tolerances associated with a particular design. Since material properties can have a critical effect on the performance of the vacuum electron devices, the design tools require precise knowledge of material characteristics, such as dielectric properties of the support rods, loss profile etc. Sensitivity studies must therefore include the effects of materials properties variation on device performance. This will provide insight for choosing the proper technological processes in order to achieve these tolerances, which is of great importance for achieving cost reduction. A successful design methodology depends on the development of accurate and efficient design tools with predictive capabilities. These design tools must be based on realistic models capable of high fidelity representation of geometry and materials, they must have optimization capabilities, and they must be easy to use

  4. Multiple-hit parameter estimation in monolithic detectors.

    Science.gov (United States)

    Hunter, William C J; Barrett, Harrison H; Lewellen, Tom K; Miyaoka, Robert S

    2013-02-01

    We examine a maximum-a-posteriori method for estimating the primary interaction position of gamma rays with multiple interaction sites (hits) in a monolithic detector. In assessing the performance of a multiple-hit estimator over that of a conventional one-hit estimator, we consider a few different detector and readout configurations of a 50-mm-wide square cerium-doped lutetium oxyorthosilicate block. For this study, we use simulated data from SCOUT, a Monte-Carlo tool for photon tracking and modeling scintillation-camera output. With this tool, we determine estimate bias and variance for a multiple-hit estimator and compare these with similar metrics for a one-hit maximum-likelihood estimator, which assumes full energy deposition in one hit. We also examine the effect of event filtering on these metrics; for this purpose, we use a likelihood threshold to reject signals that are not likely to have been produced under the assumed likelihood model. Depending on detector design, we observe a 1%-12% improvement of intrinsic resolution for a 1-or-2-hit estimator as compared with a 1-hit estimator. We also observe improved differentiation of photopeak events using a 1-or-2-hit estimator as compared with the 1-hit estimator; more than 6% of photopeak events that were rejected by likelihood filtering for the 1-hit estimator were accurately identified as photopeak events and positioned without loss of resolution by a 1-or-2-hit estimator; for PET, this equates to at least a 12% improvement in coincidence-detection efficiency with likelihood filtering applied.
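
    A stripped-down version of the one-hit estimation step is sketched below (a toy under strong assumptions, namely independent Gaussian sensor noise and a precomputed mean-response model; it is not the SCOUT-based estimator): the interaction position is the grid point maximizing the log-likelihood of the observed sensor signals. The paper's likelihood filtering corresponds to rejecting events whose maximum log-likelihood falls below a threshold.

    ```python
    import numpy as np

    def ml_position(signals, mean_response, sigma):
        """One-hit maximum-likelihood positioning on a grid.
        signals: (n_sensors,) observed outputs
        mean_response: (n_grid, n_sensors) calibrated mean output per position
        sigma: Gaussian noise standard deviation (same units as signals)."""
        # log-likelihood up to a constant, independent Gaussian noise assumed
        ll = -0.5 * np.sum((mean_response - signals) ** 2, axis=1) / sigma**2
        return np.argmax(ll), ll.max()

    # toy 1-D detector: 4 sensors whose mean light falls off with distance
    grid = np.linspace(0.0, 50.0, 101)            # candidate positions [mm]
    sensors = np.array([5.0, 20.0, 35.0, 50.0])   # sensor centres [mm]
    mean_response = 100.0 / (1.0 + ((grid[:, None] - sensors[None, :]) / 10.0) ** 2)

    rng = np.random.default_rng(0)
    true_idx = 60
    obs = mean_response[true_idx] + rng.normal(0.0, 2.0, size=4)
    idx, _ = ml_position(obs, mean_response, sigma=2.0)
    print(f"true {grid[true_idx]:.1f} mm, estimated {grid[idx]:.1f} mm")
    ```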

  5. Phase-amplitude reduction of transient dynamics far from attractors for limit-cycling systems

    Science.gov (United States)

    Shirasaka, Sho; Kurebayashi, Wataru; Nakao, Hiroya

    2017-02-01

    The phase reduction framework for limit-cycling systems based on isochrons has been used as a powerful tool for analyzing rhythmic phenomena. Recently, the notion of isostables, which complements the isochrons by characterizing amplitudes of the system state, i.e., deviations from the limit-cycle attractor, has been introduced to describe the transient dynamics around the limit cycle [Wilson and Moehlis, Phys. Rev. E 94, 052213 (2016)]. In this study, we introduce a framework for a reduced phase-amplitude description of transient dynamics of stable limit-cycling systems. In contrast to the preceding study, the isostables are treated in a way fully consistent with the Koopman operator analysis, which enables us to avoid discontinuities of the isostables and to apply the framework to system states far from the limit cycle. We also propose a new, convenient bi-orthogonalization method to obtain the response functions of the amplitudes, which can be interpreted as an extension of the adjoint covariant Lyapunov vector to transient dynamics in limit-cycling systems. We illustrate the utility of the proposed reduction framework by estimating the optimal injection timing of external input that efficiently suppresses deviations of the system state from the limit cycle in a model of a biochemical oscillator.

  6. An elaborated feeding cycle model for reductions in vectorial capacity of night-biting mosquitoes by insecticide-treated nets.

    Science.gov (United States)

    Le Menach, Arnaud; Takala, Shannon; McKenzie, F Ellis; Perisse, Andre; Harris, Anthony; Flahault, Antoine; Smith, David L

    2007-01-25

    Insecticide Treated Nets (ITNs) are an important tool for malaria control. ITNs are effective because they act on several parts of the mosquito feeding cycle, including both adult-killing and repelling effects. Using an elaborated description of the classic feeding cycle model, simple formulas have been derived to describe how ITNs change mosquito behaviour and the intensity of malaria transmission, as summarized by vectorial capacity and EIR. The predicted changes are illustrated as a function of the frequency of ITN use for four different vector populations, using parameter estimates from the literature. The model demonstrates that ITNs simultaneously reduce mosquitoes' lifespans, lengthen the feeding cycle, and, by discouraging human biting, divert more bites onto non-human hosts. ITNs can substantially reduce vectorial capacity through small changes to all of these quantities. The total reductions in vectorial capacity differ, moreover, depending on baseline behaviour in the absence of ITNs. Reductions in lifespan and vectorial capacity are strongest for vector species with high baseline survival. Anthropophilic and zoophilic species are affected differently by ITNs: the feeding cycle is lengthened more for anthropophilic species, and the proportion of bites diverted onto non-human hosts is higher for zoophilic species. This model suggests that the efficacy of ITNs should be measured as a total reduction in transmission intensity, and that the quantitative effects will differ by species and by transmission intensity. At very high rates of ITN use, ITNs can generate reductions in vectorial capacity large enough to provide effective malaria control in some areas, especially when used in combination with other control measures. At high EIR, ITNs will probably not substantially reduce the parasite rate, but when transmission intensity is low, reductions in vectorial capacity combine with reductions in
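
    The quantitative heart of this argument is the classic Ross-Macdonald vectorial capacity formula; the sketch below (with made-up parameter values, not the paper's estimates or its elaborated feeding-cycle model) shows how modest simultaneous changes to biting and survival compound into a large reduction.

    ```python
    import math

    def vectorial_capacity(m, a, p, n):
        """Classic Ross-Macdonald vectorial capacity:
        m: mosquitoes per human; a: human bites per mosquito per day;
        p: daily survival probability; n: extrinsic incubation period [days]."""
        return m * a**2 * p**n / (-math.log(p))

    baseline = vectorial_capacity(m=10.0, a=0.3, p=0.9, n=10)

    # hypothetical ITN effects: 40% of bites diverted, 15% extra daily mortality
    with_itn = vectorial_capacity(m=10.0, a=0.3 * 0.6, p=0.9 * 0.85, n=10)
    print(f"reduction in vectorial capacity: {100 * (1 - with_itn / baseline):.0f}%")
    ```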

  7. A tool to estimate bar patterns and flow conditions in estuaries when limited data is available

    Science.gov (United States)

    Leuven, J.; Verhoeve, S.; Bruijns, A. J.; Selakovic, S.; van Dijk, W. M.; Kleinhans, M. G.

    2017-12-01

    The effects of human interventions, natural evolution of estuaries and rising sea level on food security and flood safety are largely unknown. In addition, ecologists require quantified habitat areas to study the future evolution of estuaries, but they lack predictive capability for bathymetry and hydrodynamics. For example, crucial inputs required for ecological models are values of intertidal area, inundation time, peak flow velocities and salinity. While numerical models can reproduce these spatial patterns, their computational times are long and a new model must be developed for each case. Therefore, we developed a comprehensive set of relations that accurately predict the hydrodynamics and the patterns of channels and bars, using a combination of empirical relations derived from approximately 50 estuaries and theory for bars and estuaries. The first step is to predict local tidal prisms, i.e. the tidal prism that flows through a given cross-section. Second, the channel geometry is predicted from the tidal prism and hydraulic geometry relations. Subsequently, typical flow velocities can be estimated from the channel geometry and tidal prism. Then, an ideal estuary shape is fitted to the measured planform: the deviation from the ideal shape, defined as the excess width, gives a measure of the locations where tidal bars form and their summed width (Leuven et al., 2017). From the excess width, typical hypsometries can be predicted per cross-section. In the last step, flow velocities are calculated for the full range of occurring depths, and salinity is calculated based on the estuary shape. Here, we present a prototype tool that predicts equilibrium bar patterns and typical flow conditions. The tool is easy to use because the only input required is the estuary outline and tidal amplitude; it can therefore be used by policy makers and researchers from multiple disciplines, such as ecologists, geologists and hydrologists, for example for paleogeographic
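
    The chain from tidal prism to velocity can be sketched with textbook tidal hydraulic-geometry power laws (the coefficients below are placeholders, not the values calibrated on the ~50-estuary dataset): cross-sectional area scales with tidal prism, and peak velocity follows from moving the prism through that area in half a tidal cycle.

    ```python
    import math

    def channel_area(prism_m3, k=1.0e-4, m=0.95):
        """Tidal hydraulic geometry: cross-sectional area A = k * P^m.
        k, m are placeholder coefficients (O'Brien-type relation)."""
        return k * prism_m3**m

    def peak_velocity(prism_m3, area_m2, tidal_period_s=44700.0):
        """Sinusoidal-flow estimate: U_peak = pi * P / (A * T)."""
        return math.pi * prism_m3 / (area_m2 * tidal_period_s)

    P = 5.0e7  # local tidal prism [m^3], hypothetical
    A = channel_area(P)
    print(f"A = {A:.0f} m^2, U_peak = {peak_velocity(P, A):.2f} m/s")
    ```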

  8. Advancing the research agenda for diagnostic error reduction.

    Science.gov (United States)

    Zwaan, Laura; Schiff, Gordon D; Singh, Hardeep

    2013-10-01

    Diagnostic errors remain an underemphasised and understudied area of patient safety research. We briefly summarise the methods that have been used to conduct research on the epidemiology, contributing factors and interventions related to diagnostic error, and outline directions for future research. Research methods that have studied the epidemiology of diagnostic error provide some estimates of diagnostic error rates. However, there appears to be large variability in the reported rates due to the heterogeneity of definitions and study methods used. Thus, future methods should focus on obtaining more precise estimates in different settings of care. This would lay the foundation for measuring error rates over time to evaluate improvements. Research methods have studied contributing factors for diagnostic error in both naturalistic and experimental settings. Both approaches have revealed important and complementary information. Newer conceptual models from outside healthcare are needed to advance the depth and rigour of analysis of systems and cognitive insights of causes of error. While the literature has suggested many potentially fruitful interventions for reducing diagnostic errors, most have not been systematically evaluated and/or widely implemented in practice. Research is needed to study promising intervention areas such as enhanced patient involvement in diagnosis, improving diagnosis through the use of electronic tools and identification and reduction of specific diagnostic process 'pitfalls' (eg, failure to conduct appropriate diagnostic evaluation of a breast lump after a 'normal' mammogram). The last decade of research on diagnostic error has made promising steps and laid a foundation for more rigorous methods to advance the field.

  9. Duplicate laboratory test reduction using a clinical decision support tool.

    Science.gov (United States)

    Procop, Gary W; Yerian, Lisa M; Wyllie, Robert; Harrison, A Marc; Kottke-Marchant, Kandice

    2014-05-01

    Duplicate laboratory tests that are unwarranted increase unnecessary phlebotomy, which contributes to iatrogenic anemia, decreased patient satisfaction, and increased health care costs. We employed a clinical decision support tool (CDST) to block unnecessary duplicate test orders during the computerized physician order entry (CPOE) process. We assessed laboratory cost savings after 2 years and searched for untoward patient events associated with this intervention. This CDST blocked 11,790 unnecessary duplicate test orders in these 2 years, which resulted in a cost savings of $183,586. There were no untoward effects reported associated with this intervention. The movement to CPOE affords real-time interaction between the laboratory and the physician through CDSTs that signal duplicate orders. These interactions save health care dollars and should also increase patient satisfaction and well-being.

  10. Dose Reduction Techniques

    International Nuclear Information System (INIS)

    WAGGONER, L.O.

    2000-01-01

    As radiation safety specialists, one of the things we are required to do is evaluate tools, equipment, materials and work practices and decide whether the use of these products or work practices will reduce radiation dose or risk to the environment. There is a tendency for many workers who work with radioactive material to accomplish radiological work the same way they have always done it, rather than look for new technology or change their work practices. New technology is being developed all the time that can make radiological work easier and result in less radiation dose to the worker or reduce the possibility that contamination will be spread to the environment. As we discuss the various tools and techniques that reduce radiation dose, keep in mind that the radiological controls should be reasonable. We cannot always get the dose to zero, so we must try to accomplish the work efficiently and cost-effectively. There are times we may have to accept that there is only so much you can do. The goal is to do the smart things that protect the worker but do not hinder him while the task is being accomplished. In addition, we should not demand that large amounts of money be spent for equipment that has marginal value in order to save a few millirem. We have broken the handout into sections that should simplify the presentation. Time, distance, shielding, and source reduction are methods used to reduce dose and are covered in Part I on work execution. We then look at operational considerations, radiological design parameters, and discuss the characteristics of personnel who deal with ALARA. This handout should give you an overview of what it takes to have an effective dose reduction program

  12. Creation of complexity assessment tool for patients receiving home care

    Directory of Open Access Journals (Sweden)

    Maria Leopoldina de Castro Villas Bôas

    2016-06-01

    Full Text Available Abstract OBJECTIVE To create and validate a complexity assessment tool for patients receiving home care from a public health service. METHOD A diagnostic accuracy study, with estimates of the tool's validity and reliability. Measurements of sensitivity and specificity were considered when producing the validity estimates, and the resulting tool was used for testing, with assessment by a specialized team of home care professionals serving as the gold standard. In the tool's reliability study, the authors used the Kappa statistic. The tool's sensitivity and specificity were analyzed using various cut-off points. RESULTS At the best cut-off point (21), compared with the gold standard, a sensitivity of 75.5% was obtained, with 95% confidence interval limits of 68.3% and 82.8%, and a specificity of 53.2%, with 95% confidence interval limits of 43.8% and 62.7%. CONCLUSION The tool presented evidence of validity and reliability, possibly helping in service organization at patient admission, care type change, or support during the creation of care plans.
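
    The validity and reliability metrics used here follow directly from a 2x2 confusion table; the sketch below (with made-up counts, not the study's data) shows the computation, including Cohen's kappa for agreement beyond chance.

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity and Cohen's kappa from a 2x2 table."""
        n = tp + fp + fn + tn
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        po = (tp + tn) / n                                            # observed agreement
        pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # chance agreement
        kappa = (po - pe) / (1.0 - pe)
        return sens, spec, kappa

    print(diagnostic_metrics(tp=80, fp=44, fn=26, tn=50))  # hypothetical counts
    ```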

  13. Estimation of spectral kurtosis

    Science.gov (United States)

    Sutawanir

    2017-03-01

    Rolling bearings are among the most important elements in rotating machinery, and they frequently fall out of service for various reasons: heavy loads, unsuitable lubrication, ineffective sealing. Bearing faults may cause a decrease in performance, so the analysis of bearing vibration signals has attracted attention in the field of condition monitoring and fault diagnosis; such signals carry rich information for the early detection of bearing failures. Spectral kurtosis (SK) is a parameter in the frequency domain indicating how the impulsiveness of a signal varies with frequency. Faults in rolling bearings give rise to a series of short impulse responses as the rolling elements strike faults, so SK is potentially useful for determining frequency bands dominated by bearing fault signals; it can provide a measure of the distance of the analyzed bearing from a healthy one and adds information beyond the power spectral density (psd). This paper explores the estimation of spectral kurtosis using the short-time Fourier transform, known as the spectrogram. The estimation of SK is similar to the estimation of the psd and falls within model-free, plug-in estimation. Some numerical studies using simulations are discussed to support the methodology: the spectral kurtosis of some stationary signals is obtained analytically and used in the simulation study. Kurtosis in the time domain has been a popular tool for detecting non-normality, and spectral kurtosis extends it to the frequency domain; the relationship between time-domain and frequency-domain analysis is established through the autocovariance-power spectrum Fourier-transform pair, with the power spectral density estimated through the periodogram. In this paper, the short-time Fourier transform estimate of the spectral kurtosis is reviewed and a bearing fault (inner ring and outer ring) is simulated; the bearing response, power spectrum, and spectral kurtosis are plotted to
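
    A compact plug-in estimator of SK from the spectrogram can be written as below (a generic sketch of the standard STFT-based estimator with a toy simulated fault, not the paper's code): SK(f) = E|X(f,t)|^4 / (E|X(f,t)|^2)^2 - 2, which is zero for stationary Gaussian noise.

    ```python
    import numpy as np
    from scipy.signal import stft

    def spectral_kurtosis(x, fs, nperseg=256):
        """Plug-in spectral kurtosis from the STFT (spectrogram)."""
        f, _, X = stft(x, fs=fs, nperseg=nperseg)
        p2 = np.mean(np.abs(X) ** 2, axis=1)
        p4 = np.mean(np.abs(X) ** 4, axis=1)
        return f, p4 / p2**2 - 2.0

    # white Gaussian noise plus repetitive decaying bursts near 3 kHz (toy fault)
    fs = 20_000
    t = np.arange(0, 1.0, 1 / fs)
    x = np.random.default_rng(0).standard_normal(t.size)
    impulses = np.sin(2 * np.pi * 3000 * t) * np.exp(-50 * (t % 0.02))

    f, sk = spectral_kurtosis(x + 5 * impulses, fs)
    print(f[np.argmax(sk)])  # frequency band flagged as most impulsive
    ```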

  14. REDUCTION CAPACITY OF SALTSTONE AND SALTSTONE COMPONENTS

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, K.; Kaplan, D.

    2009-11-30

    The duration that saltstone retains its ability to immobilize some key radionuclides, such as technetium (Tc), plutonium (Pu), and neptunium (Np), depends on its capacity to maintain a low redox status (or low oxidation state). The reduction capacity is a measure of the mass of reductants present in the saltstone; the reductants are the active ingredients that immobilize Tc, Pu, and Np. Once reductants are exhausted, the saltstone loses its ability to immobilize these radionuclides. The reduction capacity values reported here are based on the Ce(IV)/Fe(II) system. The Portland cement (198 µeq/g) and especially the fly ash (299 µeq/g) had a measurable amount of reduction capacity, but the blast furnace slag (820 µeq/g) not surprisingly accounted for most of the reduction capacity. The blast furnace slag contains ferrous iron and sulfides, which are strong reducing and precipitating species for a large number of solids. Three saltstone samples containing 45% slag, or one sample containing 90% slag, had essentially the same reduction capacity as pure slag. There appears to be some critical concentration between 10% and 45% slag in the saltstone formulation that is needed to create the maximum reduction capacity. Values from this work supported those previously reported, namely that the reduction capacity of SRS saltstone is about 820 µeq/g; this value is recommended for estimating the longevity that the Saltstone Disposal Facility will retain its ability to immobilize radionuclides.
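
    A first-order check on these numbers is a mass-weighted mixing calculation (a back-of-the-envelope sketch using the component values quoted above; the measured saltstone values in the report supersede it): if capacities mixed linearly, a 45% slag blend would sit well below the pure-slag value, so the observation that it matches pure slag points to the threshold-like behaviour the authors describe.

    ```python
    # Mass-weighted linear-mixing estimate of reduction capacity [ueq/g],
    # using the component values quoted in the abstract.
    components = {"slag": 820.0, "fly_ash": 299.0, "cement": 198.0}

    def linear_mix(fractions):
        """fractions: mass fraction per component, summing to 1."""
        return sum(components[k] * w for k, w in fractions.items())

    # hypothetical 45% slag / 45% fly ash / 10% cement blend
    print(linear_mix({"slag": 0.45, "fly_ash": 0.45, "cement": 0.10}))
    # ~523 ueq/g, far below the ~820 ueq/g actually measured for 45% slag mixes
    ```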

  15. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Science.gov (United States)

    2010-07-01

    ... Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for... identify and estimate safety and environmental management risks and appropriate risk reduction strategies... responsible for identifying/estimating risks and for appropriate risk reduction strategies? 102-80.50 Section...

  16. THELI: CONVENIENT REDUCTION OF OPTICAL, NEAR-INFRARED, AND MID-INFRARED IMAGING DATA

    International Nuclear Information System (INIS)

    Schirmer, M.

    2013-01-01

    The last 15 years have seen a surge of new multi-chip optical and near-IR imagers. While some of them are accompanied by specific reduction pipelines, user-friendly and generic reduction tools are uncommon. In this paper I introduce THELI, an easy-to-use graphical interface driving an end-to-end pipeline for the reduction of any optical, near-IR, and mid-IR imaging data. The advantages of THELI when compared to other approaches are highlighted. Combining a multitude of processing algorithms and third party software, THELI provides researchers with a single, homogeneous tool. A short learning curve ensures quick success for new and more experienced observers alike. All tasks are largely automated, while at the same time a high level of flexibility and alternative reduction schemes ensure that widely different scientific requirements can be met. Over 90 optical and infrared instruments at observatories world-wide are pre-configured, while more can be added by the user. The Appendices contain three walk-through examples using public data (optical, near-IR, and mid-IR). Additional extensive documentation for training and troubleshooting is available online

  17. Canadian options for greenhouse gas emission reduction (COGGER)

    International Nuclear Information System (INIS)

    Robinson, J.; Fraser, M.; Haites, E.; Harvey, D.; Jaccard, M.; Reinsch, A.; Torrie, R.

    1993-09-01

    A panel was formed to assess the feasibility and cost of energy-related greenhouse gas (GHG) emission reductions in Canada. The panel studies focused on the potential for increased energy efficiency and fuel switching and their effect in reducing CO2 emissions, by reviewing the extensive literature available on those topics and assessing their conclusions. Economically feasible energy savings are estimated mostly in the range of 20-40% by the year 2010 relative to a reference-case projection, with a median of 23%. The panel concluded that achieving the identified economic potential for increased energy efficiency by 2010 will depend on the development of additional demand-side management or energy efficiency programs that go well beyond current policies and programs. Fuel switching will play a much smaller role in stabilizing energy-related CO2 emissions than improved energy efficiency. Technology substitution and broader structural change would enable Canada to achieve significant reductions in CO2 emissions; however, more research is needed on achieving emission reductions that would approach the levels estimated to be required globally for stabilization of atmospheric CO2 concentrations. Achieving such emission reductions would likely require a combination of significant improvements in energy efficiency, major changes in energy sources, and substantial changes in economic activity and lifestyles, relative to those projected in most reference-case forecasts. 5 refs., 1 fig., 10 tabs

  18. Iterative PSF Estimation and Its Application to Shift Invariant and Variant Blur Reduction

    Directory of Open Access Journals (Sweden)

    Seung-Won Jung

    2009-01-01

    Full Text Available Among image restoration approaches, image deconvolution has been considered a powerful solution. In image deconvolution, a point spread function (PSF), which describes the blur of the image, needs to be determined. Therefore, in this paper, we propose an iterative PSF estimation algorithm which is able to estimate an accurate PSF. In real-world motion-blurred images, a simple parametric model of the PSF fails when a camera moves in an arbitrary direction with an inconsistent speed during the exposure time. Moreover, the PSF normally changes with spatial location. In order to accurately estimate the complex PSF of a real motion-blurred image, we iteratively update the PSF by using a directional spreading operator. The directional spreading is applied to the PSF when it reduces the amount of blur and the restoration artifacts. Then, to generalize the proposed technique to the linear shift variant (LSV) model, a piecewise-invariant approach is adopted via the proposed image segmentation method. Experimental results show that the proposed method effectively estimates the PSF and restores the degraded images.
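
    Once a PSF estimate is available, the deconvolution step itself can be any standard method; below is a minimal Richardson-Lucy iteration (a generic algorithm shown for illustration, not necessarily the restoration method used in the paper).

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(blurred, psf, n_iter=30):
        """Generic Richardson-Lucy deconvolution given a PSF estimate."""
        psf = psf / psf.sum()
        psf_mirror = psf[::-1, ::-1]
        estimate = np.full_like(blurred, blurred.mean())
        for _ in range(n_iter):
            denom = fftconvolve(estimate, psf, mode="same")
            ratio = blurred / np.maximum(denom, 1e-12)
            estimate *= fftconvolve(ratio, psf_mirror, mode="same")
        return estimate

    # toy example: blur an image with a small motion PSF, then restore it
    rng = np.random.default_rng(0)
    img = rng.random((64, 64))
    psf = np.zeros((7, 7)); psf[3, :] = 1.0   # horizontal motion blur
    blurred = fftconvolve(img, psf / psf.sum(), mode="same")
    restored = richardson_lucy(blurred, psf)
    ```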

  19. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    National Research Council Canada - National Science Library

    Petre, P; Visher, J; Shringarpure, R; Valley, F; Swaminathan, M

    2005-01-01

    Automated design tools and integrated design flow methodologies were developed that demonstrated more than an order-of-magnitude reduction in cycle time and cost for mixed signal (digital/analog/RF...

  20. An online tool for tracking soil nitrogen

    Science.gov (United States)

    Wang, J.; Umar, M.; Banger, K.; Pittelkow, C. M.; Nafziger, E. D.

    2016-12-01

    Near real-time crop models can be useful tools for optimizing agricultural management practices. For example, model simulations can potentially provide current estimates of nitrogen availability in soil, helping growers decide whether more nitrogen needs to be applied in a given season. Traditionally, crop models have been used at point locations (i.e. single fields) with homogeneous soil, climate and initial conditions. However, nitrogen availability across fields with varied weather and soil conditions at a regional or national level is necessary to guide better management decisions. This study presents the development of a publicly available, online tool that automates the integration of high-spatial-resolution forecast and past weather and soil data in DSSAT to estimate nitrogen availability for individual fields in Illinois. The model has been calibrated with field experiments from the past year at six research corn fields across Illinois. These sites were treated with different N fertilizer timings and amounts. The tool requires minimal management information from growers and yet has the capability to simulate nitrogen-water-crop interactions with calibrated parameters that are more appropriate for Illinois. The results from the tool will be combined with incoming field experiment data from 2016 for model validation and further improvement of the model's predictive accuracy. The tool has the potential to help guide better nitrogen management practices to maximize economic and environmental benefits.

  1. Micromilling of hardened tool steel for mould making applications

    DEFF Research Database (Denmark)

    Bissacco, Giuliano; Hansen, Hans Nørgaard; De Chiffre, Leonardo

    2005-01-01

    geometries as those characterizing injection moulding moulds. The realization of the micromilling process in connection with hardened tool steel as workpiece material is particularly challenging. The low strength of the miniaturized end mills implies reduction and accurate control of the chip load which ... wear. This paper presents the micromilling process applied to the manufacturing of micro injection moulding moulds in hardened tool steel, presenting experimental evidence and possible solutions to the above-mentioned issues....

  2. Progress report on the development of remotely operated tools

    International Nuclear Information System (INIS)

    Smith, A.T.

    1984-08-01

    This report contains a number of individual trials reports based upon work conducted in aid of a programme of feasibility studies into the size reduction of radioactively contaminated solid waste. The work was directed towards the identification of acceptable remotely operated tools and the means of deploying them for dismantling operations in a radioactive environment. Reliability, ease of maintenance, change of tool bits and common power sources have been major considerations in the trials assessments. Alternative end effector drive systems have also been considered when defining suitable manipulative capabilities, and attention has also been directed towards a remotely controlled tool changing capability. (author)

  3. Estimating the National Carbon Abatement Potential of City Policies: A Data-Driven Approach

    Energy Technology Data Exchange (ETDEWEB)

    Eric O’Shaughnessy, Jenny Heeter, David Keyser, Pieter Gagnon, and Alexandra Aznar

    2016-10-01

    Cities are increasingly taking actions such as building code enforcement, urban planning, and public transit expansion to reduce emissions of carbon dioxide in their communities and municipal operations. However, many cities lack the quantitative information needed to estimate policy impacts and prioritize city actions in terms of carbon abatement potential and cost effectiveness. This report fills this research gap by providing methodologies to assess the carbon abatement potential of a variety of city actions. The methodologies are applied to an energy use data set of 23,458 cities compiled for the U.S. Department of Energy’s City Energy Profile tool. The analysis estimates the national carbon abatement potential of the most commonly implemented actions in six specific policy areas. The results of this analysis suggest that, in aggregate, cities could reduce nationwide carbon emissions by about 210 million metric tons of carbon dioxide (MMT CO2) per year in a "moderate abatement scenario" by 2035 and 480 MMT CO2/year in a "high abatement scenario" by 2035 through these common actions typically within a city’s control in the six policy areas. The aggregate carbon abatement potential of these specific areas equates to a reduction of 3%-7% relative to 2013 U.S. emissions. At the city level, the results suggest the average city could reduce carbon emissions by 7% (moderate) to 19% (high) relative to current city-level emissions. City carbon abatement potential is sensitive to national and state policies that affect the carbon intensity of electricity and transportation. Specifically, the U.S. Clean Power Plan and further renewable energy cost reductions could reduce city carbon emissions overall, helping cities achieve their carbon reduction goals.

  4. The contributions of agricultural growth to poverty reduction in Ethiopia

    African Journals Online (AJOL)

    Rahel Yilma

    in poverty is estimated indirectly. In the second ... Currently it is estimated that close to half of the ... of land nor private investors can purchase land from rural dwellers. ... (1995-2000) and extrapolating to 2020 provides a poverty incidence of 39.4. The ... return in terms of poverty reduction from a growth in farm productivity.

  5. Innovative Stormwater Quality Tools by SARA for Holistic Watershed Master Planning

    Science.gov (United States)

    Thomas, S. M.; Su, Y. C.; Hummel, P. R.

    2016-12-01

    Stormwater management strategies such as Best Management Practices (BMP) and Low-Impact Development (LID) have increasingly gained attention in urban runoff control, becoming vital to holistic watershed master plans. These strategies can help address existing water quality impairments and support regulatory compliance, as well as guide planning and management of future development when substantial population growth and urbanization is projected to occur. However, past efforts have been limited to qualitative planning due to the lack of suitable tools to conduct quantitative assessment. The San Antonio River Authority (SARA), with the assistance of Lockwood, Andrews & Newnam, Inc. (LAN) and AQUA TERRA Consultants (a division of RESPEC), developed comprehensive hydrodynamic and water quality models using the Hydrological Simulation Program-FORTRAN (HSPF) for several urban watersheds in the San Antonio River Basin. These models enabled watershed management to look at water quality issues on a more refined temporal and spatial scale than the limited monitoring data. They also provided a means to locate and quantify potential water quality impairments and evaluate the effects of mitigation measures. To support the models, a suite of software tools was developed, including: 1) the SARA Timeseries Utility Tool for managing and processing large model timeseries files, 2) the SARA Load Reduction Tool to determine load reductions needed to achieve screening levels for each modeled constituent on a sub-basin basis, and 3) the SARA Enhanced BMP Tool to determine the optimal combination of BMP types and units needed to achieve the required load reductions. Using these SARA models and tools, water quality agencies and stormwater professionals can determine the optimal combinations of BMP/LID to accomplish their goals and save substantial stormwater infrastructure and management costs. The tools can also help regulators and permittees evaluate the feasibility of achieving compliance
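
    To make the load-reduction step concrete, here is a minimal Python sketch of the kind of per-sub-basin calculation such a tool performs; the function name, constituent and numbers are illustrative assumptions, not SARA's actual interface:

```python
# Minimal sketch of a per-sub-basin load reduction calculation, assuming a
# modeled long-term average load and a screening-level target per constituent.
def required_reduction(modeled_load, screening_level):
    """Fraction by which the modeled load must drop to meet the target."""
    if modeled_load <= screening_level:
        return 0.0  # already below the screening level, no reduction needed
    return 1.0 - screening_level / modeled_load

# Illustrative total suspended solids loads (kg/yr) and a common target
sub_basins = {"SB-01": 12_500.0, "SB-02": 7_800.0, "SB-03": 21_300.0}
target = 9_000.0

for name, load in sub_basins.items():
    print(f"{name}: reduce TSS load by {required_reduction(load, target):.0%}")
```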

  6. Method for Friction Force Estimation on the Flank of Cutting Tools

    Directory of Open Access Journals (Sweden)

    Luis Huerta

    2017-01-01

    Friction forces are present in any machining process. These forces could play an important role in the dynamics of the system. In the cutting process, friction is mainly present on the rake face and the flank of the tool. Although the friction acting on the rake face has the major influence, the friction on the flank can also become important and can take part in the stability of the system. In this work, experimental identification of the friction on the flank is presented. The experimental determination was carried out by machining aluminum samples in a CNC lathe. As a result, two friction functions were obtained as a function of the cutting speed and the relative motion of the contact elements. Experiments using a worn and a new insert were carried out. Force and acceleration were recorded simultaneously and, from these results, different friction levels were observed depending on the cutting parameters, such as cutting speed, feed rate, and tool condition. Finally, a friction model for the flank friction is presented.

  7. Waste reduction possibilities for manufacturing systems in the industry 4.0

    Science.gov (United States)

    Tamás, P.; Illés, B.; Dobos, P.

    2016-11-01

    Industry 4.0 creates new possibilities for manufacturing companies' waste reduction, for example through the appearance of cyber-physical systems and the big data concept, and the spread of the Internet of Things (IoT). This paper presents in detail the fourth industrial revolution's more important achievements and tools. In addition, numerous new research directions in connection with the waste reduction possibilities of manufacturing systems are outlined.

  8. MFV Reductions of MSSM Parameter Space

    CERN Document Server

    AbdusSalam, S.S.; Quevedo, F.

    2015-01-01

    The 100+ free parameters of the minimal supersymmetric standard model (MSSM) make it computationally difficult to compare systematically with data, motivating the study of specific parameter reductions such as the cMSSM and pMSSM. Here we instead study the reductions of parameter space implied by using minimal flavour violation (MFV) to organise the R-parity conserving MSSM, with a view towards systematically building in constraints on flavour-violating physics. Within this framework the space of parameters is reduced by expanding soft supersymmetry-breaking terms in powers of the Cabibbo angle, leading to a 24-, 30- or 42-parameter framework (which we call MSSM-24, MSSM-30, and MSSM-42 respectively), depending on the order kept in the expansion. We provide a Bayesian global fit to data of the MSSM-30 parameter set to show that this is manageable with current tools. We compare the MFV reductions to the 19-parameter pMSSM choice and show that the pMSSM is not contained as a subset. The MSSM-30 analysis favours...

  9. The efficiency of modified jackknife and ridge type regression estimators: a comparison

    Directory of Open Access Journals (Sweden)

    Sharad Damodar Gore

    2008-09-01

    A common problem in multiple regression models is multicollinearity, which produces undesirable effects on the least squares estimator. To circumvent this problem, two well-known estimation procedures are often suggested in the literature: Generalized Ridge Regression (GRR) estimation, suggested by Hoerl and Kennard, and Jackknifed Ridge Regression (JRR) estimation, suggested by Singh et al. GRR estimation leads to a reduction in the sampling variance, whereas JRR leads to a reduction in the bias. In this paper, we propose a new estimator, the Modified Jackknife Ridge Regression (MJR) estimator. It is based on a criterion that combines the ideas underlying both the GRR and JRR estimators. We have investigated the standard properties of this new estimator. From a simulation study, we find that the new estimator often outperforms the LASSO, and that it is superior to both the GRR and JRR estimators under the mean squared error criterion. The conditions under which the MJR estimator is better than the other two competing estimators have been investigated.
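
    As a concrete illustration of the estimators being compared, the following Python sketch implements ordinary ridge regression (the scalar-k special case of GRR) and the standard jackknifed bias correction on collinear toy data; the MJR estimator combines these ideas, but its exact form is defined in the paper and is not reproduced here:

```python
import numpy as np

def ridge(X, y, k):
    """Ridge estimator (scalar-k special case of GRR): (X'X + kI)^-1 X'y."""
    A = X.T @ X + k * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)

def jackknifed_ridge(X, y, k):
    """Jackknifed (almost unbiased) ridge: beta_JRR = (I + k A^-1) beta_R."""
    A = X.T @ X + k * np.eye(X.shape[1])
    beta_r = np.linalg.solve(A, X.T @ y)
    return beta_r + k * np.linalg.solve(A, beta_r)  # adds back most of the bias

# Collinear toy data: the second regressor is nearly a copy of the first
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=200)

for k in (0.1, 1.0):
    print(k, ridge(X, y, k).round(2), jackknifed_ridge(X, y, k).round(2))
```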

  10. Ex-vessel Fish Price Database: Disaggregating Prices for Low-Priced Species from Reduction Fisheries

    Directory of Open Access Journals (Sweden)

    Travis C. Tai

    2017-11-01

    Ex-vessel fish prices are essential for comprehensive fisheries management and socioeconomic analyses for fisheries science. In this paper, we reconstructed a global ex-vessel price database with the following areas of improvement: (1) compiling reported prices explicitly listed as “for reduction to fishmeal and fish oil” to estimate prices separately for catches destined for fishmeal and fish oil production, and other non-direct human consumption purposes; (2) including 95% confidence limit estimates for each price estimation; and (3) increasing the number of input data and the number of price estimates to match the reconstructed Sea Around Us catch database. Our primary focus was to address the first area of improvement, as ex-vessel prices for catches destined for non-direct human consumption purposes were substantially overestimated, notably in countries with large reduction fisheries. For example, in Peru, 2010 landed values were estimated as 3.8 billion real 2010 USD when using separate prices for reduction fisheries, compared with 5.8 billion using previous methods with only one price for all end-products. This update of the price database has significant global and country-specific impacts on fisheries price and landed value trends over time.
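
    The effect of disaggregating prices can be seen with a toy calculation; all quantities and prices below are invented for illustration and are not values from the Sea Around Us database:

```python
# Invented example: a country where most landings go to fishmeal/fish oil
catch_dhc_t = 500_000          # tonnes for direct human consumption
catch_reduction_t = 6_000_000  # tonnes destined for reduction fisheries

price_dhc = 900.0        # USD/tonne for human-consumption end products
price_reduction = 150.0  # USD/tonne for reduction landings

# Previous approach: one price applied to the whole catch
blended_price = 0.5 * (price_dhc + price_reduction)
value_one_price = (catch_dhc_t + catch_reduction_t) * blended_price

# Updated approach: a separate, lower price for reduction landings
value_split = catch_dhc_t * price_dhc + catch_reduction_t * price_reduction

print(f"one blended price: {value_one_price / 1e9:.2f} bn USD")
print(f"separate prices:   {value_split / 1e9:.2f} bn USD")
```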

  11. Cost reduction study for the LANL KrF laser-driven LMF design

    International Nuclear Information System (INIS)

    1989-01-01

    This report is in fulfillment of the deliverable requirements for the optical components portions of the LANL-KrF Laser-Driven LMF Design Cost Reduction Study. This report examines the future cost reductions that may accrue through the use of mass production, innovative manufacturing techniques, and new materials. Results are based on data collection and survey of optical component manufacturers, BDM experience, and existing cost models. These data provide a good representation of current methods and technologies from which future estimates can be made. From these data, a series of scaling relationships were developed to project future costs for a selected set of technologies. The scaling relationships are sensitive to cost driving parameters such as size and surface figure requirements as well as quantity requirements, production rate, materials, and manufacturing processes. In addition to the scaling relationships, descriptions of the selected processes were developed along with graphical representations of the processes. This report provides a useful tool in projecting the costs of advanced laser concepts at the component level of detail. A mix of the most diverse yet comparable technologies was chosen for this study. This yielded a useful, yet manageable number of variables to examine. The study has resulted in a first-order cost model which predicts the relative cost behavior of optical components within different variable constraints

  12. Development of a smart city planning support tool using the cooperative method

    Directory of Open Access Journals (Sweden)

    Takeshi Kobayashi

    2015-12-01

    A reduction of environmental burdens is currently required. In particular, proposing a new approach for the construction of a smart city using renewable energy is important. The technological development of a smart city is founded on building equipment and infrastructure. However, planning methods and techniques based on a collaborative approach with residents are only just developing. This study aimed to develop a support tool for the construction of a smart city using renewable energy while facilitating consensus-building among residents, drawing on methods for cooperative housing development. We organized the supporting methods for the construction of residential areas using the cooperative method. Then, we developed supporting tools that interface the computer with these methods. We examined the support techniques for the construction of a residential area using renewable energy technology by analyzing Japanese smart city cases. Moreover, we developed a support tool for the construction of a smart city on a trial basis. We integrated the smart city construction tools and the cooperative housing construction support tool. This tool has a 3D modeling system that helps residents easily understand the spatial image resulting from the examination. We also developed a professional support tool with which residents can consider the cost-effectiveness of renewable energy and its environmental load reduction rate when planning a smart city.

  13. Long term fuel price elasticity: effects on mobility tool ownership and residential location choice - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Erath, A.; Axhausen, K. W.

    2010-04-15

    This comprehensive final report for the Swiss Federal Office of Energy (SFOE) examines the long-term effects of fuel price elasticity. The study analyses how mobility tool usage and ownership as well as residential location choice are affected by rising fuel costs. Based on econometric models, long-term fuel price elasticity is derived. The authors note that the main observed demand reactions to higher fuel prices are a reduction of mileage and the consideration of smaller-engined and diesel-driven cars. As cars with natural gas powered engines and electric drives were hardly considered in the survey, the results of the natural gas model can, according to the authors, only serve as a trend. No stable model could be estimated for the demand and usage of electric cars. A literature overview is presented and the design of the survey is discussed, whereby socio-demographic variables and the effects of price and residence changes are considered. Modelling of mobility tool factors and the results obtained are examined. Finally, residence choice factors are modelled and discussed. Several appendices complete the report.

  14. PV O&M Cost Model and Cost Reduction

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Andy

    2017-03-15

    This is a presentation on the PV O&M cost model and cost reduction for the annual Photovoltaic Reliability Workshop (2017), covering the estimation of PV O&M costs, polynomial expansion, and the implementation of Net Present Value (NPV) and a reserve account in cost models.
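
    The NPV and reserve-account mechanics mentioned above can be illustrated with a short Python sketch; the discount rate, O&M costs and replacement schedule are invented for illustration, not the presentation's figures:

```python
# Sketch: discount an annual PV O&M cost stream to a net present value and
# size a level reserve contribution with the same NPV (an annuity payment).
def npv(cashflows, rate):
    """Net present value of cashflows occurring in years 1..n."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(cashflows, start=1))

years = 25
base_om = 15.0              # $/kW-yr routine O&M (illustrative)
replacements = {12: 120.0}  # inverter replacement, $/kW in year 12

costs = [base_om + replacements.get(t, 0.0) for t in range(1, years + 1)]
rate = 0.06
total = npv(costs, rate)

annuity_factor = (1 - (1 + rate) ** -years) / rate
reserve = total / annuity_factor  # level annual payment funding the costs
print(f"NPV of O&M: {total:.0f} $/kW; level reserve: {reserve:.1f} $/kW-yr")
```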

  15. State-of-the-Art for Hygrothermal Simulation Tools

    Energy Technology Data Exchange (ETDEWEB)

    Boudreaux, Philip R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); New, Joshua Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shrestha, Som S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Adams, Mark B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pallin, Simon B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-01

    The hygrothermal (heat and moisture) performance of buildings can be assessed by utilizing simulation tools. There are currently a number of hygrothermal calculation tools available which vary in their degree of sophistication and runtime requirements. This report investigates three of the most commonly used models (WUFI, HAMT, and EMPD) to assess their limitations and potential to generate physically realistic results, in order to prioritize improvements for EnergyPlus (which uses HAMT and EMPD). The outcome of the study shows that, out of these three tools, WUFI has the greatest hygrothermal capabilities. Limitations of these tools were also assessed, including: WUFI's inability to properly account for air leakage and transfer at surface boundaries; HAMT's inability to handle air leakage, precipitation-related moisture problems, or condensation problems from high relative humidity; and multiple limitations for EMPD as a simplified method to estimate indoor temperature and humidity levels, generally not used to estimate the hygrothermal performance of the building envelope materials. In conclusion, out of the three investigated simulation tools, HAMT has the greatest modeling potential, is open source, and we have prioritized specific features that can enable EnergyPlus to model all relevant heat and moisture transfer mechanisms that impact the performance of building envelope components.

  16. Accounting for behavioral effects of increases in the carbon dioxide (CO2) tax in revenue estimation in Sweden

    International Nuclear Information System (INIS)

    Hammar, Henrik; Sjoestroem, Magnus

    2011-01-01

    In this paper we describe how behavioral responses to carbon dioxide (CO2) tax increases are accounted for in tax revenue estimation in Sweden. The rationale for developing a method for this is a mix between the CO2 tax being a primary climate policy tool aiming to reduce CO2 emissions and the sizable tax revenues it generates. - Highlights: → We develop a method for the long-run tax revenue effects of increasing the CO2 tax in Sweden. → We use long-run price elasticities as the basis for calculating the long-run effects. → The CO2 tax is the primary instrument to reduce CO2 emissions from sectors outside the EU ETS. → There is almost an exact correlation between fossil energy use and fossil CO2 emissions. → The method provides consistent estimates of emission reductions following from CO2 tax increases.
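
    A minimal sketch of the kind of elasticity-based adjustment described here follows; the elasticity, prices and quantities are illustrative assumptions, not the Swedish figures:

```python
# Long-run revenue from a CO2 tax increase with a constant-elasticity
# demand response: Q1 = Q0 * (P1 / P0) ** elasticity.
def long_run_revenue(q0, p0, p1, tax, elasticity):
    q1 = q0 * (p1 / p0) ** elasticity
    return tax * q1, q1

q0 = 10e9            # litres of fuel per year before the increase
p0, p1 = 1.50, 1.65  # consumer price per litre before/after the increase
tax = 0.40           # CO2 tax component per litre after the increase

static = tax * q0    # naive estimate ignoring the behavioral response
revenue, q1 = long_run_revenue(q0, p0, p1, tax, elasticity=-0.8)
print(f"static: {static / 1e9:.2f} bn, behavioral: {revenue / 1e9:.2f} bn "
      f"(demand falls {1 - q1 / q0:.1%})")
```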

  17. EINSTEIN - Expert system for an Intelligent Supply of Thermal Energy in Industry. Audit methodology and software tool

    Energy Technology Data Exchange (ETDEWEB)

    Schweiger, Hans; Danov, Stoyan (energyXperts.NET (Spain)); Vannoni, Claudia; Facci, Enrico (Sapienza Univ. of Rome, Dept. of Mechanics and Aeronautics, Rome (Italy)); Brunner, Christoph; Slawitsch, Bettina (Joanneum Research, Inst. of Sustainable Techniques and Systems - JOINTS, Graz (Austria))

    2009-07-01

    For optimising thermal energy supply in industry, a holistic integral approach is required that includes possibilities of demand reduction by heat recovery and process integration, and by an intelligent combination of efficient heat and cold supply technologies. EINSTEIN is a tool-kit for fast and high quality thermal energy audits in industry, composed of an audit guide describing the methodology and a software tool that guides the auditor through all the audit steps. The main features of EINSTEIN are: (1) a basic questionnaire helps with systematic collection of the necessary information, with the possibility to acquire data remotely; (2) special tools allow for fast consistency checking and estimation of missing data, so that already with very few data some first predictions can be made; (3) the data processing is based on standardised models for industrial processes and industrial heat supply systems; (4) semi-automation: the software tool supports decision making for the generation of alternative heat and cold supply proposals, carries out automatically all the necessary calculations, including dynamic simulation of the heat supply system, and creates a standard audit report. The software tool includes modules for benchmarking, automatic design of heat exchanger networks, and design assistants for the heat and cold supply system. The core of the expert system software tool is available for free, as an open source software project. This type of software development has shown to be very efficient for dissemination of knowledge and for continuous maintenance and improvement thanks to user contributions.

  18. Receiver based PAPR reduction in OFDMA

    KAUST Repository

    Ali, Anum Z.; Al-Zahrani, Ali Y.; Al-Naffouri, Tareq Y.; Naguib, Ayman F.

    2014-01-01

    High peak-to-average power ratio is one of the major drawbacks of orthogonal frequency division multiplexing (OFDM). Clipping is the simplest peak reduction scheme, however, it requires clipping mitigation at the receiver. Recently compressed sensing has been used for clipping mitigation (by exploiting the sparse nature of clipping signal). However, clipping estimation in multi-user scenario (i.e., OFDMA) is not straightforward as clipping distortions overlap in frequency domain and one cannot distinguish between distortions from different users. In this work, a collaborative clipping removal strategy is proposed based on joint estimation of the clipping distortions from all users. Further, an effective data aided channel estimation strategy for clipped OFDM is also outlined. Simulation results are presented to justify the effectiveness of the proposed schemes. © 2014 IEEE.
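
    For background, the peak-to-average power ratio and the effect of simple amplitude clipping can be demonstrated in a few lines of Python; the QPSK/OFDM parameters are illustrative, and the compressed-sensing recovery of the clipping distortion proposed here is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 256                                                # subcarriers
X = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N)  # QPSK symbols
x = np.fft.ifft(X) * np.sqrt(N)                        # time-domain symbol

def papr_db(s):
    return 10 * np.log10(np.max(np.abs(s) ** 2) / np.mean(np.abs(s) ** 2))

# Amplitude clipping: the simplest peak-reduction scheme
clip_level = 1.5 * np.sqrt(np.mean(np.abs(x) ** 2))
scale = np.minimum(1.0, clip_level / np.maximum(np.abs(x), 1e-12))
clipped = x * scale

# The distortion c = clipped - x is sparse in time; that sparsity is what
# the compressed-sensing mitigation at the receiver exploits (not shown).
c = clipped - x
print(f"PAPR before: {papr_db(x):.1f} dB, after: {papr_db(clipped):.1f} dB, "
      f"clipped samples: {np.count_nonzero(np.abs(c) > 1e-12)}")
```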

  20. Using Bayesian Network as a tool for coastal storm flood impact prediction at Varna Bay (Bulgaria, Western Black Sea)

    Science.gov (United States)

    Valchev, Nikolay; Eftimova, Petya; Andreeva, Nataliya; Prodanov, Bogdan

    2017-04-01

    The coastal zone is among the fastest evolving areas worldwide. An ever increasing population inhabiting coastal settlements develops often conflicting economic and societal activities. The existing imbalance between the expansion of these activities, on one hand, and the potential to accommodate them in a sustainable manner, on the other, becomes a critical problem. Concurrently, coasts are affected by various hydro-meteorological phenomena such as storm surges, heavy seas, strong winds and flash floods, whose intensities and occurrence frequencies are likely to increase due to climate change. This calls for tools capable of quickly predicting the impact of those phenomena on the coast and of providing solutions in terms of disaster risk reduction measures. One such tool is the Bayesian network (BN). This paper describes the set-up of such a network for Varna Bay (Bulgaria, Western Black Sea). It relates near-shore storm conditions to their onshore flood potential and ultimately to the relevant impact, expressed as relative damage to the coastal and man-made environment. The methodology for set-up and training of the Bayesian network was developed within the RISC-KIT project (Resilience-Increasing Strategies for Coasts - toolKIT). The proposed BN reflects the interaction between boundary conditions, receptors, hazard, and consequences. Storm boundary conditions - maximum significant wave height and peak surge level - were determined on the basis of their historical and projected occurrence. The only hazard considered in this study is flooding, characterized by maximum inundation depth. The BN was trained with synthetic events created by combining estimated boundary conditions. Flood impact was modeled with the process-based morphodynamical model XBeach. Restaurants, sport and leisure facilities, administrative buildings, and car parks were introduced in the network as receptors. Consequences (impact) are estimated in terms of the relative damage caused by a given inundation depth. National depth

  1. Reduction of hexavalent chromium by fasted and fed human gastric fluid. II. Ex vivo gastric reduction modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kirman, Christopher R., E-mail: ckirman@summittoxicology.com [Summit Toxicology, Orange Village, OH, 44022 (United States); Suh, Mina, E-mail: msuh@toxstrategies.com [ToxStrategies, Inc., Mission Viejo, CA, 92692 (United States); Hays, Sean M., E-mail: shays@summittoxicology.com [Summit Toxicology, Allenspark, CO, 8040 (United States); Gürleyük, Hakan, E-mail: hakan@brooksrand.com [Brooks Applied Labs, Bothell, WA, 98011 (United States); Gerads, Russ, E-mail: russ@brooksrand.com [Brooks Applied Labs, Bothell, WA, 98011 (United States); De Flora, Silvio, E-mail: sdf@unige.it [Department of Health Sciences, University of Genoa, 16132 Genoa (Italy); Parker, William, E-mail: william.parker@duke.edu [Duke University Medical Center, Department of Surgery, Durham, NC, 27710 (United States); Lin, Shu, E-mail: shu.lin@duke.edu [Duke University Medical Center, Department of Surgery, Durham, NC, 27710 (United States); Haws, Laurie C., E-mail: lhaws@toxstrategies.com [ToxStrategies, Inc., Katy, TX, 77494 (United States); Harris, Mark A., E-mail: mharris@toxstrategies.com [ToxStrategies, Inc., Austin, TX, 78751 (United States); Proctor, Deborah M., E-mail: dproctor@toxstrategies.com [ToxStrategies, Inc., Mission Viejo, CA, 92692 (United States)

    2016-09-01

    To extend previous models of hexavalent chromium [Cr(VI)] reduction by gastric fluid (GF), ex vivo experiments were conducted to address data gaps and limitations identified with respect to (1) GF dilution in the model; (2) reduction of Cr(VI) in fed human GF samples; (3) the number of Cr(VI) reduction pools present in human GF under fed, fasted, and proton pump inhibitor (PPI)-use conditions; and (4) an appropriate form for the pH-dependence of Cr(VI) reduction rate constants. Rates and capacities of Cr(VI) reduction were characterized in gastric contents from fed and fasted volunteers, and from fasted pre-operative patients treated with PPIs. Reduction capacities were first estimated over a 4-h reduction period. Once reduction capacity was established, a dual-spike approach was used in speciated isotope dilution mass spectrometry analyses to characterize the concentration-dependence of the 2nd order reduction rate constants. These data, when combined with previously collected data, were well described by a three-pool model (pool 1 = fast reaction with low capacity; pool 2 = slow reaction with higher capacity; pool 3 = very slow reaction with higher capacity) using pH-dependent rate constants characterized by a piecewise, log-linear relationship. These data indicate that human gastric samples, like those collected from rats and mice, contain multiple pools of reducing agents, and low concentrations of Cr(VI) (< 0.7 mg/L) are reduced more rapidly than high concentrations. The data and revised modeling results herein provide improved characterization of Cr(VI) gastric reduction kinetics, critical for Cr(VI) pharmacokinetic modeling and human health risk assessment. - Highlights: • SIDMS allows for measurement of Cr(VI) reduction rate in gastric fluid ex vivo • Human gastric fluid has three reducing pools • Cr(VI) in drinking water at < 0.7 mg/L is rapidly reduced in human gastric fluid • Reduction rate is concentration- and pH-dependent • A refined PK
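
    The three-pool, second-order kinetic scheme described here can be written as a small ODE system; in the Python sketch below, the rate constants, pool capacities, and starting Cr(VI) concentration are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np
from scipy.integrate import solve_ivp

# State: [Cr(VI), pool1, pool2, pool3] in mg/L. Each pool reduces Cr(VI)
# with second-order kinetics; pH dependence of k is omitted for brevity.
k = np.array([2.0, 0.2, 0.02])  # L/mg/h: fast, slow, very slow pools

def rhs(t, y):
    cr, pools = y[0], y[1:]
    rates = k * pools * cr      # second-order in Cr(VI) and each pool
    return [-rates.sum(), *(-rates)]

y0 = [0.5, 0.3, 2.0, 5.0]       # low Cr(VI) dose and three pool capacities
sol = solve_ivp(rhs, (0.0, 4.0), y0, dense_output=True)  # 4-h window

for t in (0.5, 1.0, 2.0, 4.0):
    print(f"t = {t:>3} h, Cr(VI) remaining: {sol.sol(t)[0]:.3f} mg/L")
```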

  2. Scientific and practical tools for dealing with water resource estimations for the future

    Directory of Open Access Journals (Sweden)

    D. A. Hughes

    2015-06-01

    Future flow regimes will be different to today, and imperfect knowledge of present and future climate variations, rainfall-runoff processes and anthropogenic impacts makes them highly uncertain. Future water resources decisions will rely on practical and appropriate simulation tools that are sensitive to changes, can assimilate different types of change information and are flexible enough to accommodate improvements in understanding of change. They need to include representations of uncertainty and generate information appropriate for uncertain decision-making. This paper presents some examples of the tools that have been developed to address these issues in the southern Africa region. The examples include uncertainty in present-day simulations due to lack of understanding and data, the use of climate change projection data from multiple climate models, and future catchment responses due to both climate and development effects. The conclusions are that the tools and models are largely available, and what we need is more reliable forcing and model evaluation information as well as methods of making decisions with such inevitably uncertain information.

  3. Cost-effective reduction of fine primary particulate matter emissions in Finland

    International Nuclear Information System (INIS)

    Karvosenoja, Niko; Klimont, Zbigniew; Tohka, Antti; Johansson, Matti

    2007-01-01

    Policies to reduce adverse health impacts of fine particulate matter (PM2.5) require information on abatement measures and their associated costs. This paper explores the potential for cost-efficient control of anthropogenic primary PM2.5 emissions in Finland. Based on a Kyoto-compliant energy projection, two emission control scenarios for 2020 were developed. 'Baseline' assumes implementation of PM controls in compliance with existing legislation. 'Reduction' assumes ambitious further reductions. Emissions for 2020 were estimated at 26 and 18.6 Gg a-1 for 'Baseline' and 'Reduction', respectively. The largest abatement potential, 3.0 Gg a-1, was calculated for power plants and industrial combustion. The largest potential with marginal costs below 5000 Euro Mg(PM2.5)-1 was for domestic wood combustion, 1.7 Gg a-1. For traffic the potential was estimated at 1.0 Gg a-1, but was associated with high costs. The results from this paper are used in the policy-driven national integrated assessment modeling that explores cost-efficient reductions of the health impacts of PM

  4. Bayesian estimates of linkage disequilibrium

    Directory of Open Access Journals (Sweden)

    Abad-Grau María M

    2007-06-01

    Abstract Background The maximum likelihood estimator of D' – a standard measure of linkage disequilibrium – is biased toward disequilibrium, and the bias is particularly evident in small samples and rare haplotypes. Results This paper proposes a Bayesian estimation of D' to address this problem. The reduction of the bias is achieved by using a prior distribution on the pair-wise associations between single nucleotide polymorphisms (SNPs) that increases the likelihood of equilibrium with increasing physical distances between pairs of SNPs. We show how to compute the Bayesian estimate using a stochastic estimation based on MCMC methods, and also propose a numerical approximation to the Bayesian estimates that can be used to estimate patterns of LD in large datasets of SNPs. Conclusion Our Bayesian estimator of D' corrects the bias toward disequilibrium that affects the maximum likelihood estimator. A consequence of this feature is a more objective view about the extent of linkage disequilibrium in the human genome, and a more realistic number of tagging SNPs to fully exploit the power of genome wide association studies.
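
    For reference, the maximum likelihood D' that the Bayesian estimator corrects can be computed directly from haplotype and allele frequencies; the following sketch uses invented frequencies purely for illustration:

```python
def d_prime(p_ab, p_a, p_b):
    """Maximum likelihood D' from the haplotype frequency p_AB and allele
    frequencies p_A, p_B. In small samples this estimator is biased toward
    |D'| = 1, which is the bias the Bayesian estimator reduces."""
    d = p_ab - p_a * p_b
    if d >= 0:
        d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
    else:
        d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
    return d / d_max if d_max > 0 else 0.0

# Invented frequencies, e.g. estimated from a small sample of chromosomes
print(d_prime(p_ab=0.18, p_a=0.30, p_b=0.40))  # 0.06 / 0.18 = 0.333
```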

  5. INDUSTRIAL APPROBATION OF COMBINED COUNTERSINK-TAP TOOL

    Directory of Open Access Journals (Sweden)

    Nurulla M. Vagabov

    2017-01-01

    Abstract. Objectives Based on a critical analysis of the results of complex studies, we set out to demonstrate the advantages, as compared with existing technologies, of a developed technology that uses a new cutting scheme with a combined countersink-tap tool. Methods One way to improve the processing capacity, tool life and quality of a cut thread is to reduce the torque and strain hardening of the processed material by employing a new cutting approach to completely eliminate the friction of the lateral sides of the tooth on the surface of the cut thread. It was necessary for this technology to be checked in real production conditions. Results The conducted production tests of a combined countersink-tap tool with the new cutting scheme developed by the inventors have shown that, as a result of a significant reduction in the torque and a decrease in the strain hardening of the processed material, it is possible to increase the cutting speed and increase labour productivity by more than 2 times as compared with the thread cutting processes using taps with staggered teeth, 1.2 times as compared to taps with a corrected structure, and more than 6 times as compared to standard taps. At the same time, the stability of the tool is increased 3-5 times and the number of breakages is also sharply reduced. Conclusion It has been established that the accuracy of the geometric parameters as well as the strength and quality of the thread surface cut by the combined countersink-tap tool with the new cutting scheme in hard-to-work materials is much higher than the same thread parameters obtained by processing with standard and other known taps. The studies also indicated its high reliability, operability and expediency of application for processing the above-mentioned materials. The advantages of the combined tool also include a reduction in thread cutting time as compared to a separate machining of the threaded hole (countersinking with a standard

  6. Assessment of Energy Efficiency Improvement and CO2 Emission Reduction Potentials in India's Cement Industry

    Energy Technology Data Exchange (ETDEWEB)

    Morrow, III, William R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hasanbeigi, Ali [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-12-03

    India’s cement industry is the second largest in the world behind China, with annual cement production of 168 Mt in 2010, which accounted for slightly more than six percent of the world’s annual cement production in the same year. To produce that amount of cement, the industry consumed roughly 700 PJ of fuel and 14.7 TWh of electricity. We identified and analyzed 22 energy efficiency technologies and measures applicable to the processes in the Indian cement industry. The Conservation Supply Curve (CSC) used in this study is an analytical tool that captures both the engineering and the economic perspectives of energy conservation. Using a bottom-up electricity CSC model and comparing to an electricity price forecast, the cumulative cost-effective plant-level electricity savings potential for the Indian cement industry for 2010-2030 is estimated to be 83 TWh, and the cumulative plant-level technical electricity saving potential is 89 TWh during the same period. The grid-level CO2 emissions reduction associated with cost-effective electricity savings is 82 Mt CO2, and the grid-level CO2 emission reduction associated with the technical electricity saving potential is 88 Mt CO2. Compared to a fuel price forecast, an estimated cumulative cost-effective fuel savings potential of 1,029 PJ, with an associated CO2 emission reduction of 97 Mt CO2, is possible during 2010-2030. In addition, a sensitivity analysis with respect to the discount rate used is conducted to assess the effect of changes in this parameter on the results. The result of this study gives a comprehensive and easy-to-understand perspective to the Indian cement industry and policy makers about the energy efficiency potential and its associated cost over the next twenty years.
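
    The mechanics of a bottom-up conservation supply curve can be sketched in a few lines: annualize each measure's capital cost, rank measures by cost of conserved energy (CCE), and compare against a price forecast. The measures, costs, and savings below are invented for illustration and are not the study's inputs:

```python
def crf(d, n):
    """Capital recovery factor used to annualize an investment over n years."""
    return d / (1 - (1 + d) ** -n)

measures = [  # (name, capital cost $, annual kWh saved, lifetime yr)
    ("high-efficiency classifiers", 120_000, 900_000, 15),
    ("VFDs on fan systems", 80_000, 500_000, 10),
    ("improved grinding media", 40_000, 150_000, 8),
]

electricity_price = 0.07  # $/kWh forecast (illustrative)
ranked = sorted((cost * crf(0.10, life) / kwh, name, kwh)
                for name, cost, kwh, life in measures)

cumulative = 0.0
for cce, name, kwh in ranked:  # cheapest conserved kWh first
    cumulative += kwh
    flag = "cost-effective" if cce < electricity_price else "technical only"
    print(f"{name:30s} CCE = {cce:.3f} $/kWh, "
          f"cumulative = {cumulative / 1e6:.2f} GWh, {flag}")
```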

  7. LDR: A Package for Likelihood-Based Sufficient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    R. Dennis Cook

    2011-03-01

    We introduce a new MATLAB software package that implements several recently proposed likelihood-based methods for sufficient dimension reduction. Current capabilities include estimation of reduced subspaces with a fixed dimension d, as well as estimation of d by use of likelihood-ratio testing, permutation testing and information criteria. The methods are suitable for preprocessing data for both regression and classification. Implementations of related estimators are also available. Although the software is more oriented to command-line operation, a graphical user interface is also provided for prototype computations.

  8. Adaptive noise reduction circuit for a sound reproduction system

    Science.gov (United States)

    Engebretson, A. Maynard (Inventor); O'Connell, Michael P. (Inventor)

    1995-01-01

    A noise reduction circuit for a hearing aid having an adaptive filter for producing a signal which estimates the noise components present in an input signal. The circuit includes a second filter for receiving the noise-estimating signal and modifying it as a function of a user's preference or as a function of an expected noise environment. The circuit also includes a gain control for adjusting the magnitude of the modified noise-estimating signal, thereby allowing for the adjustment of the magnitude of the circuit response. The circuit also includes a signal combiner for combining the input signal with the adjusted noise-estimating signal to produce a noise reduced output signal.
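
    The patent describes an adaptive filter that estimates the noise, a second filter and gain stage that modify the estimate, and a combiner that subtracts the adjusted estimate from the input. A minimal Python sketch of such an adaptive noise canceller follows, using an LMS weight update as one common adaptation rule (an assumption here, not something the patent specifies):

```python
import numpy as np

def adaptive_noise_reduction(primary, noise_ref, taps=16, mu=0.01, gain=1.0):
    """Adaptive filter estimates the noise in `primary` from a reference
    input; the estimate, scaled by a user gain, is subtracted by a combiner."""
    w = np.zeros(taps)
    out = np.zeros_like(primary)
    for n in range(taps, len(primary)):
        x = noise_ref[n - taps:n][::-1]  # most recent reference samples first
        noise_est = w @ x                # adaptive noise estimate
        e = primary[n] - noise_est       # error drives the adaptation
        w += 2 * mu * e * x              # LMS weight update
        out[n] = primary[n] - gain * noise_est
    return out

# Demo: a tone buried in noise that the reference observes directly
rng = np.random.default_rng(2)
noise = rng.normal(size=4000)
signal = np.sin(0.05 * np.arange(4000))
primary = signal + np.convolve(noise, [0.7, 0.2], mode="same")
cleaned = adaptive_noise_reduction(primary, noise)
print(f"residual noise power: {np.mean((cleaned[200:] - signal[200:])**2):.3f}")
```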

  9. Side-by-side ANFIS as a useful tool for estimating correlated thermophysical properties

    Science.gov (United States)

    Grieu, Stéphane; Faugeroux, Olivier; Traoré, Adama; Claudet, Bernard; Bodnar, Jean-Luc

    2015-12-01

    In the present paper, an artificial intelligence-based approach dealing with the estimation of correlated thermophysical properties is designed and evaluated. This new and "intelligent" approach makes use of photothermal responses obtained when homogeneous materials are subjected to a light flux. Commonly, gradient-based algorithms are used as parameter estimation techniques. Unfortunately, such algorithms show instabilities leading to non-convergence in case of correlated properties to be estimated from a rebuilt impulse response. So, the main objective of the present work was to simultaneously estimate both the thermal diffusivity and conductivity of homogeneous materials, from front-face or rear-face photothermal responses to pseudo random binary signals. To this end, we used side-by-side neuro-fuzzy systems (adaptive network-based fuzzy inference systems) trained with a hybrid algorithm. We focused on the impact on generalization of both the examples used during training and the fuzzification process. In addition, computation time was a key point to consider. That is why the developed algorithm is computationally tractable and allows both the thermal diffusivity and conductivity of homogeneous materials to be simultaneously estimated with very good accuracy (the generalization error ranges between 4.6% and 6.2%).

  10. Development of remote handling tools for glove box

    International Nuclear Information System (INIS)

    Tomita, Yutaka; Nemoto, Takeshi; Denuma, Akio; Todokoro, Akio

    1996-01-01

    As part of advanced nuclear fuel recycling technology development, we will separate and recover americium from MOX fuel scrap by solvent extraction. When carrying out this work, reduction of exposure to americium-241 is one of the important problems. To solve this problem fundamentally, we studied a multi-joint type of remote handling tool for glove boxes and produced a prototype machine. We also carried out basic functional tests of it. As a result, we obtained good prospects for the development of remote handling tools that can handle americium in a glove box. (author)

  11. Efficacy of US-guided Hydrostatic Reduction in Children with Intussusception

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Min; Chung, Tae Woong; Yoon, Woong; Chang, Nam Kyu; Heo, Suk Hee; Shin, Sang Soo; Lim, Hyo Sun; Jeong, Yong Yeon; Kang, Heoung Keun [Chonnam National University Hospital, Gwangju (Korea, Republic of)

    2007-09-15

    To assess the success rate and efficacy of US-guided hydrostatic reduction in children with intussusception. We retrospectively evaluated the ultrasonographic findings and clinical features of 121 children (M:F = 80:41, mean age 18 months) who underwent US-guided hydrostatic reduction between November 2002 and February 2007 for the diagnosis and treatment of intussusception. The 121 patients underwent 147 procedures, including recurrent cases. Successful reduction was achieved in 132 cases (89.8% success rate), as confirmed by post-procedure ultrasonography and clinical findings. Emergency operations were performed in the 10 (6.8%) cases of irreducible intussusception, 8 of ileocolic type and 2 of ileoileal type. Perforation occurred in 4 cases (2.7%), and seizure in 1 case (0.7%) during the procedure. US-guided hydrostatic reduction is a safe and effective tool for the diagnosis and treatment of pediatric intussusception

  12. Cost-effectiveness of reduction of off-site dose

    International Nuclear Information System (INIS)

    McGrath, J.J.; Macphee, R.; Arbeau, N.; Miskin, J.; Scott, C.K.; Winters, E.

    1988-03-01

    Since the early 1970s, nuclear power plants have been designed and operated with a target of not releasing more than one percent of the licensed limits (derived emission limits) in liquid and gaseous effluents. The AECB initiated this study of the cost-effectiveness of the reduction of off-site doses as part of a review to determine if further measures to reduce off-site doses might be reasonably achievable. Atlantic Nuclear has estimated the cost of existing technology options that can be applied for a further reduction of radioactive effluents from future CANDU nuclear power plants. Detritiation, filtration, ion exchange and evaporation are included in the assessment. The costs are presented in 1987 Canadian dollars, and include capital and operating costs for a reference 50-year plant life. Darlington NGS and Point Lepreau NGS are the reference nuclear power plant types and locations. The effect resulting from the hypothetical application of each technology has been calculated as the resulting reduction in world collective radiation dose detriment. The CSA N288.1 procedure was used for local pathway analysis and the global dispersion model developed by the NEA (OECD) group of experts was used for dose calculations. The reduction in the 'collective effective dose equivalent commitment' was assumed to exist for 10,000 years, the expected life-span of solid waste repositories. No attempt was made to model world population dynamics. The collective dose reductions were calculated for a nominal world population of 10 billion persons. The estimated cost and effect of applying the technology options are summarized in tabular form for input to further consideration of 'reasonably achievable off-site dose levels'

  13. POTENTIAL HEALTH RISK REDUCTION ARISING FROM REDUCED MERCURY EMISSIONS FROM COAL FIRED POWER PLANTS.

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T. M.; Lipfert, F. W.; Morris, S. C.; Moskowitz, P. D.

    2001-09-01

    The U.S. Environmental Protection Agency (EPA) has announced plans to regulate mercury (Hg) emissions from coal-fired power plants. EPA has not prepared a quantitative assessment of the reduction in risk that could be achieved through reduction in coal plant emissions of Hg. To address this issue, Brookhaven National Laboratory (BNL) with support from the U.S. Department of Energy Office of Fossil Energy (DOE FE) prepared a quantitative assessment of the reduction in human health risk that could be achieved through reduction in coal plant emissions of Hg. The primary pathway for Hg exposure is through consumption of fish. The population most susceptible to Hg exposure is the fetus. Therefore the risk assessment focused on consumption of fish by women of child-bearing age. Dose-response factors were generated from studies on loss of cognitive abilities (language skills, motor skills, etc.) by young children whose mothers consumed large amounts of fish with high Hg levels. Population risks were estimated for the general population in three regions of the country (the Midwest, Northeast, and Southeast) that were identified by EPA as being heavily impacted by coal emissions. Three scenarios for reducing Hg emissions from coal plants were considered: (1) a base case using current conditions; (2) a 50% reduction; and (3) a 90% reduction. These reductions in emissions were assumed to translate linearly into a reduction in fish Hg levels of 8.6% and 15.5%, respectively. Population risk estimates were also calculated for two subsistence fisher populations. These groups of people consume substantially more fish than the general public and, depending on location, the fish may contain higher Hg levels than average. Risk estimates for these groups were calculated for the three Hg levels used for the general population analyses. Analysis shows that the general population risks for exposure of the fetus to Hg are small. Estimated risks under current conditions (i.e., no

  14. Assessing Sediment Yield and the Effect of Best Management Practices on Sediment Yield Reduction for Tutuila Island, American Samoa

    Science.gov (United States)

    Leta, O. T.; Dulai, H.; El-Kadi, A. I.

    2017-12-01

    Upland soil erosion and sedimentation are the main threats for riparian and coastal reef ecosystems in Pacific islands. Here, due to the small size of the watersheds and their steep slopes, the residence time of rainfall runoff and its suspended load is short. Fagaalu Bay, located on the island of Tutuila (American Samoa), has been identified as a priority watershed due to degraded coral reef condition and reduction of stream water quality from heavy anthropogenic activity yielding high nutrient and sediment loads to the receiving water bodies. This study aimed to estimate the sediment yield to the Fagaalu stream and assess the impact of Best Management Practices (BMP) on sediment yield reduction. For this, the Soil and Water Assessment Tool (SWAT) model was applied, calibrated, and validated for both daily streamflow and sediment load simulation. The model also estimated the sediment yield contributions from existing land use types of Fagaalu and identified soil erosion prone areas for introducing BMP scenarios in the watershed. Then, three BMP scenarios, namely stone bund, retention pond, and filter strip, were applied to bare (quarry area), agricultural, and shrub land use types. It was found that the bare land with quarry activity yielded the highest annual average sediment yield of 133 tonnes per hectare (t ha-1), followed by agriculture (26.1 t ha-1), while the lowest sediment yield of 0.2 t ha-1 was estimated for the forested part of the watershed. Additionally, the bare land area (2 ha) contributed approximately 65% (207 ha) of the watershed's sediment yield, which is 4.0 t ha-1. The latter signifies the high impact as well as contribution of anthropogenic activity on sediment yield. The use of different BMP scenarios generally reduced the sediment yield to the coastal reef of Fagaalu watershed. However, treating the quarry activity area with stone bund showed the highest sediment yield reduction as compared to the other two BMP scenarios. This study provides an estimate

  15. Development and experimental validation of a tool to determine out-of-field dose in radiotherapy

    International Nuclear Information System (INIS)

    Bessieres, I.

    2013-01-01

    Over the last two decades, many technical developments have been achieved in intensity modulated radiotherapy (IMRT); they allow a better conformation of the dose to the tumor and consequently increase the success of cancer treatments. These techniques often reduce the dose to organs at risk close to the target volume; nevertheless, they increase peripheral dose levels. In this situation, the rise in survival rates also increases the probability that secondary effects caused by peripheral dose deposition (second cancers, for instance) will be expressed. Nowadays, the peripheral dose is not taken into account during treatment planning and no reliable prediction tool exists. However, it is becoming crucial to consider the peripheral dose during planning, especially for pediatric cases. Many steps in the development of an accurate and fast Monte Carlo out-of-field dose prediction tool based on the PENELOPE code were achieved during this PhD work. To this end, we demonstrated the ability of the PENELOPE code to estimate the peripheral dose by comparing its results with reference measurements performed on two experimental configurations (metrological and pre-clinical). During this experimental work, we defined a protocol for low-dose measurement with OSL dosimeters. In parallel, we highlighted the slow convergence of the code for clinical use. Consequently, we accelerated the code by implementing a new variance reduction technique called pseudo-deterministic transport, aimed specifically at improving calculations in areas far away from the beam. This step improved the efficiency of peripheral dose estimation in both validation configurations (by a factor of 20), in order to reach reasonable computing times for clinical application. Further optimization work must be carried out to improve the convergence of the tool before a final clinical use. (author)

  16. Per-pack price reductions available from different cigarette purchasing strategies: United States, 2009-2010.

    Science.gov (United States)

    Pesko, Michael F; Xu, Xin; Tynan, Michael A; Gerzoff, Robert B; Malarcher, Ann M; Pechacek, Terry F

    2014-06-01

    Following cigarette excise tax increases, smokers may use cigarette price minimization strategies to continue their usual cigarette consumption rather than reducing consumption or quitting. This reduces the public health benefits of the tax increase. This paper estimates the price reductions for a wide range of strategies, compensating for overlapping strategies. We performed regression analysis on the 2009-2010 National Adult Tobacco Survey (N=13,394) to explore the price reductions that smokers in the United States obtained from purchasing cigarettes. We examined five cigarette price minimization strategies: 1) purchasing discount brand cigarettes, 2) using price promotions, 3) purchasing cartons, 4) purchasing on Indian reservations, and 5) purchasing online. Price reductions from these strategies were estimated jointly to compensate for overlapping strategies. Each strategy provided price reductions between 26 and 99 cents per pack. Combined price reductions were possible. Additionally, price promotions were used with regular brands to obtain larger price reductions than when price promotions were used with generic brands. Smokers can realize large price reductions from price minimization strategies, and there are many strategies available. Policymakers and public health officials should be aware of the extent to which these strategies can reduce cigarette prices. Published by Elsevier Inc.

  17. Toolset for evaluating infrared countermeasures and signature reduction for ships

    NARCIS (Netherlands)

    Schleijpen, H.M.A.

    2010-01-01

    The protection of ships against infrared guided missiles is a concern for modern naval forces. The vulnerability of ships can be reduced by applying countermeasures such as infrared decoys and infrared signature reduction. This paper presents a set of simulation tools which can be used for assessing

  18. Damage severity estimation from the global stiffness decrease

    International Nuclear Information System (INIS)

    Nitescu, C; Gillich, G R; Manescu, T; Korka, Z I; Abdel Wahab, M

    2017-01-01

    In current damage detection methods, localization and severity estimation can be treated separately. The severity is commonly estimated using a fracture mechanics approach, with the main disadvantage of involving empirically deduced relations. In this paper, a damage severity estimator based on the global stiffness reduction is proposed. This feature is computed from the deflections of the intact and damaged beam, respectively. The damage has its greatest effect when located where the bending moment achieves its maxima. If the damage is positioned elsewhere on the beam, its effect becomes lower, because the stress is produced by a diminished bending moment. It is shown that the global stiffness reduction produced by a crack is the same for all beams with a similar cross-section, regardless of the boundary conditions. Two mathematical relations are proposed: one indicating the severity and another indicating the effect of removing the damage from the beam. Measurements on damaged beams with different boundary conditions and cross-sections are carried out, and the location and severity are found using the proposed relations. These comparisons prove that the proposed approach can be used to accurately compute the severity estimator. (paper)
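
    A minimal sketch of the stiffness-based severity feature, assuming deflections measured under the same load on the intact and the damaged beam; the numbers are invented, and the paper's relations additionally account for damage location via the bending moment:

```python
def severity_from_deflections(delta_healthy, delta_damaged):
    """Severity from the global stiffness decrease. For the same load F,
    stiffness k = F / delta, so the relative stiffness reduction needs only
    the two measured deflections: gamma = 1 - k_d / k_h = 1 - d_h / d_d."""
    return 1.0 - delta_healthy / delta_damaged

# Example: a crack increases the midspan deflection from 2.00 mm to 2.15 mm
print(f"severity estimator: {severity_from_deflections(2.00, 2.15):.3f}")
```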

  19. Iterative PSF Estimation and Its Application to Shift Invariant and Variant Blur Reduction

    OpenAIRE

    Seung-Won Jung; Byeong-Doo Choi; Sung-Jea Ko

    2009-01-01

    Among image restoration approaches, image deconvolution has been considered a powerful solution. In image deconvolution, a point spread function (PSF), which describes the blur of the image, needs to be determined. Therefore, in this paper, we propose an iterative PSF estimation algorithm which is able to estimate an accurate PSF. In real-world motion-blurred images, a simple parametric model of the PSF fails when a camera moves in an arbitrary direction with an inconsistent speed during an e...

  20. Directions of the US Geological Survey Landslide Hazards Reduction Program

    Science.gov (United States)

    Wieczorek, G.F.

    1993-01-01

    The US Geological Survey (USGS) Landslide Hazards Reduction Program includes studies of landslide process and prediction, landslide susceptibility and risk mapping, landslide recurrence and slope evolution, and research application and technology transfer. Studies of landslide processes have recently been conducted in Virginia, Utah, California, Alaska, and Hawaii. Landslide susceptibility maps provide a very important tool for landslide hazard reduction. The effects of engineering-geologic characteristics of rocks, seismic activity, and short- and long-term climatic change on landslide recurrence are under study. Detailed measurement of movement and deformation has begun on some active landslides. -from Author

  1. Tools for Genetic Studies in Experimental Populations of Polyploids

    Directory of Open Access Journals (Sweden)

    Peter M. Bourke

    2018-04-01

    Full Text Available Polyploid organisms carry more than two copies of each chromosome, a condition rarely tolerated in animals but which occurs relatively frequently in the plant kingdom. One of the principal challenges faced by polyploid organisms is to evolve stable meiotic mechanisms to faithfully transmit genetic information to the next generation upon which the study of inheritance is based. In this review we look at the tools available to the research community to better understand polyploid inheritance, many of which have only recently been developed. Most of these tools are intended for experimental populations (rather than natural populations), facilitating genomics-assisted crop improvement and plant breeding. This is hardly surprising given that a large proportion of domesticated plant species are polyploid. We focus on three main areas: (1) polyploid genotyping; (2) genetic and physical mapping; and (3) quantitative trait analysis and genomic selection. We also briefly review some miscellaneous topics such as the mode of inheritance and the availability of polyploid simulation software. The current polyploid analytic toolbox includes software for assigning marker genotypes (and in particular, estimating the dosage of marker alleles in the heterozygous condition), establishing chromosome-scale linkage phase among marker alleles, constructing (short-range) haplotypes, generating linkage maps, performing genome-wide association studies (GWAS) and quantitative trait locus (QTL) analyses, and simulating polyploid populations. These tools can also help elucidate the mode of inheritance (disomic, polysomic or a mixture of both as in segmental allopolyploids) or reveal whether double reduction and multivalent chromosomal pairing occur. An increasing number of polyploids (or associated diploids) are being sequenced, leading to publicly available reference genome assemblies. Much work remains in order to keep pace with developments in genomic technologies. However, such

  2. Tools for Genetic Studies in Experimental Populations of Polyploids.

    Science.gov (United States)

    Bourke, Peter M; Voorrips, Roeland E; Visser, Richard G F; Maliepaard, Chris

    2018-01-01

    Polyploid organisms carry more than two copies of each chromosome, a condition rarely tolerated in animals but which occurs relatively frequently in the plant kingdom. One of the principal challenges faced by polyploid organisms is to evolve stable meiotic mechanisms to faithfully transmit genetic information to the next generation upon which the study of inheritance is based. In this review we look at the tools available to the research community to better understand polyploid inheritance, many of which have only recently been developed. Most of these tools are intended for experimental populations (rather than natural populations), facilitating genomics-assisted crop improvement and plant breeding. This is hardly surprising given that a large proportion of domesticated plant species are polyploid. We focus on three main areas: (1) polyploid genotyping; (2) genetic and physical mapping; and (3) quantitative trait analysis and genomic selection. We also briefly review some miscellaneous topics such as the mode of inheritance and the availability of polyploid simulation software. The current polyploid analytic toolbox includes software for assigning marker genotypes (and in particular, estimating the dosage of marker alleles in the heterozygous condition), establishing chromosome-scale linkage phase among marker alleles, constructing (short-range) haplotypes, generating linkage maps, performing genome-wide association studies (GWAS) and quantitative trait locus (QTL) analyses, and simulating polyploid populations. These tools can also help elucidate the mode of inheritance (disomic, polysomic or a mixture of both as in segmental allopolyploids) or reveal whether double reduction and multivalent chromosomal pairing occur. An increasing number of polyploids (or associated diploids) are being sequenced, leading to publicly available reference genome assemblies. Much work remains in order to keep pace with developments in genomic technologies. However, such technologies

  3. Filtering Methods for Error Reduction in Spacecraft Attitude Estimation Using Quaternion Star Trackers

    Science.gov (United States)

    Calhoun, Philip C.; Sedlak, Joseph E.; Superfin, Emil

    2011-01-01

    Precision attitude determination for recent and planned space missions typically includes quaternion star trackers (ST) and a three-axis inertial reference unit (IRU). Sensor selection is based on estimates of the knowledge accuracy attainable from a Kalman filter (KF), which provides the optimal solution for the case of linear dynamics with measurement and process errors characterized by random Gaussian noise with a white spectrum. Non-Gaussian systematic errors in quaternion STs are often quite large and have an unpredictable time-varying nature, particularly when used in non-inertial pointing applications. Two filtering methods are proposed to reduce the attitude estimation error resulting from ST systematic errors: 1) an extended Kalman filter (EKF) augmented with Markov states, and 2) an unscented Kalman filter (UKF) with a periodic measurement model. Realistic assessments of the attitude estimation performance gains are demonstrated with both simulation and flight telemetry data from the Lunar Reconnaissance Orbiter.
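
    Method 1, augmenting the filter state so a Markov process absorbs the slowly varying star-tracker systematic error, can be illustrated with a toy linear filter. All dynamics and noise values below are illustrative assumptions, not mission parameters.

    ```python
    # Hedged sketch: a scalar Kalman filter augmented with a first-order
    # Gauss-Markov state that soaks up a slowly varying ST systematic error.
    # Toy numbers throughout; not flight software.
    import numpy as np

    dt, tau = 1.0, 200.0                       # step [s], Markov time constant [s]
    phi = np.array([[1.0, 0.0],                # attitude angle (gyro-propagated)
                    [0.0, np.exp(-dt / tau)]]) # decaying ST systematic error
    Q = np.diag([1e-8, 1e-6])                  # gyro noise + Markov driving noise
    H = np.array([[1.0, 1.0]])                 # ST sees angle + systematic error
    R = np.array([[1e-6]])                     # ST white measurement noise

    def kf_step(x, P, z, u):
        x = phi @ x + np.array([u, 0.0])       # propagate with gyro increment u
        P = phi @ P @ phi.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (z - H @ x)                # ST measurement update
        P = (np.eye(2) - K @ H) @ P
        return x, P

    x, P = np.zeros(2), np.eye(2) * 1e-4
    x, P = kf_step(x, P, z=1.2e-3, u=0.0)      # one ST update
    ```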

  4. Deep Drawing of High-Strength Tailored Blanks by Using Tailored Tools

    Directory of Open Access Journals (Sweden)

    Thomas Mennecart

    2016-01-01

    Full Text Available In most forming processes based on tailored blanks, the tool material remains the same as that of sheet metal blanks without tailored properties. A novel concept of lightweight construction for deep drawing tools is presented in this work to improve the forming behavior of tailored blanks. The investigations presented here deal with the forming of tailored blanks of dissimilar strengths using tailored dies made of two different materials. In the area of the steel blank with higher strength, typical tool steel is used. In the area of the low-strength steel, a hybrid tool made out of a polymer and a fiber-reinforced surface replaces the steel half. Cylindrical cups of DP600/HX300LAD are formed and analyzed regarding their formability. The use of two different halves of tool materials shows improved blank thickness distribution, weld-line movement and pressure distribution compared to the use of two steel halves. An improvement in strain distribution is also observed by the inclusion of springs in the polymer side of tools, which is implemented to control the material flow in the die. Furthermore, a reduction in tool weight of approximately 75% can be achieved by using this technique. An accurate finite element modeling strategy is developed to analyze the problem numerically and is verified experimentally for the cylindrical cup. This strategy is then applied to investigate the thickness distribution and weld-line movement for a complex geometry, and its transferability is validated. The inclusion of springs in the hybrid tool leads to better material flow, which results in reduction of weld-line movement by around 60%, leading to more uniform thickness distribution.

  5. Construction of estimated flow- and load-duration curves for Kentucky using the Water Availability Tool for Environmental Resources (WATER)

    Science.gov (United States)

    Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.

    2012-01-01

    Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that calibration criteria, established by previous WATER application reports, were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow × Water-quality criterion) at each flow interval.
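
    Both constructions reduce to a few lines: sort the modeled daily flows, assign each an exceedance probability, and multiply flows by the water-quality criterion for the load curve. The sketch below is a generic illustration with the standard cfs·mg/L-to-lb/day factor, not code from the WATER application.

    ```python
    # Hedged sketch: flow- and load-duration curves from modeled daily flows.
    # Generic illustration; not the WATER/TOPMODEL implementation.
    import numpy as np

    def flow_duration_curve(daily_flows_cfs):
        q = np.sort(np.asarray(daily_flows_cfs))[::-1]       # high to low
        n = q.size
        exceed_pct = 100.0 * np.arange(1, n + 1) / (n + 1)   # Weibull position
        return exceed_pct, q                                 # 0% ~ highest flow

    def load_duration_curve(sorted_flows_cfs, criterion_mg_per_L):
        # 1 cfs at 1 mg/L is about 5.39 lb/day (standard conversion factor)
        return sorted_flows_cfs * criterion_mg_per_L * 5.39  # allowable load
    ```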

  6. A software tool for ecosystem services assessments

    Science.gov (United States)

    Riegels, Niels; Klinting, Anders; Butts, Michael; Middelboe, Anne Lise; Mark, Ole

    2017-04-01

    proposed project can be estimated to determine whether the project affects drivers, pressures, states or a combination of these. • In part III, information about impacts on drivers, pressures, and states is used to identify ESS impacted by a proposed project. Potential beneficiaries of impacted ESS are also identified. • In part IV, changes in ESS are estimated. These estimates include changes in the provision of ESS, the use of ESS, and the value of ESS. • A sustainability assessment in Part V estimates the broader impact of a proposed project according to social, environmental, governance and other criteria. The ESS evaluation software tool is designed to assist an evaluation or study leader carrying out an ESS assessment. The tool helps users move through the logic of the ESS evaluation and make sense of relationships between elements of the DPSIR framework, the CICES classification scheme, and the FEGS approach. The tool also provides links to useful indicators and assessment methods in order to help users quantify changes in ESS and ESS values. The software tool is developed in collaboration with the DESSIN user group, who will use the software to estimate changes in ESS resulting from the implementation of green technologies addressing water quality and water scarcity issues. Although the software is targeted to this user group, it will be made available for free to the public after the conclusion of the project.

  7. An Economic Evaluation of Food Safety Education Interventions: Estimates and Critical Data Gaps.

    Science.gov (United States)

    Zan, Hua; Lambea, Maria; McDowell, Joyce; Scharff, Robert L

    2017-08-01

    The economic evaluation of food safety interventions is an important tool that practitioners and policy makers use to assess the efficacy of their efforts. These evaluations are built on models that are dependent on accurate estimation of numerous input variables. In many cases, however, there is no data available to determine input values and expert opinion is used to generate estimates. This study uses a benefit-cost analysis of the food safety component of the adult Expanded Food and Nutrition Education Program (EFNEP) in Ohio as a vehicle for demonstrating how results based on variable values that are not objectively determined may be sensitive to alternative assumptions. In particular, the focus here is on how reported behavioral change is translated into economic benefits. Current gaps in the literature make it impossible to know with certainty how many people are protected by the education (what are the spillover effects?), the length of time education remains effective, and the level of risk reduction from change in behavior. Based on EFNEP survey data, food safety education led 37.4% of participants to improve their food safety behaviors. Under reasonable default assumptions, benefits from this improvement significantly outweigh costs, yielding a benefit-cost ratio of between 6.2 and 10.0. Incorporation of a sensitivity analysis using alternative estimates yields a greater range of estimates (0.2 to 56.3), which highlights the importance of future research aimed at filling these research gaps. Nevertheless, most reasonable assumptions lead to estimates of benefits that justify their costs.
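
    The sensitivity analysis described, recomputing the benefit-cost ratio under alternative assumptions for spillover, persistence, and risk reduction, can be sketched as a grid evaluation. Apart from the 37.4% behavior-change rate quoted above, every number below is an illustrative placeholder, not an EFNEP figure.

    ```python
    # Hedged sketch: benefit-cost ratio over a grid of uncertain inputs.
    # All values are placeholders except the 37.4% improvement rate.
    import itertools

    program_cost = 1_000_000.0              # hypothetical annual program cost
    improved = 0.374 * 10_000               # improvers among 10,000 participants

    illness_cost = [1500.0, 3000.0]         # cost per illness averted (assumed)
    risk_reduction = [0.01, 0.05]           # annual risk drop per improver
    spillover = [1.0, 2.5]                  # household members also protected
    years_effective = [1, 5]                # persistence of the education

    for c, r, s, y in itertools.product(illness_cost, risk_reduction,
                                        spillover, years_effective):
        ratio = improved * s * r * c * y / program_cost
        print(f"cost={c} risk={r} spillover={s} years={y} B/C={ratio:.1f}")
    ```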

  8. COST OF SELECTIVE CATALYTIC REDUCTION (SCR) APPLICATION FOR NOX CONTROL ON COAL-FIRED BOILERS

    Science.gov (United States)

    The report provides a methodology for estimating budgetary costs associated with retrofit applications of selective catalytic reduction (SCR) technology on coal-fired boilers. SCR is a postcombustion nitrogen oxides (NOx) control technology capable of providing NOx reductions >90...

  9. Implementation of a Clinical Decision Support Tool for Stool Cultures and Parasitological Studies in Hospitalized Patients.

    Science.gov (United States)

    Nikolic, D; Richter, S S; Asamoto, K; Wyllie, R; Tuttle, R; Procop, G W

    2017-12-01

    There is substantial evidence that stool culture and parasitological examinations are of minimal to no value after 3 days of hospitalization. We implemented and studied the impact of a clinical decision support tool (CDST) to decrease the number of unnecessary stool cultures (STCUL), ova/parasite (O&P) examinations, and Giardia/Cryptosporidium enzyme immunoassay screens (GC-EIA) performed for patients hospitalized >3 days. We studied the frequency of stool studies ordered before or on day 3 and after day 3 of hospitalization (i.e., categorical orders/total number of orders) before and after this intervention and denoted the numbers and types of microorganisms detected within those time frames. This intervention, which corresponded to a custom-programmed hard-stop alert tool in the Epic hospital information system, allowed providers to override the intervention by calling the laboratory if testing was deemed medically necessary. Comparative statistics were employed to determine significance, and cost savings were estimated based on our internal costs. Before the intervention, 129/670 (19.25%) O&P examinations, 47/204 (23.04%) GC-EIA, and 249/1,229 (20.26%) STCUL were ordered after 3 days of hospitalization. After the intervention, 46/521 (8.83%) O&P examinations, 27/157 (17.20%) GC-EIA, and 106/1,028 (10.31%) STCUL were ordered after 3 days of hospitalization. The proportional reductions in the number of tests performed after 3 days, and the associated P values, were 54.1% for O&P examinations (P < 0.0001), 22.58% for GC-EIA (P = 0.2807), and 49.1% for STCUL (P < 0.0001). This was estimated to have resulted in $8,108.84 of cost savings. The electronic CDST resulted in a substantial reduction in the number of evaluations of stool cultures and the number of parasitological examinations for patients hospitalized for more than 3 days and in a cost savings while retaining the ability of the clinician to obtain these tests if clinically indicated. Copyright © 2017
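
    The before/after comparison reported above is, in essence, a two-proportion contingency test. A minimal sketch using the O&P counts quoted in the abstract (a chi-squared test is assumed here; the paper's exact statistical procedure is not stated):

    ```python
    # Hedged sketch: chi-squared comparison of late (>3 day) order proportions
    # before and after the CDST, using the O&P counts from the abstract.
    from scipy.stats import chi2_contingency

    late_pre, total_pre = 129, 670          # O&P exams after day 3, pre-CDST
    late_post, total_post = 46, 521         # post-CDST

    table = [[late_pre, total_pre - late_pre],
             [late_post, total_post - late_post]]
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"pre={late_pre / total_pre:.1%} post={late_post / total_post:.1%} p={p:.2g}")
    ```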

  10. Spot Urine-guided Salt Reduction in Chronic Kidney Disease Patients.

    Science.gov (United States)

    Uchiyama, Kiyotaka; Yanai, Akane; Ishibashi, Yoshitaka

    2017-09-01

    Dietary salt restriction is important in patients with chronic kidney disease (CKD) to reduce hypertension, cardiovascular events, progression of CKD, and mortality. However, recommending salt reduction for patients is difficult without knowing their actual sodium intake. This study evaluated the effectiveness of spot urine-guided salt reduction in CKD outpatients. A prospective cohort design was used. This study included a total of 127 adult outpatients (aged 60 ± 18 years, 80 males) with CKD. Their baseline estimated glomerular filtration rate was 51.4 ± 25.1 mL/minute/1.73 m², and 64 (50%) of them had CKD stage 3a or 3b (both 32 [25%]). We informed the patients of their individual spot urine-estimated salt intake every time they visited the outpatient clinic. Based on the data, the nephrologist encouraged the patients to achieve their salt restriction goal. The primary outcome was the estimated salt excretion, and the secondary outcome was the urinary protein-to-Cr ratio (UPCR). Multiple regression analyses were performed to clarify the contributing factors of changes in both outcomes. Over a follow-up of 12 months, the median number of patients' visits was 7 (5-8). The estimated salt intake was significantly reduced from 7.98 ± 2.49 g/day to 6.77 ± 1.77 g/day (P < .05), and the reduction in UPCR was associated with the reduced salt intake, with borderline significance (P = .08). Providing spot urine-estimated salt intake feedback effectively motivated CKD patients to reduce their salt intake. Spot urine-guided salt reduction may slow CKD progression through decreased urinary protein excretion. Copyright © 2017 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  11. 324 Building life cycle dose estimates for planned work

    Energy Technology Data Exchange (ETDEWEB)

    Landsman, S.D.; Peterson, C.A.; Thornhill, R.E.

    1995-09-01

    This report describes a tool for use by organizational management teams to plan, manage, and oversee personnel exposures within their organizations. The report encompasses personnel radiation exposures received from activities associated with the B-Cell Cleanout Project, Surveillance and Maintenance Project, the Mk-42 Project, and other minor activities. It is designed to provide verifiable Radiological Performance Reports. The primary area where workers receive radiation exposure is the Radiochemical Engineering Complex airlock. Entry to the airlock is necessary for maintenance of cranes and other equipment, and to set up the rail system used to move large pieces of equipment and shipping casks into and out of the airlock. Transfers of equipment and materials from the hot cells in the complex to the airlock are required to allow dose profiling of waste containers, shuffling of waste containers so that grouting activities can go on, and maintenance of in-cell cranes. Both DOE and the Pacific Northwest Laboratory (PNL) are currently investing in state-of-the-art decontamination equipment. Challenging goals for exposure reduction were established for several broad areas of activity. Exposure estimates and goals developed from these scheduled activities will be compared against actual exposures for scheduled and unscheduled activities that contributed to exposures received by personnel throughout the year. Included in this report are life cycle exposure estimates by calendar year for the B-Cell Cleanout project, a three-year estimate of exposures associated with Surveillance and Maintenance, and known activities for Calendar Year (CY) 1995 associated with several smaller projects. These reports are intended to provide a foundation for future dose estimates, by year, requiring updating as exposure conditions change or new avenues of approach to performing work are developed.

  12. 324 Building life cycle dose estimates for planned work

    International Nuclear Information System (INIS)

    Landsman, S.D.; Peterson, C.A.; Thornhill, R.E.

    1995-09-01

    This report describes a tool for use by organizational management teams to plan, manage, and oversee personnel exposures within their organizations. The report encompasses personnel radiation exposures received from activities associated with the B-Cell Cleanout Project, Surveillance and Maintenance Project, the Mk-42 Project, and other minor activities. It is designed to provide verifiable Radiological Performance Reports. The primary area where workers receive radiation exposure is the Radiochemical Engineering Complex airlock. Entry to the airlock is necessary for maintenance of cranes and other equipment, and to set up the rail system used to move large pieces of equipment and shipping casks into and out of the airlock. Transfers of equipment and materials from the hot cells in the complex to the airlock are required to allow dose profiling of waste containers, shuffling of waste containers so that grouting activities can go on, and maintenance of in-cell cranes. Both DOE and the Pacific Northwest Laboratory (PNL) are currently investing in state-of-the-art decontamination equipment. Challenging goals for exposure reduction were established for several broad areas of activity. Exposure estimates and goals developed from these scheduled activities will be compared against actual exposures for scheduled and unscheduled activities that contributed to exposures received by personnel throughout the year. Included in this report are life cycle exposure estimates by calendar year for the B-Cell Cleanout project, a three-year estimate of exposures associated with Surveillance and Maintenance, and known activities for Calendar Year (CY) 1995 associated with several smaller projects. These reports are intended to provide a foundation for future dose estimates, by year, requiring updating as exposure conditions change or new avenues of approach to performing work are developed

  13. Estimation and reduction of harmonic currents from power converters

    DEFF Research Database (Denmark)

    Asiminoaei, Lucian

    -based method depends very much on the amount and accuracy of collected data in the development stage. The outcome of this investigation is a Harmonic Calculation Software compiled into a Graphical User Interface PC-software application, which can be applied for fast estimations of the harmonic currents... control of the proposed topologies are given together with laboratory tests. One harmonic current mitigation solution found is to connect (two) smaller-power APFs in parallel, sharing the same ac- and dc-bus. It is proven that parallel APFs may have lower passive components, although other issues arise, like circulation currents, which are removed here by common mode coils. Another harmonic solution is the cascade connection of (two) independent APFs that cooperatively share the task of harmonic mitigation. Two cooperative control methods are proposed, called load-sharing and harmonic-sharing
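
    Fast estimation of harmonic currents of the kind such a calculation tool performs ultimately rests on extracting harmonic amplitudes from a sampled current waveform. The sketch below is a generic FFT-based illustration on a synthetic 50 Hz load current; it is not the software developed in the thesis.

    ```python
    # Hedged sketch: FFT-based harmonic current estimation on a synthetic
    # 50 Hz waveform with injected 5th and 7th harmonics.
    import numpy as np

    fs, f0 = 10_000.0, 50.0                        # sampling rate, fundamental [Hz]
    t = np.arange(0, 0.2, 1 / fs)                  # ten fundamental cycles
    i_load = (10 * np.sin(2 * np.pi * f0 * t)
              + 2 * np.sin(2 * np.pi * 5 * f0 * t)
              + 1 * np.sin(2 * np.pi * 7 * f0 * t))

    spectrum = np.abs(np.fft.rfft(i_load)) * 2 / t.size   # peak amplitudes
    for h in (1, 5, 7, 11):
        k = int(round(h * f0 * t.size / fs))       # FFT bin of the h-th harmonic
        print(f"h={h}: {spectrum[k]:.2f} A")

    thd = np.hypot(spectrum[50], spectrum[70]) / spectrum[10]
    print(f"THD = {thd:.1%}")                      # ~22% for this signal
    ```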

  14. Designing decision support tools for targeted N-regulation

    DEFF Research Database (Denmark)

    Christensen, Andreas Aagaard; Piil, Kristoffer; Andersen, Peter Stubkjær

    2017-01-01

    ... in Denmark to develop and improve a functioning decision support tool for landscape scale N-management. The aim of the study is to evaluate how a decision support tool can best be designed in order to enable landscape scale strategic N-management practices. Methods: A prototype GIS-tool for capturing, storing, editing, displaying and modelling landscape scale farming practices and associated emission consequences was developed. The tool was designed to integrate locally held knowledge with national scale datasets in live scenario situations through the implementation of a flexible, uniform and editable data model for land use data – the dNmark landscape model. Based on input data which is corrected and edited by workshop participants, the tool estimates the effect of potential land use scenarios on nutrient emissions. The tool was tested in 5 scenario workshops in case areas in Denmark in 2016...

  15. Estimation of the Tool Condition by Applying the Wavelet Transform to Acoustic Emission Signals

    International Nuclear Information System (INIS)

    Gomez, M. P.; Piotrkowski, R.; Ruzzante, J. E.; D'Attellis, C. E.

    2007-01-01

    This work continues the search for parameters to evaluate the tool condition in machining processes. The selected sensing technique is acoustic emission, and it is applied to a turning process on steel samples. The obtained signals are studied using the wavelet transform. The tool wear level is quantified as a percentage of the final wear specified by the Standard ISO 3685. The amplitude and the relevant scale obtained from the acoustic emission signals could be related to the wear level
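
    The processing chain described, a wavelet decomposition of the acoustic emission signal followed by amplitude/scale features, can be sketched with the PyWavelets package. The wavelet family, level count, sampling rate and synthetic burst below are assumptions, not values from the study.

    ```python
    # Hedged sketch: multi-level wavelet decomposition of a synthetic AE
    # burst and a per-scale energy feature that could be tracked against
    # tool wear. Parameters are assumed, not from the paper.
    import numpy as np
    import pywt

    fs = 1_000_000                                   # assumed 1 MHz AE sampling
    t = np.arange(0, 0.001, 1 / fs)
    ae = np.exp(-8000 * t) * np.sin(2 * np.pi * 150e3 * t)   # synthetic burst
    ae += 0.05 * np.random.default_rng(0).standard_normal(t.size)

    coeffs = pywt.wavedec(ae, "db4", level=5)        # [a5, d5, d4, d3, d2, d1]
    energies = [float(np.sum(c ** 2)) for c in coeffs]
    for name, e in zip(["a5", "d5", "d4", "d3", "d2", "d1"], energies):
        print(f"{name}: {100 * e / sum(energies):.1f}% of signal energy")
    ```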

  16. Low-cost risk reduction strategy for small workplaces: how can we spread good practices?

    Science.gov (United States)

    Kogi, K

    2006-01-01

    Recent advances in health risk reduction approaches are examined based on inter-country networking experiences. A noteworthy progress is the wider application of low-cost improvements to risk reduction particularly in small enterprises and agriculture in both industrially developing and developed countries. This is helped by the readiness of managers and workers to implement these improvements despite many constraints. Typical improvements include mobile racks, simple workstation changes, screening hazards, better welfare facilities and teamwork arrangements. In view of the complex circumstances of work-related health risks, it is important to know whether a low-cost strategy can advance risk reduction practices effectively and what support measures are necessary. It is confirmed that the strategy can overcome related constraints through its advantages. Main advantages lie in (a) the facilitation of improved practices in multiple technical areas, (b) the strengthening of realistic stepwise risk reduction, and (c) the enhanced multiplier effects through training of local trainers. Action-oriented risk assessment tools, such as action checklists and low-cost improvement guides, can encourage risk-reducing measures adjusted to each local situation. It is suggested to spread the low-cost risk reduction strategy for improving small workplaces in diversified settings with the support of these locally tailored tools.

  17. CMS Analysis and Data Reduction with Apache Spark

    Energy Technology Data Exchange (ETDEWEB)

    Gutsche, Oliver [Fermilab; Canali, Luca [CERN; Cremer, Illia [Magnetic Corp., Waltham; Cremonesi, Matteo [Fermilab; Elmer, Peter [Princeton U.; Fisk, Ian [Flatiron Inst., New York; Girone, Maria [CERN; Jayatilaka, Bo [Fermilab; Kowalkowski, Jim [Fermilab; Khristenko, Viktor [CERN; Motesnitsalis, Evangelos [CERN; Pivarski, Jim [Princeton U.; Sehrish, Saba [Fermilab; Surdy, Kacper [CERN; Svyatkovskiy, Alexey [Princeton U.

    2017-10-31

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies, have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover, these new tools are typically actively developed by large communities, often profiting from industry resources, and under open source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping to reduce the cost of ownership for the end-users. In this talk, we are presenting studies of using Apache Spark for end user data analysis. We are studying the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets and the end-analysis up to the publication plot. Studying the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility. The goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We are presenting the progress of this 2-year project with first results of scaling up Spark-based HEP analysis. Studying the second thrust, we are presenting studies on using Apache Spark for a CMS Dark Matter physics search, comparing Spark's feasibility, usability and performance to those of the ROOT-based analysis.
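
    A reduction step of the kind described, filtering centrally produced data down to a compact analysis ntuple, looks roughly as follows in Spark's DataFrame API. The paths, column names and selection cuts are hypothetical, not the actual CMS schema.

    ```python
    # Hedged sketch: Spark-based reduction of a large columnar dataset to an
    # analysis ntuple. Paths, columns and cuts are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("hep-reduction-sketch").getOrCreate()

    events = spark.read.parquet("hdfs:///data/events.parquet")  # hypothetical
    reduced = (events
               .filter(F.col("met_pt") > 200)        # event selection cuts
               .filter(F.col("n_jets") >= 2)
               .select("run", "lumi", "event",       # keep only ntuple columns
                       "met_pt", "jet_pt", "jet_eta"))

    reduced.write.mode("overwrite").parquet("hdfs:///user/me/ntuple.parquet")
    ```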

  18. For Third Enrollment Period, Marketplaces Expand Decision Support Tools To Assist Consumers.

    Science.gov (United States)

    Wong, Charlene A; Polsky, Daniel E; Jones, Arthur T; Weiner, Janet; Town, Robert J; Baker, Tom

    2016-04-01

    The design of the Affordable Care Act's online health insurance Marketplaces can improve how consumers make complex health plan choices. We examined the choice environment on the state-based Marketplaces and HealthCare.gov in the third open enrollment period. Compared to previous enrollment periods, we found greater adoption of some decision support tools, such as total cost estimators and integrated provider lookups. Total cost estimators differed in how they generated estimates: In some Marketplaces, consumers categorized their own utilization, while in others, consumers answered detailed questions and were assigned a utilization profile. The tools available before creating an account (in the window-shopping period) and afterward (in the real-shopping period) differed in several Marketplaces. For example, five Marketplaces provided total cost estimators to window shoppers, but only two provided them to real shoppers. Further research is needed on the impact of different choice environments and on which tools are most effective in helping consumers pick optimal plans. Project HOPE—The People-to-People Health Foundation, Inc.

  19. “DRYPACK” - a calculation and analysis tool

    DEFF Research Database (Denmark)

    Andreasen, M.B.; Toftegaard, R.; Schneider, P.

    2013-01-01

    “DryPack” is a calculation tool that visualises the energy consumption of air-based and superheated steam drying processes. With “DryPack”, it is possible to add different components to a simple drying process, and thereby increase the flexibility, which makes it possible to analyse the most common drying processes. Moreover, it is possible to change the configuration of the dryer by including/changing energy saving components to illustrate the potential of the new configuration. The calculation tool is demonstrated in four different case studies, where the actual energy consumption and possible energy consumption reductions by using “DryPack” are calculated. With the “DryPack” calculation tool, it is possible to calculate four different unit operations with moist air (dehumidification of air, humidification of air, mixing of two air streams, and heating of air). In addition, a Mollier diagram...
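
    One of the four unit operations listed, mixing of two moist air streams, reduces to a mass balance on water and an energy balance on the dry-air flows. The sketch below uses simplified constant-specific-heat psychrometrics; it illustrates the principle, not the DryPack implementation.

    ```python
    # Hedged sketch: adiabatic mixing of two moist air streams by mass and
    # energy balance, with simplified psychrometric property relations.
    def mix_moist_air(m1, t1, w1, m2, t2, w2):
        """m: dry-air flow [kg/s], t: temperature [C], w: humidity ratio [kg/kg]."""
        def enthalpy(t, w):                   # kJ per kg dry air (approximate)
            return 1.006 * t + w * (2501.0 + 1.86 * t)
        m = m1 + m2
        w = (m1 * w1 + m2 * w2) / m                              # water balance
        h = (m1 * enthalpy(t1, w1) + m2 * enthalpy(t2, w2)) / m  # energy balance
        t = (h - 2501.0 * w) / (1.006 + 1.86 * w)                # invert h(t, w)
        return t, w

    print(mix_moist_air(1.0, 80.0, 0.050, 0.5, 20.0, 0.007))     # (t_mix, w_mix)
    ```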

  20. Engineering tools for complex task of reducing energy consumption

    NARCIS (Netherlands)

    Hensen, J.L.M.

    1994-01-01

    Reduction of energy consumption in buildings while ensuring a good indoor environment is a very challenging and difficult engineering task. For this we need tools based on an integrated approach to the building, control systems, occupants and outdoor environment. A building energy simulation

  1. FY 2000 report on the results of the technology development of energy use reduction of machine tools. Development of dry cutting use abrasion resistant/lubricous coated tools; 2000 nendo energy shiyo gorika kosaku kikai nado gijutsu kaihatsu seika hokokusho. Dry sessakuyo taimamo junkatsusei hifuku kogu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    For the purpose of energy conservation and reduction of the environmental loads of machine tools, a study was conducted on dry cutting, which is cutting without the use of cutting oil, and the FY 2000 results were summed up. The study addressed abrasion-resistant/lubricous coated tools for dry cutting, coated with a composite film whose cutting life is only slightly lower than that of existing tools using coolant. In the survey of abrasion-resistant/lubricous films, it was found that, in adhesion to ultra-hard substrates, the DLC single-layer film consisting only of carbon showed the same excellent adhesion as inserts with an intermediate layer. As to the synthesis of abrasion-resistant/lubricous films, a composite film (WC/C film) consisting of tungsten carbide (WC) and carbon (C) was synthesized using an arc ion plating device. The WC/C film is composed of W and C and has a structure in which layers richer in W and layers poorer in W are alternately stacked at the nm level. Study was also made of the equipment necessary for developing abrasion-resistant/lubricous films and of film formation for drills. (NEDO)

  2. Local polynomial Whittle estimation covering non-stationary fractional processes

    DEFF Research Database (Denmark)

    Nielsen, Frank

    to the non-stationary region. By approximating the short-run component of the spectrum by a polynomial, instead of a constant, in a shrinking neighborhood of zero we alleviate some of the bias that the classical local Whittle estimator is prone to. This bias reduction comes at a cost as the variance is in... study illustrates the performance of the proposed estimator compared to the classical local Whittle estimator and the local polynomial Whittle estimator. The empirical justification of the proposed estimator is shown through an analysis of credit spreads.
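
    For orientation, the classical local Whittle estimator that the local polynomial version refines minimizes a concentrated objective over the first m Fourier frequencies. A minimal sketch of that baseline follows (the bandwidth m and the test signal are assumptions):

    ```python
    # Hedged sketch: classical local Whittle estimation of the memory
    # parameter d, the baseline refined by the local polynomial estimator.
    import numpy as np
    from scipy.optimize import minimize_scalar

    def local_whittle_d(x, m):
        n = len(x)
        I = np.abs(np.fft.fft(x - np.mean(x))) ** 2 / (2 * np.pi * n)
        lam = 2 * np.pi * np.arange(1, m + 1) / n    # first m frequencies
        I_m = I[1:m + 1]
        def objective(d):                            # concentrated likelihood
            return (np.log(np.mean(lam ** (2 * d) * I_m))
                    - 2 * d * np.mean(np.log(lam)))
        return minimize_scalar(objective, bounds=(-0.49, 0.99),
                               method="bounded").x

    rng = np.random.default_rng(1)
    print(local_whittle_d(rng.standard_normal(2048), m=128))  # ~0 (white noise)
    ```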

  3. Performance Analysis of a Fluidic Axial Oscillation Tool for Friction Reduction with the Absence of a Throttling Plate

    Directory of Open Access Journals (Sweden)

    Xinxin Zhang

    2017-04-01

    Full Text Available An axial oscillation tool is proved to be effective in solving problems associated with high friction and torque in the sliding drilling of a complex well. The fluidic axial oscillation tool, based on an output-fed bistable fluidic oscillator, is a type of axial oscillation tool which has become increasingly popular in recent years. The aim of this paper is to analyze the dynamic flow behavior of a fluidic axial oscillation tool with the absence of a throttling plate in order to evaluate its overall performance. In particular, the differences between the original design with a throttling plate and the current default design are profoundly analyzed, and an improvement is expected to be recorded for the latter. A commercial computational fluid dynamics code, Fluent, was used to predict the pressure drop and oscillation frequency of a fluidic axial oscillation tool. The results of the numerical simulations agree well with corresponding experimental results. A sufficient pressure pulse amplitude with a low pressure drop is desired in this study. Therefore, a relative pulse amplitude of pressure drop and displacement are introduced in our study. A comparison analysis between the two designs with and without a throttling plate indicates that when the supply flow rate is relatively low or higher than a certain value, the fluidic axial oscillation tool with a throttling plate exhibits a better performance; otherwise, the fluidic axial oscillation tool without a throttling plate seems to be a preferred alternative. In most of the operating circumstances in terms of the supply flow rate and pressure drop, the fluidic axial oscillation tool performs better than the original design.

  4. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus of the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). Content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools helps fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allow the analysis of different corpora.

  5. Development and evaluation of thermal model reduction algorithms for spacecraft

    Science.gov (United States)

    Deiml, Michael; Suderland, Martin; Reiss, Philipp; Czupalla, Markus

    2015-05-01

    This paper is concerned with the reduction of thermal models of spacecraft. The work presented here has been conducted in cooperation with the company OHB AG, formerly Kayser-Threde GmbH, and the Institute of Astronautics at Technische Universität München, with the goal of shortening and automating the time-consuming and manual process of thermal model reduction. The reduction of thermal models can be divided into the simplification of the geometry model, for calculation of external heat flows and radiative couplings, and the reduction of the underlying mathematical model. For simplification, a method has been developed which approximates the reduced geometry model with the help of an optimization algorithm. Different linear and nonlinear model reduction techniques have been evaluated for their applicability to the reduction of the mathematical model. Compatibility with the thermal analysis tool ESATAN-TMS is a major concern here, and it restricts the useful application of these methods. Additional model reduction methods have been developed which take these constraints into account. The Matrix Reduction method allows the differential equation to be approximated to reference values exactly, except for numerical errors. The summation method enables a useful, applicable reduction of thermal models that can be used in industry. In this work a framework for model reduction of thermal models has been created, which can be used together with a newly developed graphical user interface for the reduction of thermal models in industry.
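
    A simple, widely used instance of mathematical model reduction in this setting is static (Guyan-type) condensation of a linear thermal conductance network, where internal nodes are eliminated through a Schur complement. The sketch below illustrates that general idea; it is not one of the specific methods developed in the paper.

    ```python
    # Hedged sketch: static condensation of a linear thermal network.
    # Internal nodes are eliminated; the reduced matrix is exact at steady
    # state for the retained boundary nodes.
    import numpy as np

    def condense(K, boundary, internal):
        Kbb = K[np.ix_(boundary, boundary)]
        Kbi = K[np.ix_(boundary, internal)]
        Kib = K[np.ix_(internal, boundary)]
        Kii = K[np.ix_(internal, internal)]
        return Kbb - Kbi @ np.linalg.solve(Kii, Kib)   # Schur complement

    # 3-node conduction chain; node 1 is internal, nodes 0 and 2 retained
    K = np.array([[ 2.0, -2.0,  0.0],
                  [-2.0,  5.0, -3.0],
                  [ 0.0, -3.0,  3.0]])
    print(condense(K, [0, 2], [1]))   # 0-2 coupling = series conductance 1.2
    ```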

  6. Costing Tool for International Cancer Registries

    Centers for Disease Control (CDC) Podcasts

    2016-11-21

    A health economist at CDC talks about a new tool for estimating how much it costs to run cancer registries in developing countries.  Created: 11/21/2016 by National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP).   Date Released: 11/21/2016.

  7. Counting Parasites: Using Shrimp to Teach Students about Estimation

    Science.gov (United States)

    Gunzburger, Lindsay; Curran, Mary Carla

    2013-01-01

    Estimation is an important skill that we rely on every day for simple tasks, such as providing food for a dinner party or arriving at an appointment on time. Despite its importance, most people have never been formally taught how to estimate. Estimation can also be a vital tool for scientific inquiry. We have created an activity designed to teach…

  8. Assessment of food safety practices of food service food handlers (risk assessment data): testing a communication intervention (evaluation of tools).

    Science.gov (United States)

    Chapman, Benjamin; Eversley, Tiffany; Fillion, Katie; Maclaurin, Tanya; Powell, Douglas

    2010-06-01

    Globally, foodborne illness affects an estimated 30% of individuals annually. Meals prepared outside of the home are a risk factor for acquiring foodborne illness and have been implicated in up to 70% of traced outbreaks. The Centers for Disease Control and Prevention has called on food safety communicators to design new methods and messages aimed at increasing food safety risk-reduction practices from farm to fork. Food safety infosheets, a novel communication tool designed to appeal to food handlers and compel behavior change, were evaluated. Food safety infosheets were provided weekly to food handlers in working food service operations for 7 weeks. It was hypothesized that through the posting of food safety infosheets in highly visible locations, such as kitchen work areas and hand washing stations, that safe food handling behaviors of food service staff could be positively influenced. Using video observation, food handlers (n = 47) in eight food service operations were observed for a total of 348 h (pre- and postintervention combined). After the food safety infosheets were introduced, food handlers demonstrated a significant increase (6.7%, P < 0.05, 95% confidence interval) in mean hand washing attempts, and a significant reduction in indirect cross-contamination events (19.6%, P < 0.05, 95% confidence interval). Results of the research demonstrate that posting food safety infosheets is an effective intervention tool that positively influences the food safety behaviors of food handlers.

  9. Assessment of damage from reduction of expected lifespan due to cancer

    Directory of Open Access Journals (Sweden)

    Boris Alengordovich Korobitsyn

    2013-09-01

    Full Text Available This paper presents theoretical and methodological approaches to the assessment of damage from premature mortality and reduction of life expectancy due to various causes. Concepts for measuring the price of a human life are analyzed: evaluation from the standpoint of the theory of human capital; indirect estimation taking into account non-monetary social costs; evaluation of individuals’ willingness to pay for the elimination of the risk of death; estimation based on the determination of insurance premiums and compensation awarded under court decisions; and evaluation of social investments aimed at reducing the risk of premature mortality of the individual. The following indexes were calculated for all subordinate entities of the Russian Federation: reduction of life expectancy, years of potential life lost at working age, and gross regional product lost due to the reduction of years of potential life in the working-age population as a result of cancer.

  10. Cement bond evaluation method in horizontal wells using segmented bond tool

    Science.gov (United States)

    Song, Ruolong; He, Li

    2018-06-01

    Most existing cement evaluation technologies suffer from tool eccentralization due to gravity in highly deviated and horizontal wells. This paper proposes a correction method to lessen the effects of tool eccentralization on cement bond evaluation results from a segmented bond tool, which has an omnidirectional sonic transmitter and eight segmented receivers evenly arranged around the tool 2 ft from the transmitter. Using a 3-D finite-difference parallel numerical simulation method, we investigate the logging responses of a centred and an eccentred segmented bond tool in a variety of bond conditions. From the numerical results, we find that the tool eccentricity and channel azimuth can be estimated from the measured sector amplitude. The average of the sector amplitudes when the tool is eccentred can be corrected to the value obtained when the tool is centred. The corrected amplitude is then used to calculate the channel size. The proposed method is applied to both synthetic and field data. For synthetic data, it turns out that this method can estimate the tool eccentricity with small error, and the bond map is improved after correction. For field data, the tool eccentricity agrees well with the measured well deviation angle. Though this method still suffers from low accuracy in calculating the channel azimuth, the credibility of the corrected bond map is improved, especially in horizontal wells. It offers a way to evaluate the bond condition in horizontal wells using an existing logging tool. The numerical results in this paper can aid the understanding of segmented-tool measurements in both vertical and horizontal wells.

  11. A human development framework for CO2 reductions.

    Directory of Open Access Journals (Sweden)

    Luís Costa

    Full Text Available Although developing countries are called to participate in CO(2) emission reduction efforts to avoid dangerous climate change, the implications of proposed reduction schemes in human development standards of developing countries remain a matter of debate. We show the existence of a positive and time-dependent correlation between the Human Development Index (HDI) and per capita CO(2) emissions from fossil fuel combustion. Employing this empirical relation, extrapolating the HDI, and using three population scenarios, the cumulative CO(2) emissions necessary for developing countries to achieve particular HDI thresholds are assessed following a Development As Usual approach (DAU). If current demographic and development trends are maintained, we estimate that by 2050 around 85% of the world's population will live in countries with high HDI (above 0.8). In particular, 300 Gt of cumulative CO(2) emissions between 2000 and 2050 are estimated to be necessary for the development of 104 developing countries in the year 2000. This value represents between 20 % to 30 % of previously calculated CO(2) budgets limiting global warming to 2 °C. These constraints and results are incorporated into a CO(2) reduction framework involving four domains of climate action for individual countries. The framework reserves a fair emission path for developing countries to proceed with their development by indexing country-dependent reduction rates proportional to the HDI in order to preserve the 2 °C target after a particular development threshold is reached. For example, in each time step of five years, countries with an HDI of 0.85 would need to reduce their per capita emissions by approx. 17% and countries with an HDI of 0.9 by 33 %. Under this approach, global cumulative emissions by 2050 are estimated to range from 850 up to 1100 Gt of CO(2). These values are within the uncertainty range of emissions to limit global temperatures to 2 °C.

  12. A human development framework for CO2 reductions.

    Science.gov (United States)

    Costa, Luís; Rybski, Diego; Kropp, Jürgen P

    2011-01-01

    Although developing countries are called to participate in CO(2) emission reduction efforts to avoid dangerous climate change, the implications of proposed reduction schemes in human development standards of developing countries remain a matter of debate. We show the existence of a positive and time-dependent correlation between the Human Development Index (HDI) and per capita CO(2) emissions from fossil fuel combustion. Employing this empirical relation, extrapolating the HDI, and using three population scenarios, the cumulative CO(2) emissions necessary for developing countries to achieve particular HDI thresholds are assessed following a Development As Usual approach (DAU). If current demographic and development trends are maintained, we estimate that by 2050 around 85% of the world's population will live in countries with high HDI (above 0.8). In particular, 300 Gt of cumulative CO(2) emissions between 2000 and 2050 are estimated to be necessary for the development of 104 developing countries in the year 2000. This value represents between 20 % to 30 % of previously calculated CO(2) budgets limiting global warming to 2 °C. These constraints and results are incorporated into a CO(2) reduction framework involving four domains of climate action for individual countries. The framework reserves a fair emission path for developing countries to proceed with their development by indexing country-dependent reduction rates proportional to the HDI in order to preserve the 2 °C target after a particular development threshold is reached. For example, in each time step of five years, countries with an HDI of 0.85 would need to reduce their per capita emissions by approx. 17% and countries with an HDI of 0.9 by 33 %. Under this approach, global cumulative emissions by 2050 are estimated to range from 850 up to 1100 Gt of CO(2). These values are within the uncertainty range of emissions to limit global temperatures to 2 °C. © 2011 Costa et al.

  13. Scale models: A proven cost-effective tool for outage planning

    Energy Technology Data Exchange (ETDEWEB)

    Lee, R. [Commonwealth Edison Co., Morris, IL (United States)]; Segroves, R. [Sargent & Lundy, Chicago, IL (United States)]

    1995-03-01

    As generation costs for operating nuclear stations have risen, more nuclear utilities have initiated efforts to improve cost effectiveness. Nuclear plant owners are also being challenged with lower radiation exposure limits and new revised radiation protection related regulations (10 CFR 20), which places further stress on their budgets. As source term reduction activities continue to lower radiation fields, reducing the amount of time spent in radiation fields becomes one of the most cost-effective ways of reducing radiation exposure. An effective approach for minimizing time spent in radiation areas is to use a physical scale model for worker orientation planning and monitoring maintenance, modifications, and outage activities. To meet the challenge of continued reduction in the annual cumulative radiation exposures, new cost-effective tools are required. One field-tested and proven tool is the physical scale model.

  14. Performance of penalized maximum likelihood in estimation of genetic covariances matrices

    Directory of Open Access Journals (Sweden)

    Meyer Karin

    2011-11-01

    Full Text Available Background: Estimation of genetic covariance matrices for multivariate problems comprising more than a few traits is inherently problematic, since sampling variation increases dramatically with the number of traits. This paper investigates the efficacy of regularized estimation of covariance components in a maximum likelihood framework, imposing a penalty on the likelihood designed to reduce sampling variation. In particular, penalties that "borrow strength" from the phenotypic covariance matrix are considered. Methods: An extensive simulation study was carried out to investigate the reduction in average 'loss', i.e. the deviation in estimated matrices from the population values, and the accompanying bias for a range of parameter values and sample sizes. A number of penalties are examined, penalizing either the canonical eigenvalues or the genetic covariance or correlation matrices. In addition, several strategies to determine the amount of penalization to be applied, i.e. to estimate the appropriate tuning factor, are explored. Results: It is shown that substantial reductions in loss for estimates of genetic covariance can be achieved for small to moderate sample sizes. While no penalty performed best overall, penalizing the variance among the estimated canonical eigenvalues on the logarithmic scale or shrinking the genetic towards the phenotypic correlation matrix appeared most advantageous. Estimating the tuning factor using cross-validation resulted in a loss reduction 10 to 15% less than that obtained if population values were known. Applying a mild penalty, chosen so that the deviation in likelihood from the maximum was non-significant, performed as well if not better than cross-validation and can be recommended as a pragmatic strategy. Conclusions: Penalized maximum likelihood estimation provides the means to 'make the most' of limited and precious data and facilitates more stable estimation for multi-dimensional analyses. It should

  15. SU-G-IeP3-05: Effects of Image Receptor Technology and Dose Reduction Software On Radiation Dose Estimates for Fluoroscopically-Guided Interventional (FGI) Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Merritt, Z; Dave, J; Eschelman, D; Gonsalves, C [Thomas Jefferson University, Philadelphia, PA (United States)

    2016-06-15

    Purpose: To investigate the effects of image receptor technology and dose reduction software on radiation dose estimates for the most frequently performed fluoroscopically-guided interventional (FGI) procedures at a tertiary health care center. Methods: IRB approval was obtained for retrospective analysis of FGI procedures performed in the interventional radiology suites between January-2011 and December-2015. This included procedures performed using image-intensifier (II) based systems which were subsequently replaced, flat-panel-detector (FPD) based systems which were later upgraded with ClarityIQ dose reduction software (Philips Healthcare), and a relatively new FPD system already equipped with ClarityIQ. Post procedure, technologists entered system-reported cumulative air kerma (CAK) and kerma-area product (KAP; only KAP for II-based systems) in RIS; these values were analyzed. Data pre-processing included correcting typographical errors and cross-verifying CAK and KAP. The most frequent high- and low-dose FGI procedures were identified and corresponding CAK and KAP values were compared. Results: Out of 27,251 procedures within this time period, the most frequent high- and low-dose procedures were chemo/immuno-embolization (n=1967) and abscess drainage (n=1821). Mean KAP for embolization and abscess drainage procedures were 260,657, 310,304 and 94,908 mGy·cm², and 14,497, 15,040 and 6307 mGy·cm², using II-, FPD- and FPD with ClarityIQ- based systems, respectively. Statistically significant differences were observed in KAP values for embolization procedures with respect to different systems, but for abscess drainage procedures significant differences were only noted between systems with FPD and FPD with ClarityIQ (p<0.05). Mean CAK reduced significantly from 823 to 308 mGy and from 43 to 21 mGy for embolization and abscess drainage procedures, respectively, in transitioning to FPD systems with ClarityIQ (p<0.05). Conclusion: While transitioning from II- to FPD- based

  16. ELER software - a new tool for urban earthquake loss assessment

    Science.gov (United States)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER-Earthquake Loss Estimation Routine, for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 Project entitled "Network of Research Infrastructures for European Seismology-NERIES". Recently, a new version (v2.0) of ELER software has been released. The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. Although primarily intended for quasi real-time estimation of earthquake shaking and losses, the routine is also equally capable of incorporating scenario-based earthquake loss assessments. This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software which makes use of the most detailed inventory databases of physical and social elements at risk in combination with the analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. Spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. Capacity Spectrum Method (ATC-40, 1996), Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), Reduction Factor Method (Fajfar, 2000) and Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from the other studies available in the literature, i.e. SELENA v4.0 (Molina et al., 2008) and

  17. Boundary methods for mode estimation

    Science.gov (United States)

    Pierson, William E., Jr.; Ulug, Batuhan; Ahalt, Stanley C.

    1999-08-01

    This paper investigates the use of Boundary Methods (BMs), a collection of tools used for distribution analysis, as a method for estimating the number of modes associated with a given data set. Model order information of this type is required by several pattern recognition applications. The BM technique provides a novel approach to this parameter estimation problem and is comparable, in terms of both accuracy and computational cost, to other popular mode estimation techniques found in the literature and in automatic target recognition applications. This paper explains the methodology used in the BM approach to mode estimation, briefly reviews other common mode estimation techniques, and describes the empirical investigation used to compare the BM technique with them. Specifically, the accuracy and computational efficiency of the BM technique are compared quantitatively to a mixture-of-Gaussians (MOG) approach and a k-means approach to model order estimation. The stopping criterion for the MOG and k-means techniques is the Akaike Information Criterion (AIC).
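
    For the MOG baseline with an AIC stopping criterion, a minimal sketch (illustrative only, not the paper's BM algorithm) of selecting model order by fitting Gaussian mixtures of increasing order and keeping the lowest-AIC fit:

        # Estimate the number of modes by minimizing AIC over mixture order.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        # Hypothetical 1-D data with three modes.
        data = np.concatenate([rng.normal(-4, 1.0, 300),
                               rng.normal(0, 0.5, 200),
                               rng.normal(5, 1.5, 250)]).reshape(-1, 1)

        aics = {k: GaussianMixture(n_components=k, random_state=0).fit(data).aic(data)
                for k in range(1, 8)}
        best_k = min(aics, key=aics.get)
        print(f"AIC-selected model order: {best_k}")  # expected: 3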

  18. Fourth report to Congress: resource recovery and waste reduction

    Energy Technology Data Exchange (ETDEWEB)

    1977-01-01

    The report covers domestic refuse generation and resource recovery estimates. It includes discussion of waste reduction at various national organizational levels, source separation, mixed refuse processing for energy production, and the environmental and economic impact of beverage container deposit laws.

  19. Effective dysphonia detection using feature dimension reduction and kernel density estimation for patients with Parkinson's disease.

    Directory of Open Access Journals (Sweden)

    Shanshan Yang

    Full Text Available Detection of dysphonia is useful for monitoring the progression of phonatory impairment in patients with Parkinson's disease (PD), and also helps assess disease severity. This paper describes statistical pattern analysis methods for studying different vocal measurements of sustained phonations. The feature dimension reduction procedure was implemented using the sequential forward selection (SFS) and kernel principal component analysis (KPCA) methods. Four selected vocal measures were projected by the KPCA onto a bivariate feature space, in which the class-conditional feature densities can be approximated with the nonparametric kernel density estimation technique. In the vocal pattern classification experiments, Fisher's linear discriminant analysis (FLDA) was applied to perform linear classification of voice records from healthy control subjects and PD patients, and the maximum a posteriori (MAP) decision rule and support vector machines (SVM) with radial basis function kernels were employed for the nonlinear classification tasks. Based on the KPCA-mapped feature densities, the MAP classifier successfully distinguished 91.8% of voice records, with a sensitivity of 0.986, a specificity of 0.708, and an area under the receiver operating characteristic (ROC) curve of 0.94. The diagnostic performance provided by the MAP classifier was superior to those of the FLDA and SVM classifiers. In addition, the classification results indicated that dysphonia detection is insensitive to gender, and that the sustained phonations of PD patients with minimal functional disability are more difficult to identify correctly.
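
    A minimal sketch of the described pipeline on synthetic data: kernel PCA to a bivariate space, class-conditional kernel density estimates, and a MAP decision rule. All data and parameters here are hypothetical:

        # KPCA projection + class-conditional KDE + MAP classification (sketch).
        import numpy as np
        from sklearn.decomposition import KernelPCA
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(2)
        X = np.vstack([rng.normal(0.0, 1.0, (100, 4)), rng.normal(1.5, 1.0, (100, 4))])
        y = np.array([0] * 100 + [1] * 100)  # 0 = healthy control, 1 = PD (hypothetical)

        # Map four vocal measures onto a bivariate feature space.
        Z = KernelPCA(n_components=2, kernel="rbf", gamma=0.5).fit_transform(X)

        # Class-conditional densities and priors for the MAP decision rule.
        kde0, kde1 = gaussian_kde(Z[y == 0].T), gaussian_kde(Z[y == 1].T)
        prior0, prior1 = 0.5, 0.5
        pred = (kde1(Z.T) * prior1 > kde0(Z.T) * prior0).astype(int)
        print(f"training accuracy of MAP classifier: {(pred == y).mean():.3f}")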

  20. Model reduction of nonlinear systems subject to input disturbances

    KAUST Repository

    Ndoye, Ibrahima

    2017-07-10

    The method of convex optimization is used as a tool for model reduction of a class of nonlinear systems in the presence of disturbances. It is shown that under some conditions the nonlinear disturbed system can be approximated by a reduced order nonlinear system with similar disturbance-output properties to the original plant. The proposed model reduction strategy preserves the nonlinearity and the input disturbance nature of the model. It guarantees a sufficiently small error between the outputs of the original and the reduced-order systems, and also maintains the properties of input-to-state stability. The matrices of the reduced order system are given in terms of a set of linear matrix inequalities (LMIs). The paper concludes with a demonstration of the proposed approach on model reduction of a nonlinear electronic circuit with additive disturbances.
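
    Since the reduced-order matrices are characterized through LMIs, a minimal sketch of solving a representative LMI feasibility problem with cvxpy (a Lyapunov stability inequality, not the paper's specific reduction conditions; the system matrix is hypothetical):

        # Solve a Lyapunov LMI: find P > 0 with A'P + PA < 0 (stability certificate).
        import numpy as np
        import cvxpy as cp

        A = np.array([[-1.0, 0.5],
                      [0.0, -2.0]])  # hypothetical stable system matrix
        n = A.shape[0]

        P = cp.Variable((n, n), symmetric=True)
        eps = 1e-6
        constraints = [P >> eps * np.eye(n),
                       A.T @ P + P @ A << -eps * np.eye(n)]
        cp.Problem(cp.Minimize(0), constraints).solve()
        print("LMI feasible, P =")
        print(P.value)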

  1. ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures

    Science.gov (United States)

    Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2008-08-01

    This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectral analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to a lines database and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows and to deploy and run them on the ESAC Grid. RISA makes full use of the inter-operability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can directly be used by general VO tools.

  2. Optimal covariance selection for estimation using graphical models

    OpenAIRE

    Vichik, Sergey; Oshman, Yaakov

    2011-01-01

    We consider a problem encountered when trying to estimate a Gaussian random field using a distributed estimation approach based on Gaussian graphical models. Because of constraints imposed by estimation tools used in Gaussian graphical models, the a priori covariance of the random field is constrained to embed conditional independence constraints among a significant number of variables. The problem is, then: given the (unconstrained) a priori covariance of the random field, and the conditiona...
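
    A related, minimal sketch (not the paper's method): graphical-lasso covariance selection, where conditional independence in a Gaussian graphical model appears as zeros in the estimated precision matrix. The covariance below is hypothetical:

        # Sparse inverse-covariance estimation for a Gaussian graphical model.
        import numpy as np
        from sklearn.covariance import GraphicalLasso

        rng = np.random.default_rng(3)
        true_cov = np.array([[1.0, 0.6, 0.0, 0.0],
                             [0.6, 1.0, 0.3, 0.0],
                             [0.0, 0.3, 1.0, 0.2],
                             [0.0, 0.0, 0.2, 1.0]])
        X = rng.multivariate_normal(np.zeros(4), true_cov, size=2000)

        model = GraphicalLasso(alpha=0.05).fit(X)
        # Near-zero precision entries encode conditional independence.
        print(np.round(model.precision_, 2))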

  3. Malware Function Estimation Using API in Initial Behavior

    OpenAIRE

    KAWAGUCHI, Naoto; OMOTE, Kazumasa

    2017-01-01

    Malware proliferation has become a serious threat to the Internet in recent years. Most current malware are subspecies of existing malware that have been automatically generated by illegal tools. To conduct efficient analysis of malware, estimating their functions in advance is effective for prioritizing which malware to analyze. However, estimating malware functions has been difficult due to the increasing sophistication of malware. Indeed, previous research does not estimate the...

  4. Applying a Consumer Behavior Lens to Salt Reduction Initiatives.

    Science.gov (United States)

    Regan, Áine; Kent, Monique Potvin; Raats, Monique M; McConnon, Áine; Wall, Patrick; Dubois, Lise

    2017-08-18

    Reformulation of food products to reduce salt content has been a central strategy for achieving population-level salt reduction. In this paper, we reflect on current reformulation strategies and consider how consumer behavior determines the ultimate success of these strategies. We consider the merits of adopting a silent, 'health by stealth' approach to reformulation compared with implementing a communications strategy that draws on labeling initiatives in tandem with reformulation efforts. We end this paper by calling for a multi-actor approach which uses co-design and participatory tools to facilitate the involvement of all stakeholders, including, and especially, consumers, in making decisions around how best to achieve population-level salt reduction.

  5. Supercompactor force effectiveness as related to dry active waste volume reduction

    International Nuclear Information System (INIS)

    Williams, P.C.; Phillips, W.S.

    1986-01-01

    The first permanently installed supercompactor in the U.S. is now in operation at the Babcock and Wilcox volume reduction center, Parks Township, Pennsylvania. Tests with various DAW (dry active waste) materials have been conducted, recording press force versus drum height as one means of estimating the volume reduction capability of this machine at various compaction forces. The results of these tests, as well as other factors, are presented herein

  6. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction in crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To account for driver hazard perception characteristics, we define a novel hazard perception measure, termed driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. We also establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to use of the CMBS. A comparison of the results with the CMBS on and off shows a 13.7% reduction in crash probability in a typical rear-end near-crash scenario with a one-second delay in the driver's braking response. These results indicate that the CMBS is effective in collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way to evaluate the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
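
    A minimal sketch of the underlying quantities, with assumed definitions and hypothetical numbers (the paper's exact operationalization of driver risk response time may differ):

        # Time-to-collision and a "risk response time" from hazard onset to braking.
        def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
            """TTC = gap / closing speed; infinite if the gap is not closing."""
            return gap_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")

        ttc = time_to_collision(gap_m=20.0, closing_speed_mps=8.0)  # 2.5 s
        ttc_threshold_s = 3.0    # assumed hazard-onset threshold
        t_hazard_onset_s = 10.0  # time when TTC first fell below the threshold
        t_brake_onset_s = 11.2   # time when the driver pressed the brake

        risk_response_time_s = t_brake_onset_s - t_hazard_onset_s
        print(f"TTC = {ttc:.1f} s, driver risk response time = {risk_response_time_s:.1f} s")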

  7. i-Tree: Tools to assess and manage structure, function, and value of community forests

    Science.gov (United States)

    Hirabayashi, S.; Nowak, D.; Endreny, T. A.; Kroll, C.; Maco, S.

    2011-12-01

    Trees in urban communities can mitigate many adverse effects associated with anthropogenic activities and climate change (e.g. urban heat island, greenhouse gas, air pollution, and floods). To protect environmental and human health, managers need to make informed decisions regarding urban forest management practices. Here we present the i-Tree suite of software tools (www.itreetools.org) developed by the USDA Forest Service and their cooperators. This software suite can help urban forest managers assess and manage the structure, function, and value of urban tree populations regardless of community size or technical capacity. i-Tree is a state-of-the-art, peer-reviewed Windows GUI- or Web-based software that is freely available, supported, and continuously refined by the USDA Forest Service and their cooperators. Two major features of i-Tree are 1) to analyze current canopy structures and identify potential planting spots, and 2) to estimate the environmental benefits provided by the trees, such as carbon storage and sequestration, energy conservation, air pollution removal, and storm water reduction. To cover diverse forest topologies, various tools were developed within the i-Tree suite: i-Tree Design for points (individual trees), i-Tree Streets for lines (street trees), and i-Tree Eco, Vue, and Canopy (in the order of complexity) for areas (community trees). Once the forest structure is identified with these tools, ecosystem services provided by trees can be estimated with common models and protocols, and reports in the form of texts, charts, and figures are then created for users. Since i-Tree was developed with a client/server architecture, nationwide data in the US such as location-related parameters, weather, streamflow, and air pollution data are stored in the server and retrieved to a user's computer at run-time. Freely available remote-sensed images (e.g. NLCD and Google maps) are also employed to estimate tree canopy characteristics. As the demand for i

  8. Estimation of shear strength parameters of lateritic soils using

    African Journals Online (AJOL)

    user

    ... a tool to estimate the ... modeling tools for the prediction of shear strength parameters for lateritic soils ... geotechnical analysis of the soils ... The back propagation learning algorithm is the most popular and ...

  9. A Monte Carlo based decision-support tool for assessing generation portfolios in future carbon constrained electricity industries

    International Nuclear Information System (INIS)

    Vithayasrichareon, Peerapat; MacGill, Iain F.

    2012-01-01

    This paper presents a novel decision-support tool for assessing future generation portfolios in an increasingly uncertain electricity industry. The tool combines optimal generation mix concepts with Monte Carlo simulation and portfolio analysis techniques to determine expected overall industry costs, associated cost uncertainty, and expected CO₂ emissions for different generation portfolio mixes. The tool can incorporate complex and correlated probability distributions for estimated future fossil-fuel costs, carbon prices, plant investment costs, and demand, including price elasticity impacts. The intent of this tool is to facilitate risk-weighted generation investment and associated policy decision-making given uncertainties facing the electricity industry. Applications of this tool are demonstrated through a case study of an electricity industry with coal, CCGT, and OCGT facing future uncertainties. Results highlight some significant generation investment challenges, including the impacts of uncertain and correlated carbon and fossil-fuel prices, the role of future demand changes in response to electricity prices, and the impact of construction cost uncertainties on capital-intensive generation. The tool can incorporate virtually any type of input probability distribution, and supports sophisticated risk assessments of different portfolios, including downside economic risks. It can also assess portfolios against multi-criterion objectives such as greenhouse emissions as well as overall industry costs. - Highlights: ► Presents a decision-support tool to assist generation investment and policy making under uncertainty. ► Generation portfolios are assessed based on their expected costs, risks, and CO₂ emissions. ► There is a tradeoff among the expected costs, risks, and CO₂ emissions of generation portfolios. ► Investment challenges include the economic impact of uncertainties and the effect of price elasticity. ► CO₂ emissions reduction depends on the mix of
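
    A toy Monte Carlo sketch in the spirit of the tool: correlated carbon- and fuel-price draws propagated to expected portfolio cost, cost risk, and emissions intensity. All shares, costs, and emission factors below are illustrative assumptions:

        # Monte Carlo assessment of a generation portfolio under price uncertainty.
        import numpy as np

        rng = np.random.default_rng(4)
        n_sims = 10_000

        # Correlated carbon price ($/tCO2) and gas-fuel cost ($/MWh): hypothetical.
        mean = [30.0, 55.0]
        cov = [[100.0, 30.0], [30.0, 64.0]]
        carbon, gas_cost = rng.multivariate_normal(mean, cov, n_sims).clip(min=0.0).T

        share = {"coal": 0.5, "ccgt": 0.4, "ocgt": 0.1}     # portfolio energy shares
        emis = {"coal": 0.90, "ccgt": 0.37, "ocgt": 0.55}   # tCO2/MWh (illustrative)
        fixed = {"coal": 45.0, "ccgt": 15.0, "ocgt": 20.0}  # non-fuel cost, $/MWh

        cost = sum(share[t] * (fixed[t] + emis[t] * carbon
                               + (gas_cost if t in ("ccgt", "ocgt") else 25.0))
                   for t in share)  # coal fuel cost taken as a flat 25 $/MWh
        co2 = sum(share[t] * emis[t] for t in share)

        print(f"expected cost: {cost.mean():.1f} $/MWh, cost risk (std): {cost.std():.1f}")
        print(f"emissions intensity: {co2:.2f} tCO2/MWh")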

  10. Artificial Neural Network Based State Estimators Integrated into Kalmtool

    DEFF Research Database (Denmark)

    Bayramoglu, Enis; Ravn, Ole; Poulsen, Niels Kjølstad

    2012-01-01

    In this paper we present a toolbox enabling easy evaluation and comparison of different filtering algorithms. The toolbox is called Kalmtool and is a set of MATLAB tools for state estimation of nonlinear systems. The toolbox now contains functions for Artificial Neural Network Based State Estimation as...

  11. A Fast Tool for Assessing the Power Performance of Large WEC arrays

    DEFF Research Database (Denmark)

    Ruiz, Pau Mercadé

    In the present work, a tool for computing wave energy converter array hydrodynamic forces and power performance is developed. The tool leads to a significant reduction in computation time compared with standard boundary element method based codes while keeping similar levels of accuracy. This makes it suitable for array layout optimization, where large numbers of simulations are required. Furthermore, the tool is developed within an open-source environment, Python 2.7, so that it is fully accessible to anyone willing to make use of it.

  12. Design-corrected variation by centre in mortality reduction in the ERSPC randomised prostate cancer screening trial.

    Science.gov (United States)

    Hakama, Matti; Moss, Sue M; Stenman, Ulf-Hakan; Roobol, Monique J; Zappa, Marco; Carlsson, Sigrid; Randazzo, Marco; Nelen, Vera; Hugosson, Jonas

    2017-06-01

    Objectives To calculate design-corrected estimates of the effect of screening on prostate cancer mortality by centre in the European Randomised Study of Screening for Prostate Cancer (ERSPC). Setting The ERSPC has shown a 21% reduction in prostate cancer mortality in men invited to screening, with follow-up truncated at 13 years. Centres either used pre-consent randomisation (effectiveness design) or post-consent randomisation (efficacy design). Methods In six centres (three effectiveness design, three efficacy design) with follow-up until the end of 2010, or a maximum of 13 years, the effect of screening was estimated as both effectiveness (mortality reduction in the target population) and efficacy (reduction in those actually screened). Results The overall crude prostate cancer mortality risk ratio in the intervention arm vs the control arm for the six centres was 0.79, ranging from a 14% increase to a 38% reduction. The risk ratio was 0.85 in centres with an effectiveness design and 0.73 in those with an efficacy design. After correcting for design, overall efficacy was 27%: 24% in pre-consent and 29% in post-consent centres, ranging between a 12% increase and a 52% reduction. Conclusion The estimated overall effect of screening in attenders (efficacy) was a 27% reduction in prostate cancer mortality at 13 years' follow-up. The variation in efficacy between centres was greater than the range in risk ratio without correction for design. The centre-specific variation in the mortality reduction could not be accounted for by the randomisation method.

  13. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
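
    The essence of E-CAT — replacing a complex model with a regression fit to its simulated outputs — can be sketched as follows, with synthetic stand-ins for the CGE runs (variable names and the functional form are hypothetical):

        # Fit a "reduced form" regression to synthetic simulation outputs.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(5)
        n_runs = 500

        # Threat characteristics and background conditions (explanatory variables).
        duration = rng.uniform(1, 30, n_runs)       # days of disruption
        severity = rng.uniform(0.1, 1.0, n_runs)    # fraction of capacity lost
        resilience = rng.uniform(0.0, 0.8, n_runs)  # share of losses recaptured

        # Stand-in for CGE model output: GDP loss in $B, with simulation noise.
        gdp_loss = 2.0 * duration * severity * (1 - resilience) + rng.normal(0, 1, n_runs)

        X = np.column_stack([duration, severity, resilience, duration * severity])
        reduced_form = LinearRegression().fit(X, gdp_loss)
        print("R^2 of reduced-form fit:", round(reduced_form.score(X, gdp_loss), 3))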

  14. Wind Noise Reduction using Non-negative Sparse Coding

    DEFF Research Database (Denmark)

    Schmidt, Mikkel N.; Larsen, Jan; Hsiao, Fu-Tien

    2007-01-01

    We introduce a new speaker-independent method for reducing wind noise in single-channel recordings of noisy speech. The method is based on non-negative sparse coding and relies on a wind noise dictionary which is estimated from an isolated noise recording. We estimate the parameters of the model ... and discuss their sensitivity. We then compare the algorithm with the classical spectral subtraction method and the Qualcomm-ICSI-OGI noise reduction method. We optimize the sound quality in terms of signal-to-noise ratio and provide results on a noisy speech recognition task.
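
    For context, a minimal sketch of the classical spectral subtraction baseline the paper compares against (not the non-negative sparse coding method itself), on a toy signal:

        # Spectral subtraction: remove an estimated noise magnitude spectrum.
        import numpy as np
        from scipy.signal import stft, istft

        fs = 16_000
        rng = np.random.default_rng(6)
        speech = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)  # toy "speech" signal
        noise = 0.5 * rng.standard_normal(fs)                  # stand-in for wind noise
        noisy = speech + noise

        f, t, Z = stft(noisy, fs=fs, nperseg=512)
        _, _, Zn = stft(noise, fs=fs, nperseg=512)          # isolated noise recording

        noise_mag = np.abs(Zn).mean(axis=1, keepdims=True)  # average noise spectrum
        clean_mag = np.maximum(np.abs(Z) - noise_mag, 0.0)  # subtract, floor at zero
        Z_hat = clean_mag * np.exp(1j * np.angle(Z))        # keep the noisy phase
        _, enhanced = istft(Z_hat, fs=fs, nperseg=512)
        print("output samples:", enhanced.shape[0])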

  15. Parameter importance and uncertainty in predicting runoff pesticide reduction with filter strips.

    Science.gov (United States)

    Muñoz-Carpena, Rafael; Fox, Garey A; Sabbagh, George J

    2010-01-01

    Vegetative filter strips (VFS) are an environmental management tool used to reduce sediment and pesticide transport in surface runoff. Numerical models of VFS such as the Vegetative Filter Strip Modeling System (VFSMOD-W) are capable of predicting runoff, sediment, and pesticide reduction and can be useful tools for understanding the effectiveness of VFS and the environmental conditions under which they may be ineffective. However, as part of the modeling process, it is critical to identify input factor importance and quantify uncertainty in predicted runoff, sediment, and pesticide reductions. This research used state-of-the-art global sensitivity and uncertainty analysis tools, a screening method (Morris) and a variance-based method (extended Fourier Amplitude Sensitivity Test), to evaluate VFSMOD-W under a range of field scenarios. The three VFS studies analyzed were conducted on silty clay loam and silt loam soils under uniform, sheet flow conditions and included atrazine, chlorpyrifos, cyanazine, metolachlor, pendimethalin, and terbuthylazine data. Saturated hydraulic conductivity was the most important input factor for predicting infiltration and runoff, explaining >75% of the total output variance for studies with smaller hydraulic loading rates (approximately 100-150 mm equivalent depths) and approximately 50% for the higher loading rate (approximately 280 mm equivalent depth). Important input factors for predicting sedimentation included hydraulic conductivity, average particle size, and the filter's Manning's roughness coefficient. Input factor importance for pesticide trapping was controlled by infiltration and, therefore, hydraulic conductivity. Global uncertainty analyses suggested a wide range of reductions for runoff (95% confidence intervals of 7-93%), sediment (84-100%), and pesticide (43-100%). Pesticide trapping probability distributions fell between runoff and sediment reduction distributions as a function of the pesticides' sorption. Seemingly
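
    A minimal sketch of Morris screening with the SALib package on a stand-in model (the factor names, bounds, and toy response below are assumptions, not the VFSMOD-W setup):

        # Morris elementary-effects screening to rank input-factor importance.
        import numpy as np
        from SALib.sample.morris import sample as morris_sample
        from SALib.analyze.morris import analyze as morris_analyze

        problem = {
            "num_vars": 3,
            "names": ["Ksat", "Manning_n", "particle_size"],  # assumed factor names
            "bounds": [[1.0, 100.0], [0.01, 0.4], [0.001, 0.05]],
        }

        X = morris_sample(problem, N=100, num_levels=4)

        def toy_model(x):
            ksat, n, d50 = x
            return 100 * (1 - np.exp(-0.02 * ksat)) + 20 * n + 5 * d50  # % reduction (toy)

        Y = np.apply_along_axis(toy_model, 1, X)
        Si = morris_analyze(problem, X, Y, num_levels=4)
        for name, mu_star in zip(Si["names"], Si["mu_star"]):
            print(f"{name}: mu* = {mu_star:.2f}")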

  16. Euler-Poincare Reduction of a Rigid Body Motion

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Kulczycki, P.

    2005-01-01

    If a mechanical system experiences symmetry, the Lagrangian becomes invariant under a certain group action. This property leads to substantial simplification of the description of movement. The standpoint in this article is a mechanical system affected by an external force of a control action. Assuming that the system possesses symmetry and the configuration manifold corresponds to a Lie group, the Euler-Poincare reduction breaks up the motion into separate equations of dynamics and kinematics. This becomes of particular interest for modeling, estimation and control of mechanical systems ... the well-known Euler-Poincare reduction to a rigid body motion with forcing.

  17. Appennino: A GIS Tool for Analyzing Wildlife Habitat Use

    Directory of Open Access Journals (Sweden)

    Marco Ferretti

    2012-01-01

    Full Text Available The aim of the study was to test Appennino, a tool used to evaluate the habitats of animals through compositional analysis. This free tool calculates an animal's habitat use within ESRI's ArcGIS platform and saves and exports the results of the comparative land-use analysis to other statistical software. The Visual Basic for Applications programming language was employed to prepare the ESRI ArcGIS 9.x utility. The tool was tested on a dataset of 546 pheasant positions obtained from a study carried out in Tuscany (Italy). The tool automatically gave the same results as those obtained by calculating the surfaces in ESRI ArcGIS, exporting the data from ArcGIS, and then using a commercial spreadsheet and/or statistical software to calculate the animal's habitat use, with a considerable reduction in time.

  18. Estimation of tool wear length in finish milling using a fuzzy inference algorithm

    Science.gov (United States)

    Ko, Tae Jo; Cho, Dong Woo

    1993-10-01

    The geometric accuracy and surface roughness are mainly affected by the flank wear at the minor cutting edge in finish machining. A fuzzy estimator obtained by a fuzzy inference algorithm with a max-min composition rule to evaluate the minor flank wear length in finish milling is introduced. The features sensitive to minor flank wear are extracted from the dispersion analysis of a time series AR model of the feed directional acceleration of the spindle housing. Linguistic rules for fuzzy estimation are constructed using these features, and then fuzzy inferences are carried out with test data sets under various cutting conditions. The proposed system turns out to be effective for estimating minor flank wear length, and its mean error is less than 12%.
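
    A minimal sketch of max-min fuzzy inference with a hypothetical rule relation (the paper's actual linguistic rules and membership functions are not reproduced here):

        # Max-min composition: mu_wear(j) = max_i min(mu_feature(i), R[i, j]).
        import numpy as np

        # Membership of the observed feature over 3 linguistic terms (low, mid, high).
        feature_mu = np.array([0.1, 0.7, 0.3])

        # Rule relation R[i, j]: degree to which feature term i implies wear level j.
        R = np.array([[0.9, 0.2, 0.0],
                      [0.3, 0.8, 0.4],
                      [0.0, 0.3, 0.9]])

        wear_mu = np.max(np.minimum(feature_mu[:, None], R), axis=0)
        print("wear-level memberships (low, mid, high):", wear_mu)

        # Defuzzify with a centroid over assumed wear lengths (mm) per level.
        levels_mm = np.array([0.05, 0.15, 0.30])
        print(f"estimated wear length: {np.dot(wear_mu, levels_mm) / wear_mu.sum():.3f} mm")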

  19. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  20. Cost-effectiveness analysis of salt reduction policies to reduce coronary heart disease in Syria, 2010-2020.

    Science.gov (United States)

    Wilcox, Meredith L; Mason, Helen; Fouad, Fouad M; Rastam, Samer; al Ali, Radwan; Page, Timothy F; Capewell, Simon; O'Flaherty, Martin; Maziak, Wasim

    2015-01-01

    This study presents a cost-effectiveness analysis of salt reduction policies to lower coronary heart disease (CHD) in Syria. Costs and benefits of a health promotion campaign about salt reduction (HP); labeling of salt content on packaged foods (L); reformulation of salt content within packaged foods (R); and combinations of the three were estimated over a 10-year time frame. Policies were deemed cost-effective if their cost-effectiveness ratios were below the region's established threshold of $38,997 purchasing power parity (PPP). Sensitivity analysis was conducted to account for the uncertainty in the reduction of salt intake. HP, L, and R+HP+L were cost-saving using the best estimates. The remaining policies were cost-effective (CERs: R=$5,453 PPP/LYG; R+HP=$2,201 PPP/LYG; R+L=$2,125 PPP/LYG). R+HP+L provided the largest benefit with net savings using the best and maximum estimates, while R+L was cost-effective with the lowest marginal cost using the minimum estimates. This study demonstrated that all policies were cost-saving or cost-effective, with the combination of reformulation plus labeling and a comprehensive policy involving all three approaches being the most promising salt reduction strategies to reduce CHD mortality in Syria.
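
    A minimal sketch of the CER bookkeeping with hypothetical numbers (the study's actual costs and life-years gained are not reproduced here):

        # CER = net policy cost / life-years gained, compared to a threshold.
        THRESHOLD_PPP_PER_LYG = 38_997  # regional cost-effectiveness threshold ($PPP)

        def cer(net_cost_ppp: float, life_years_gained: float) -> float:
            """Negative values indicate a cost-saving policy."""
            return net_cost_ppp / life_years_gained

        policies = {  # hypothetical 10-year net costs ($PPP) and LYG
            "R":      (60_000_000, 11_000),
            "R+HP":   (25_000_000, 11_400),
            "R+HP+L": (-5_000_000, 12_000),  # negative cost => cost-saving
        }
        for name, (cost, lyg) in policies.items():
            ratio = cer(cost, lyg)
            verdict = "cost-saving" if ratio < 0 else (
                "cost-effective" if ratio <= THRESHOLD_PPP_PER_LYG else "not cost-effective")
            print(f"{name}: CER = {ratio:,.0f} $PPP/LYG -> {verdict}")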