WorldWideScience

Sample records for greater computational cost

  1. Towards Greater Harmonisation of Decommissioning Cost Estimates

    International Nuclear Information System (INIS)

    O'Sullivan, Patrick; Laraia, Michele; LaGuardia, Thomas S.

    2010-01-01

    The NEA Decommissioning Cost Estimation Group (DCEG), in collaboration with the IAEA Waste Technology Section and the EC Directorate-General for Energy and Transport, has recently studied cost estimation practices in 12 countries - Belgium, Canada, France, Germany, Italy, Japan, the Netherlands, Slovakia, Spain, Sweden, the United Kingdom and the United States. Its findings are to be published in an OECD/NEA report entitled Cost Estimation for Decommissioning: An International Overview of Cost Elements, Estimation Practices and Reporting Requirements. This booklet highlights the findings contained in the full report. (authors)

  2. External costs of nuclear: Greater or less than the alternatives?

    International Nuclear Information System (INIS)

    Rabl, Ari; Rabl, Veronika A.

    2013-01-01

    Since Fukushima many are calling for a shutdown of nuclear power plants. To see whether such a shutdown would reduce the risks for health and environment, the external costs of nuclear electricity are compared with alternatives that could replace it. The frequency of catastrophic nuclear accidents is based on the historical record, about one in 25 years for the plants built to date, an order of magnitude higher than the safety goals of the U.S. Nuclear Regulatory Commission. Impacts similar to Chernobyl and Fukushima are assumed to estimate the cost. A detailed comparison is presented with wind as the alternative with the lowest external cost. The variability of wind necessitates augmentation by other sources, primarily fossil fuels, because storage at the required scale is too expensive in most regions. The external costs of natural gas combined cycle are taken as 0.6 €cent/kWh due to health effects of air pollution and 1.25 €cent/kWh due to greenhouse gases (at 25 €/t CO2-eq) for the central estimate, but a wide range of different parameters is also considered, both for nuclear and for the alternatives. Although the central estimate of external costs of the wind-based alternative is higher than that of nuclear, the uncertainty ranges overlap. - Highlights: ► The external costs of nuclear electricity are compared with the alternatives. ► Frequency and cost of nuclear accidents based on Chernobyl and Fukushima. ► Detailed comparison with wind as the alternative with the lowest external costs. ► High external cost of wind because of natural gas backup (storage too limited). ► External costs of wind higher than nuclear, but uncertainty ranges overlap.
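
    The gas figures above invite a quick back-of-the-envelope comparison. A minimal sketch in Python, assuming a hypothetical backup fraction and a hypothetical nuclear external-cost figure (only the two gas numbers come from the abstract):

      # Only GAS_HEALTH and GAS_GHG are taken from the abstract; the backup
      # share and the nuclear figure are illustrative assumptions.
      GAS_HEALTH = 0.60     # eurocent/kWh, air-pollution health effects
      GAS_GHG = 1.25        # eurocent/kWh, greenhouse gases at 25 EUR/t CO2-eq
      BACKUP_SHARE = 0.5    # assumed fraction of wind output firmed by gas
      NUCLEAR_EXT = 0.3     # assumed external cost of nuclear, eurocent/kWh

      wind_mix_ext = BACKUP_SHARE * (GAS_HEALTH + GAS_GHG)  # wind itself taken as ~0
      print(f"wind+gas mix: {wind_mix_ext:.2f} c/kWh vs nuclear: {NUCLEAR_EXT:.2f} c/kWh")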

  3. A cost modelling system for cloud computing

    OpenAIRE

    Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh

    2014-01-01

    An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while also reducing the cost of doing business. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible and reliable as well as cost-effective. Cloud computing has made the hosting and deployment of computing resources cheaper and easier, with no up-front charges but pay-per-use flexible payme...

  4. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    Science.gov (United States)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of exploiting the advantages of available high-performance computing (HPC) platforms to model fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the cluster's computational resources when a greater number of computational cores is used. Simulation results indicate that if the number of cores used is not a multiple of the total number of cores per cluster node, there are allocation strategies which provide more efficient calculations.

  5. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  6. Incremental ALARA cost/benefit computer analysis

    International Nuclear Information System (INIS)

    Hamby, P.

    1987-01-01

    Commonwealth Edison Company has developed and is testing an enhanced Fortran computer program to be used for cost/benefit analysis of radiation reduction projects at its six nuclear power facilities and corporate technical support groups. This paper describes a macro-driven IBM mainframe program comprising two different types of analyses: an abbreviated program with fixed costs and base values, and an extended engineering version for a detailed, more thorough and time-consuming approach. The extended engineering version breaks radiation exposure costs down into two components: health-related costs and replacement labor costs. According to user input, the program automatically adjusts these two cost components and applies the derived values to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment cost. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations
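
    The two-component exposure cost described above lends itself to a compact sketch. All numbers and names below (ALPHA, the labor rate, the example projects) are illustrative assumptions, not Commonwealth Edison's values:

      ALPHA = 2000.0          # assumed health-related cost per person-rem ($)
      REPL_LABOR_RATE = 50.0  # assumed replacement-labor cost per hour ($)

      def exposure_cost(person_rem, repl_hours):
          # Exposure cost = health-related component + replacement labor component.
          return ALPHA * person_rem + REPL_LABOR_RATE * repl_hours

      def net_benefit(dose_saved_rem, repl_hours_saved, project_cost):
          # Positive value: the dose-reduction project pays for itself.
          return exposure_cost(dose_saved_rem, repl_hours_saved) - project_cost

      for name, rem, hours, cost in [("temporary shielding", 12.0, 300.0, 18000.0),
                                     ("remote tooling", 20.0, 800.0, 65000.0)]:
          print(f"{name}: net benefit ${net_benefit(rem, hours, cost):,.0f}")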

  7. Computer Software for Life Cycle Cost.

    Science.gov (United States)

    1987-04-01

    ...obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually

  8. The Hidden Cost of Buying a Computer.

    Science.gov (United States)

    Johnson, Michael

    1983-01-01

    In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)

  9. Clean air benefits and costs in the GVRD [Greater Vancouver Regional District

    International Nuclear Information System (INIS)

    Gislason, G.; Martin, J.; Williams, D.; Caton, B.; Rich, J.; Rojak, S.; Robinson, J.; Stuermer, A. von

    1994-01-01

    Air pollution is a major concern in the Greater Vancouver Regional District in British Columbia. An analysis was conducted to assess the costs and benefits of an innovative plan to reduce the emissions of five primary pollutants in the GVRD: nitrogen oxides (NOx), sulfur oxides (SOx), volatile organic compounds (VOCs), particulates, and CO. The study adopts a damage function approach in which the benefits of reduced emissions are given by the averted damages to human health, crops, and so on. Under a base case scenario, motor vehicle emission controls and additional measures proposed in the region's air quality management plan (AQMP) are projected to lead to emission reductions of 873,000 tonnes in the GVRD by the year 2020, compared to the emission level projected without intervention. The AQMP is projected to avert over its life some 2,800 premature deaths, 33,000 emergency room visits, 13 million restricted activity days, and 5 million symptoms. Crop losses due to ozone are projected to decrease by 1-4%/y over the next several decades due to the AQMP. Damage averted to materials and property per tonne of pollutant reduced ranges from $30 for VOC to $180 for particulates. Under base-case conservative assumptions, the AQMP generates $5.4 billion in benefits and $3.8 billion in costs, nearly 2/3 of which are paid by the industrial and commercial sectors. 1 tab

  10. Cost/Benefit Analysis of Leasing Versus Purchasing Computers

    National Research Council Canada - National Science Library

    Arceneaux, Alan

    1997-01-01

    .... In constructing this model, several factors were considered, including: The purchase cost of computer equipment, annual lease payments, depreciation costs, the opportunity cost of purchasing, tax revenue implications and various leasing terms...

  11. Low cost highly available digital control computer

    International Nuclear Information System (INIS)

    Silvers, M.W.

    1986-01-01

    When designing digital controllers for critical plant control, it is important to provide several features. Among these are reliability, availability, maintainability, environmental protection, and low cost. An examination of several applications has led to a design that can be produced for approximately $20,000 (1000 control points). This design is compatible with modern concepts in distributed and hierarchical control. The canonical controller element is a dual-redundant self-checking computer that communicates with a cross-strapped, electrically isolated input/output system. The input/output subsystem comprises multiple intelligent input/output cards. These cards accept commands from the primary processor, which are validated, executed, and acknowledged. Each card may be hot-replaced to facilitate sparing. The implementation of the dual-redundant computer architecture is discussed. Called the FS-86, this computer can be used for a variety of applications. It has most recently found application in the upgrade of San Francisco's Bay Area Rapid Transit (BART) train control currently in progress and has been proposed for feedwater control in a boiling water reactor

  12. Benefit-cost-risk analysis of alternatives for greater-confinement disposal of radioactive waste

    International Nuclear Information System (INIS)

    Gilbert, T.L.; Luner, C.; Peterson, J.M.

    1983-01-01

    Seven alternatives are included in the analysis: near-surface disposal; improved waste form; below-ground engineered structure; augered shaft; shale fracturing; shallow geologic repository; and high-level waste repository. These alternatives are representative generic facilities that span the range from low-level waste disposal practice to high-level waste disposal practice, tentatively ordered according to an expected increasing cost and/or effectiveness of confinement. They have been chosen to enable an assessment of the degree of confinement that represents an appropriate balance between public health and safety requirements and costs rather than identification of a specific preferred facility design. The objective of the analysis is to provide a comparative ranking of the alternatives on the basis of benefit-cost-risk considerations

  13. Greater healthcare utilization and costs among Black persons compared to White persons with aphasia in the North Carolina stroke belt.

    Science.gov (United States)

    Ellis, Charles; Hardy, Rose Y; Lindrooth, Richard C

    2017-05-15

    To examine racial differences in healthcare utilization and costs for persons with aphasia (PWA) being treated in acute care hospitals in North Carolina (NC). NC Healthcare Cost and Utilization Project State Inpatient Database (HCUP-SID) data from 2011-2012 were analyzed to examine healthcare utilization and costs of care for stroke patients with aphasia. Analyses emphasized length of stay, charges and cost of general hospital services. Generalized linear models (GLM) were constructed to determine the impact of demographic characteristics, stroke/illness severity, and observed hospital characteristics on utilization and costs. Hospital fixed effects were included to yield within-hospital estimates of disparities. GLM models demonstrated that Blacks with aphasia experienced 1.9 days longer lengths of stay compared to Whites with aphasia after controlling for demographic characteristics, 1.4 days controlling for stroke/illness severity, 1.2 days controlling for observed hospital characteristics, and ~1 extra day controlling for unobserved hospital characteristics. Similarly, Blacks accrued ~$2047 greater total costs compared to Whites after controlling for demographic characteristics, $1659 controlling for stroke/illness severity, $1338 controlling for observed hospital characteristics, and ~$1311 greater total costs after controlling for unobserved hospital characteristics. In the acute hospital setting, Blacks with aphasia utilize greater hospital services during longer hospitalizations and at substantially higher costs in the state of NC. A substantial portion of the adjusted difference was related to the hospital treating the patient. However, even after controlling for the hospital, the differences remained clinically and statistically significant. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Cost-benefit analysis for invasive species control: the case of greater Canada goose Branta canadensis in Flanders (northern Belgium)

    Directory of Open Access Journals (Sweden)

    Nikolaas Reyns

    2018-01-01

    Background Sound decisions on control actions for established invasive alien species (IAS) require information on ecological as well as socio-economic impact of the species and of its management. Cost-benefit analysis provides part of this information, yet has received relatively little attention in the scientific literature on IAS. Methods We apply a bio-economic model in a cost-benefit analysis framework to greater Canada goose Branta canadensis, an IAS with documented social, economic and ecological impacts in Flanders (northern Belgium). We compared a business as usual (BAU) scenario which involved non-coordinated hunting and egg destruction with an enhanced scenario based on a continuation of these activities but supplemented with coordinated capture of moulting birds. To assess population growth under the BAU scenario we fitted a logistic growth model to the observed pre-moult capture population. Projected damage costs included water eutrophication and damage to cultivated grasslands and were calculated for all scenarios. Management costs of the moult captures were based on a representative average of the actual cost of planning and executing moult captures. Results Comparing the scenarios with different capture rates, different costs for eutrophication and various discount rates, showed avoided damage costs were in the range of 21.15 M€ to 45.82 M€ under the moult capture scenario. The lowest value for the avoided costs applied to the scenario where we lowered the capture rate by 10%. The highest value occurred in the scenario where we lowered the real discount rate from 4% to 2.5%. Discussion The reduction in damage costs always outweighed the additional management costs of moult captures. Therefore, additional coordinated moult captures could be applied to limit the negative economic impact of greater Canada goose at a regional scale. We further discuss the strengths and weaknesses of our approach and its potential application to other IAS.

  15. Cost-benefit analysis for invasive species control: the case of greater Canada goose Branta canadensis in Flanders (northern Belgium)

    Science.gov (United States)

    Casaer, Jim; De Smet, Lieven; Devos, Koen; Huysentruyt, Frank; Robertson, Peter A.; Verbeke, Tom

    2018-01-01

    Background Sound decisions on control actions for established invasive alien species (IAS) require information on ecological as well as socio-economic impact of the species and of its management. Cost-benefit analysis provides part of this information, yet has received relatively little attention in the scientific literature on IAS. Methods We apply a bio-economic model in a cost-benefit analysis framework to greater Canada goose Branta canadensis, an IAS with documented social, economic and ecological impacts in Flanders (northern Belgium). We compared a business as usual (BAU) scenario which involved non-coordinated hunting and egg destruction with an enhanced scenario based on a continuation of these activities but supplemented with coordinated capture of moulting birds. To assess population growth under the BAU scenario we fitted a logistic growth model to the observed pre-moult capture population. Projected damage costs included water eutrophication and damage to cultivated grasslands and were calculated for all scenarios. Management costs of the moult captures were based on a representative average of the actual cost of planning and executing moult captures. Results Comparing the scenarios with different capture rates, different costs for eutrophication and various discount rates, showed avoided damage costs were in the range of 21.15 M€ to 45.82 M€ under the moult capture scenario. The lowest value for the avoided costs applied to the scenario where we lowered the capture rate by 10%. The highest value occurred in the scenario where we lowered the real discount rate from 4% to 2.5%. Discussion The reduction in damage costs always outweighed the additional management costs of moult captures. Therefore, additional coordinated moult captures could be applied to limit the negative economic impact of greater Canada goose at a regional scale. We further discuss the strengths and weaknesses of our approach and its potential application to other IAS.
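
    The scenario comparison in these two records reduces to a small bio-economic simulation: logistic growth with and without moult captures, and damages discounted to present value. A sketch under invented parameters (growth rate, carrying capacity, damage per bird and capture costs are all placeholders; only the 4% discount rate and the scenario structure follow the abstract):

      import numpy as np

      r, K = 0.25, 40000.0      # assumed intrinsic growth rate and carrying capacity
      years, discount = 25, 0.04
      damage_per_bird = 10.0    # assumed damage cost, EUR per bird per year
      capture_rate = 0.3        # assumed fraction removed by coordinated moult captures
      capture_cost = 50000.0    # assumed annual cost of the captures, EUR

      def project(pop0, removal):
          n, out = pop0, []
          for _ in range(years):
              n = max(n + r * n * (1 - n / K) - removal * n, 0.0)
              out.append(n)
          return np.array(out)

      disc = 1.0 / (1.0 + discount) ** np.arange(1, years + 1)
      bau, ctl = project(10000.0, 0.0), project(10000.0, capture_rate)
      avoided = np.sum((bau - ctl) * damage_per_bird * disc)   # avoided damages (PV)
      extra = np.sum(capture_cost * disc)                      # added management cost (PV)
      print(f"avoided damages {avoided:,.0f} EUR vs extra management cost {extra:,.0f} EUR")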

  16. Waste Management Facilities Cost Information report for Greater-Than-Class C and DOE equivalent special case waste

    Energy Technology Data Exchange (ETDEWEB)

    Feizollahi, F.; Shropshire, D.

    1993-07-01

    This Waste Management Facility Cost Information (WMFCI) report for Greater-Than-Class C low-level waste (GTCC LLW) and DOE equivalent special case waste contains preconceptual designs and planning level life-cycle cost (PLCC) estimates for treatment, storage, and disposal facilities needed for management of GTCC LLW and DOE equivalent waste. The report contains information on 16 facilities (referred to as cost modules). These facilities are treatment facility front-end and back-end support functions (administration support, and receiving, preparation, and shipping cost modules); seven treatment concepts (incineration, metal melting, shredding/compaction, solidification, vitrification, metal sizing and decontamination, and wet/air oxidation cost modules); two storage concepts (enclosed vault and silo); disposal facility front-end functions (disposal receiving and inspection cost module); and four disposal concepts (shallow-land, engineered shallow-land, intermediate depth, and deep geological cost modules). Data in this report allow the user to develop PLCC estimates for various waste management options. A procedure to guide the U.S. Department of Energy (DOE) and its contractor personnel in the use of estimating data is also included in this report.

  17. Waste Management Facilities Cost Information report for Greater-Than-Class C and DOE equivalent special case waste

    International Nuclear Information System (INIS)

    Feizollahi, F.; Shropshire, D.

    1993-07-01

    This Waste Management Facility Cost Information (WMFCI) report for Greater-Than-Class C low-level waste (GTCC LLW) and DOE equivalent special case waste contains preconceptual designs and planning level life-cycle cost (PLCC) estimates for treatment, storage, and disposal facilities needed for management of GTCC LLW and DOE equivalent waste. The report contains information on 16 facilities (referred to as cost modules). These facilities are treatment facility front-end and back-end support functions (administration support, and receiving, preparation, and shipping cost modules); seven treatment concepts (incineration, metal melting, shredding/compaction, solidification, vitrification, metal sizing and decontamination, and wet/air oxidation cost modules); two storage concepts (enclosed vault and silo); disposal facility front-end functions (disposal receiving and inspection cost module); and four disposal concepts (shallow-land, engineered shallow-land, intermediate depth, and deep geological cost modules). Data in this report allow the user to develop PLCC estimates for various waste management options. A procedure to guide the U.S. Department of Energy (DOE) and its contractor personnel in the use of estimating data is also included in this report

  18. Cloud Computing and Information Technology Resource Cost Management for SMEs

    DEFF Research Database (Denmark)

    Kuada, Eric; Adanu, Kwame; Olesen, Henning

    2013-01-01

    This paper analyzes the decision-making problem confronting SMEs considering the adoption of cloud computing as an alternative to in-house computing services provision. The economics of choosing between in-house computing and a cloud alternative is analyzed by comparing the total economic costs...... in determining the relative value of cloud computing....

  19. Synergy effects of fluoxetine and variability in temperature lead to proportionally greater fitness costs in Daphnia: A multigenerational test.

    Science.gov (United States)

    Barbosa, Miguel; Inocentes, Núrya; Soares, Amadeu M V M; Oliveira, Miguel

    2017-12-01

    Increased variability in water temperature is predicted to impose disproportionally greater fitness costs than a mean increase in temperature. Additionally, water contaminants are currently a major source of human-induced stress likely to produce fitness costs. Global change models forecast an increase in these two human-induced stressors. Yet, in spite of the growing interest in understanding how organisms respond to global change, the joint fitness effects of water pollution and increased variability in temperature remain unclear. Here, using a multigenerational design, we test the hypothesis that exposure to high concentrations of fluoxetine, a human medicine commonly found in freshwater systems, causes increased lifetime fitness costs when associated with increased variability in temperature. Although fluoxetine and variability in temperature elicited some fitness cost when tested alone, when both stressors acted together the costs were disproportionally greater. The combined effect of fluoxetine and variability in temperature led to a reduction of 37% in lifetime reproductive success and a 17.9% decrease in population growth rate. Interestingly, fluoxetine and variability in temperature had no effect on the probability of survival. Freshwater systems are among the most imperilled ecosystems, often exposed to multiple human-induced stressors. Our results indicate that organisms face greater fitness risk when exposed to multiple stressors at the same time than when each stressor acts alone. Our study highlights the importance of using a multigenerational approach to fully understand individual environmental tolerance and its responses to a global change scenario in aquatic systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Limited risk assessment and some cost/benefit considerations for greater confinement disposal compared to shallow land burial

    International Nuclear Information System (INIS)

    Hunter, P.H.; Lester, D.H.; Robertson, L.D.; Spaeth, M.E.; Stoddard, J.A.; Dickman, P.T.

    1984-09-01

    A limited risk assessment and some cost/benefit considerations of greater confinement disposal (GCD) compared to shallow land burial (SLB) are presented. This study is limited to an analysis of the postclosure phase of hypothetical GCD and SLB facilities. Selected release scenarios are used which bound the range of risks to a maximally exposed individual and a hypothetical population. Based on the scenario assessments, GCD had a significant risk advantage over SLB for normal exposure pathways at both humid and arid sites, particularly for the human intrusion scenario. Since GCD costs are somewhat higher than SLB, it is necessary to weigh the higher costs of GCD against the higher risks of SLB. In this regard, GCD should be pursued as an alternative to SLB for certain types of low-level waste, and as an alternative to processing for wastes requiring improved stabilization or higher integrity packaging to be compatible with SLB. There are two reasons for this conclusion. First, GCD might diminish public apprehension regarding the disposal of wastes perceived to be too hazardous for SLB. Second, GCD may be a relatively cost-effective alternative to various stabilization and packaging schemes required to meet 10 CFR 61 near-surface requirements as well as being a cost-effective alternative to deep geologic disposal. Radionuclide transport through the biosphere and resultant dose consequences were determined using the RADTRAN radionuclide transport code. 19 references, 4 figures, 5 tables

  1. Fixed-point image orthorectification algorithms for reduced computational cost

    Science.gov (United States)

    French, Joseph Clinton

    Imaging systems have been applied to many new applications in recent years. With the advent of low-cost, low-power focal planes and more powerful, lower cost computers, remote sensing applications have become more widespread. Many of these applications require some form of geolocation, especially when relative distances are desired. However, when greater global positional accuracy is needed, orthorectification becomes necessary. Orthorectification is the process of projecting an image onto a Digital Elevation Map (DEM), which removes terrain distortions and corrects the perspective distortion by changing the viewing angle to be perpendicular to the projection plane. Orthorectification is used in disaster tracking, landscape management, wildlife monitoring and many other applications. However, orthorectification is a computationally expensive process due to floating point operations and divisions in the algorithm. To reduce the computational cost of on-board processing, two novel algorithm modifications are proposed. One modification is projection utilizing fixed-point arithmetic. Fixed-point arithmetic removes the floating point operations and reduces the processing time by operating only on integers. The second modification is replacement of the division inherent in projection with a multiplication by the inverse. The inverse would have to be computed iteratively, so it is instead replaced with a linear approximation. As a result of these modifications, the processing time of projection is reduced by a factor of 1.3x with an average pixel position error of 0.2% of a pixel size for 128-bit integer processing, and by over 4x with an average pixel position error of less than 13% of a pixel size for 64-bit integer processing. A secondary inverse function approximation is also developed that replaces the linear approximation with a quadratic. The quadratic approximation produces a more accurate approximation of the inverse, allowing for an integer multiplication calculation
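
    The two modifications can be made concrete with a small integer-only sketch: Q16 fixed-point arithmetic plus a linear approximation of the reciprocal, so that the perspective divide becomes a multiply. The interval and coefficients below are illustrative choices, not those of the cited work:

      FRAC = 16                      # Q16 fixed point: value = integer / 2**16
      ONE = 1 << FRAC

      def to_fx(x): return int(round(x * ONE))
      def from_fx(x): return x / ONE
      def mul_fx(a, b): return (a * b) >> FRAC

      # Linear approximation of 1/w on an assumed interval w in [1, 2],
      # matched at the endpoints: 1/w ~= 1 - 0.5*(w - 1).
      A, B = to_fx(1.0), to_fx(0.5)

      def inv_fx(w_fx):
          return A - mul_fx(B, w_fx - ONE)

      # Perspective divide z = x / w with no division instruction. The error
      # peaks mid-interval; a quadratic fit (as in the abstract) tightens it.
      x, w = 1.25, 1.6
      z = from_fx(mul_fx(to_fx(x), inv_fx(to_fx(w))))
      print(f"approx {z:.4f} vs exact {x / w:.4f}")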

  2. Greater power and computational efficiency for kernel-based association testing of sets of genetic variants.

    Science.gov (United States)

    Lippert, Christoph; Xiang, Jing; Horta, Danilo; Widmer, Christian; Kadie, Carl; Heckerman, David; Listgarten, Jennifer

    2014-11-15

    Set-based variance component tests have been identified as a way to increase power in association studies by aggregating weak individual effects. However, the choice of test statistic has been largely ignored, even though it may play an important role in obtaining optimal power. We compared a standard statistical test - a score test - with a recently developed likelihood ratio (LR) test. Further, when correction for hidden structure is needed, or gene-gene interactions are sought, state-of-the-art algorithms for both the score and LR tests can be computationally impractical. Thus we develop new computationally efficient methods. After reviewing theoretical differences in performance between the score and LR tests, we find empirically on real data that the LR test generally has more power. In particular, on 15 of 17 real datasets, the LR test yielded at least as many associations as the score test - up to 23 more associations - whereas the score test yielded at most one more association than the LR test in the two remaining datasets. On synthetic data, we find that the LR test yielded up to 12% more associations, consistent with our results on real data, but also observe a regime of extremely small signal where the score test yielded up to 25% more associations than the LR test, consistent with theory. Finally, our computational speedups now enable (i) efficient LR testing when the background kernel is full rank, and (ii) efficient score testing when the background kernel changes with each test, as for gene-gene interaction tests. The latter yielded a factor of 2000 speedup on a cohort of size 13 500. Software available at http://research.microsoft.com/en-us/um/redmond/projects/MSCompBio/Fastlmm/. heckerma@microsoft.com. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
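
    To make the object of the score-versus-LR comparison concrete, here is a generic kernel-based set-test statistic (a SKAT-style score form, Q = r'Kr with a linear kernel K = GG') on synthetic data; this is a textbook sketch, not the FaST-LMM implementation from the paper:

      import numpy as np

      rng = np.random.default_rng(0)
      n, m = 500, 30                     # samples, variants in the tested set
      G = rng.integers(0, 3, size=(n, m)).astype(float)   # toy genotype matrix
      G = (G - G.mean(0)) / (G.std(0) + 1e-12)

      beta = np.zeros(m)
      beta[:3] = 0.15                    # weak effects the set test aggregates
      y = G @ beta + rng.normal(size=n)

      r = y - y.mean()                   # residuals under the null (intercept only)
      K = G @ G.T                        # linear kernel over the variant set
      Q = float(r @ K @ r)               # score-type statistic; its null distribution
      print(f"Q = {Q:.1f}")              # is a mixture of chi-squares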

  3. The ability of land owners and their cooperatives to leverage payments greater than opportunity costs from conservation contracts.

    Science.gov (United States)

    Lennox, Gareth D; Armsworth, Paul R

    2013-06-01

    In negotiations over land-right acquisitions, landowners have an informational advantage over conservation groups because they know more about the opportunity costs of conservation measures on their sites. This advantage creates the possibility that landowners will demand payments greater than the required minimum, where this minimum required payment is known as the landowner's willingness to accept (WTA). However, in recent studies of conservation costs, researchers have assumed landowners will accept conservation with minimum payments. We investigated the ability of landowners to demand payments above their WTA when a conservation group has identified multiple sites for protection. First, we estimated the maximum payment landowners could potentially demand, which is set when groups of landowners act as a cooperative. Next, through the simulation of conservation auctions, we explored the amount of money above landowners' WTA (i.e., surplus) that conservation groups could cede to secure conservation agreements, again investigating the influence of landowner cooperatives. The simulations showed the informational advantage landowners held could make conservation investments up to 42% more expensive than suggested by the site WTAs. Moreover, all auctions resulted in landowners obtaining payments greater than their WTA; thus, it may be unrealistic to assume landowners will accept conservation contracts with minimum payments. Of particular significance for species conservation, conservation objectives focused on overall species richness, which therefore recognize site complementarity, create an incentive for landowners to form cooperatives to capture surplus. On the contrary, objectives in which sites are substitutes, such as the maximization of species occurrences, create a disincentive for cooperative formation.

  4. Computer tomography: a cost-saving examination?

    International Nuclear Information System (INIS)

    Barneveld Binkhuysen, F.H.; Puijlaert, C.B.A.J.

    1987-01-01

    The research concerns the influence of the body computer tomograph (BCT) on efficiency in radiology and in the hospital as a whole in The Netherlands. Hospitals with CT are compared with hospitals without CT. In radiology the substitution effect is investigated, using the number of radiological performances per clinical patient as a parameter. This parameter proves to decrease in hospitals with a CT, in contrast to hospitals without a CT. The often-expressed opinion that the CT should specifically perform complementary examinations appears incorrect. Efficiency in the hospital as a whole is related to the average hospital in-patient stay. The average hospital in-patient stay proves to be shorter in hospitals with a CT than in those without a CT. The CT has turned out to be a very effective expedient which, unfortunately, is being used inefficiently in The Netherlands owing to limited installation. 17 refs.; 6 figs.; 5 tabs

  5. Cloud Computing-An Ultimate Technique to Minimize Computing cost for Developing Countries

    OpenAIRE

    Narendra Kumar; Shikha Jain

    2012-01-01

    The presented paper deals with how remotely managed computing and IT resources can be beneficial in developing countries like India and other countries of the Asian subcontinent. This paper not only defines the architectures and functionalities of cloud computing but also argues strongly that cloud computing is currently in demand for achieving organizational and personal IT support at very minimal cost with high flexibility. The power of the cloud can be used to reduce the cost of IT - r...

  6. Software Requirements for a System to Compute Mean Failure Cost

    Energy Technology Data Exchange (ETDEWEB)

    Aissa, Anis Ben [University of Tunis, Belvedere, Tunisia]; Abercrombie, Robert K [ORNL]; Sheldon, Frederick T [ORNL]; Mili, Ali [New Jersey Institute of Technology]

    2010-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to incur as a result of security breakdowns. We also demonstrated this infrastructure through the results of security breakdowns for an e-commerce case. In this paper, we illustrate this infrastructure by an application that supports the computation of the Mean Failure Cost (MFC) for each stakeholder.
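
    The MFC chain itself, as formulated in the authors' related publications, is a product of stakes and probability matrices: MFC = ST · DP · IM · PT. A minimal sketch with invented placeholder values (two stakeholders, two requirements, three components, two threats):

      import numpy as np

      ST = np.array([[900.0,  50.0],     # stakes: stakeholder x requirement ($/failure)
                     [100.0, 400.0]])
      DP = np.array([[0.6, 0.2, 0.0],    # P(requirement fails | component fails)
                     [0.1, 0.5, 0.3]])
      IM = np.array([[0.4, 0.0],         # P(component fails | threat materializes)
                     [0.2, 0.3],
                     [0.0, 0.5]])
      PT = np.array([0.01, 0.05])        # threat emergence probabilities

      MFC = ST @ DP @ IM @ PT            # expected loss per stakeholder
      for who, cost in zip(["customer", "provider"], MFC):
          print(f"{who}: ${cost:,.2f} expected loss per unit time")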

  7. Development of computer program for estimating decommissioning cost - 59037

    International Nuclear Information System (INIS)

    Kim, Hak-Soo; Park, Jong-Kil

    2012-01-01

    The programs for estimating decommissioning cost have been developed for many different purposes and applications. Estimating decommissioning cost requires a large amount of data, such as unit cost factors, plant area and its inventory, waste treatment, etc. This makes it difficult to use manual calculation or typical spreadsheet software such as Microsoft Excel. The cost estimation for eventual decommissioning of nuclear power plants is a prerequisite for safe, timely and cost-effective decommissioning. To estimate the decommissioning cost more accurately and systematically, KHNP, Korea Hydro and Nuclear Power Co. Ltd, developed a decommissioning cost estimating computer program called 'DeCAT-Pro', which stands for Decommissioning Cost Assessment Tool - Professional. (Hereinafter called 'DeCAT') This program allows users to easily assess the decommissioning cost with various decommissioning options. Also, this program provides detailed reporting of decommissioning funding requirements as well as detailed project schedules, cash flow, staffing plans and levels, and waste volumes by waste classifications and types. KHNP is planning to implement functions for estimating the plant inventory using 3-D technology and for classifying the conditions of radwaste disposal and transportation automatically. (authors)

  8. An approximate fractional Gaussian noise model with computational cost

    KAUST Repository

    Sørbye, Sigrunn H.; Myrvoll-Nilsen, Eirik; Rue, Haavard

    2017-01-01

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood

  9. Cost-effectiveness of PET and PET/Computed Tomography

    DEFF Research Database (Denmark)

    Gerke, Oke; Hermansson, Ronnie; Hess, Søren

    2015-01-01

    measure by means of incremental cost-effectiveness ratios when considering the replacement of the standard regimen by a new diagnostic procedure. This article discusses economic assessments of PET and PET/computed tomography reported until mid-July 2014. Forty-seven studies on cancer and noncancer...

  10. A survey of cost accounting in service-oriented computing

    NARCIS (Netherlands)

    de Medeiros, Robson W.A.; Rosa, Nelson S.; Campos, Glaucia M.M.; Ferreira Pires, Luis

    Nowadays, companies are increasingly offering their business services through computational services on the Internet in order to attract more customers and increase their revenues. However, these services have financial costs that need to be managed in order to maximize profit. Several models and

  11. Low cost spacecraft computers: Oxymoron or future trend?

    Science.gov (United States)

    Manning, Robert M.

    1993-01-01

    Over the last few decades, application of current terrestrial computer technology in embedded spacecraft control systems has been expensive and fraught with many technical challenges. These challenges have centered on overcoming the extreme environmental constraints (protons, neutrons, gamma radiation, cosmic rays, temperature, vibration, etc.) that often preclude direct use of commercial off-the-shelf computer technology. Reliability, fault tolerance and power have also greatly constrained the selection of spacecraft control system computers. More recently, new constraints are being felt, cost and mass in particular, that have again narrowed the degrees of freedom spacecraft designers once enjoyed. This paper discusses these challenges, how they were previously overcome, how future trends in commercial computer technology will simplify (or hinder) selection of computer technology for spacecraft control applications, and what spacecraft electronic system designers can do now to circumvent them.

  12. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
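
    The diminishing-returns argument can be illustrated with a toy Gaussian-kernel regressor standing in for the PNN: goodness-of-fit improves quickly with the first calibration points and then flattens. Data, kernel width and sample sizes below are synthetic stand-ins, not the report's roadless-area/population-density data:

      import numpy as np

      rng = np.random.default_rng(1)
      X = rng.uniform(0, 1, size=(5000, 2))
      y = np.sin(6 * X[:, 0]) * X[:, 1] + 0.1 * rng.normal(size=len(X))
      Xtest, ytest = X[4000:], y[4000:]          # held-out test set

      def kernel_predict(Xtr, ytr, Xte, h=0.05):
          # PNN-like prediction: Gaussian-weighted average of training targets.
          d2 = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
          w = np.exp(-d2 / (2 * h * h))
          return (w @ ytr) / (w.sum(1) + 1e-12)

      for n in [50, 200, 800, 3200]:             # increasing computer cost
          idx = rng.choice(4000, size=n, replace=False)
          mse = np.mean((kernel_predict(X[idx], y[idx], Xtest) - ytest) ** 2)
          print(f"n={n:5d}  MSE={mse:.4f}")      # improvements shrink as n grows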

  13. User manual for PACTOLUS: a code for computing power costs

    International Nuclear Information System (INIS)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results with the updated version of PACTOLUS. 11 figures, 2 tables
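
    The discounting principle quoted above fixes the busbar cost directly: choose c (mills/kWh) so that the present worth of revenues c·E_t equals the present worth of expenses C_t, i.e. c = PV(costs)/PV(energy). A worked sketch with invented cash flows (only the equation structure follows the manual):

      rate = 0.08                                  # assumed discount rate
      energy_gwh = [0, 0, 6000, 6500, 6500, 6500]  # E_t; zeros during construction
      costs_musd = [400, 450, 120, 125, 130, 135]  # C_t: capital, then O&M + fuel

      def pv(flows):
          return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

      # $M/GWh equals $/kWh, and 1 mill = $0.001, so multiply by 1000.
      busbar = pv(costs_musd) / pv(energy_gwh) * 1000.0
      print(f"levelized busbar cost: {busbar:.1f} mills/kWh")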

  14. User manual for PACTOLUS: a code for computing power costs.

    Energy Technology Data Exchange (ETDEWEB)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results. (RWR)

  15. Trouble Sleeping Associated With Lower Work Performance and Greater Health Care Costs: Longitudinal Data From Kansas State Employee Wellness Program.

    Science.gov (United States)

    Hui, Siu-kuen Azor; Grandner, Michael A

    2015-10-01

    To examine the relationships between employees' trouble sleeping and absenteeism, work performance, and health care expenditures over a 2-year period. Utilizing the Kansas State employee wellness program (EWP) data set from 2008 to 2009, multinomial logistic regression analyses were conducted with trouble sleeping as the predictor and absenteeism, work performance, and health care costs as the outcomes. EWP participants (N = 11,698 in 2008; 5636 followed up in 2009) who had higher levels of sleep disturbance were more likely to be absent from work, to receive lower work performance ratings, and to incur greater health care costs (all reported P values significant). Trouble sleeping was thus associated with work attendance, work performance, and health care costs.

  16. An approximate fractional Gaussian noise model with computational cost

    KAUST Repository

    Sørbye, Sigrunn H.

    2017-09-18

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood-based approach is $\mathcal{O}(n^2)$, exploiting the Toeplitz structure of the covariance matrix. In most realistic cases, we do not observe the fGn process directly but only through indirect Gaussian observations, so the Toeplitz structure is easily lost and the computational cost increases to $\mathcal{O}(n^3)$. This paper presents an approximate fGn model of $\mathcal{O}(n)$ computational cost, both with direct or indirect Gaussian observations, with or without conditioning. This is achieved by approximating fGn with a weighted sum of independent first-order autoregressive processes, fitting the parameters of the approximation to match the autocorrelation function of the fGn model. The resulting approximation is stationary despite being Markov and gives a remarkably accurate fit using only four components. The performance of the approximate fGn model is demonstrated in simulations and two real data examples.
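
    The paper's core idea fits in a few lines: approximate the fGn autocorrelation by a weighted sum of four AR(1) autocorrelations. In the sketch below the AR(1) coefficients are fixed on an assumed grid and only the weights are fit (the paper fits both jointly, so this is purely illustrative):

      import numpy as np
      from scipy.optimize import nnls

      H = 0.8                                   # Hurst parameter
      k = np.arange(0, 200)
      acf = 0.5 * (np.abs(k + 1) ** (2 * H) - 2 * np.abs(k) ** (2 * H)
                   + np.abs(k - 1) ** (2 * H))  # exact fGn autocorrelation

      phis = np.array([0.30, 0.74, 0.95, 0.995])  # assumed AR(1) coefficients
      basis = phis[None, :] ** k[:, None]         # column i holds phi_i**k
      w, _ = nnls(basis, acf)                     # nonnegative least-squares weights
      print("weights:", np.round(w / w.sum(), 3))
      print("max abs error:", float(np.abs(basis @ w - acf).max()))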

  17. Client-server computer architecture saves costs and eliminates bottlenecks

    International Nuclear Information System (INIS)

    Darukhanavala, P.P.; Davidson, M.C.; Tyler, T.N.; Blaskovich, F.T.; Smith, C.

    1992-01-01

    This paper reports that a workstation-based, client-server architecture saved costs and eliminated bottlenecks that BP Exploration (Alaska) Inc. experienced with mainframe computer systems. In 1991, BP embarked on an ambitious project to change technical computing for its Prudhoe Bay, Endicott, and Kuparuk operations on Alaska's North Slope. This project promised substantial rewards, but also involved considerable risk. The project plan called for reservoir simulations (which historically had run on a Cray Research Inc. X-MP supercomputer in the company's Houston data center) to be run on small computer workstations. Additionally, large Prudhoe Bay, Endicott, and Kuparuk production and reservoir engineering databases and related applications also would be moved to workstations, replacing a Digital Equipment Corp. VAX cluster in Anchorage

  18. Assessment of the structural shielding integrity of some selected computed tomography facilities in the Greater Accra Region of Ghana

    International Nuclear Information System (INIS)

    Nkansah, A.

    2010-01-01

    The structural shielding integrity was assessed for four CT facilities: Trust Hospital, Korle-Bu Teaching Hospital, the 37 Military Hospital and Medical Imaging Ghana Ltd. in the Greater Accra Region of Ghana. From the shielding calculations, the concrete wall thicknesses computed are 120, 145, 140 and 155 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital respectively, using default DLP values. The wall thicknesses using derived DLP values are 110, 110, 120 and 168 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital respectively. These values are within the accepted standard concrete thickness of 102-152 mm prescribed by the National Council on Radiation Protection and Measurements. The ultrasonic pulse testing indicated that all the sandcrete walls are of good quality and free of voids, since the pulse velocities estimated were approximately equal to 3.45 km/s. The average dose rate measured for supervised areas is 3.4 μSv/wk and for controlled areas is 18.0 μSv/wk. These dose rates were below the acceptable levels of 100 μSv per week for the occupationally exposed and 20 μSv per week for members of the public provided by the ICRU. The results mean that the structural shielding thicknesses are adequate to protect members of the public and occupationally exposed workers (au).
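
    The barrier-thickness logic behind such surveys can be sketched with tenth-value layers: attenuate an assumed unshielded weekly kerma K0 down to the design goal P via x = TVL · log10(K0/P). The TVL and K0 values below are illustrative assumptions, not the survey's inputs:

      import math

      TVL_CONCRETE_MM = 44.0    # assumed tenth-value layer of concrete at CT energies
      K0 = 400.0                # assumed unshielded weekly air kerma, uGy/week
      P = 20.0                  # design goal for a controlled area, uGy/week

      thickness_mm = TVL_CONCRETE_MM * math.log10(K0 / P)
      print(f"required concrete: {thickness_mm:.0f} mm")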

  19. Trouble Sleeping Associated with Lower Work Performance and Greater Healthcare Costs: Longitudinal Data from Kansas State Employee Wellness Program

    Science.gov (United States)

    Hui, Siu-kuen Azor; Grandner, Michael A.

    2015-01-01

    Objective: To examine the relationships between employees' trouble sleeping and absenteeism, work performance, and healthcare expenditures over a two-year period. Methods: Utilizing the Kansas State employee wellness program (EWP) dataset from 2008-2009, multinomial logistic regression analyses were conducted with trouble sleeping as the predictor and absenteeism, work performance, and healthcare costs as the outcomes. Results: EWP participants (N=11,698 in 2008; 5,636 followed up in 2009) who had higher levels of sleep disturbance were more likely to be absent from work and to receive lower work performance ratings (all reported p values significant). Trouble sleeping was thus associated with absenteeism, work performance, and healthcare costs. PMID:26461857

  20. Dark respiration of leaves and traps of terrestrial carnivorous plants: are there greater energetic costs in traps?

    Czech Academy of Sciences Publication Activity Database

    Adamec, Lubomír

    2010-01-01

    Roč. 5, č. 1 (2010), s. 121-124 ISSN 1895-104X Institutional research plan: CEZ:AV0Z60050516 Keywords : Aerobic respiration * metabolic costs * trap specialization Subject RIV: EF - Botanics Impact factor: 0.685, year: 2010

  1. 12 CFR 714.5 - What is required if you rely on an estimated residual value greater than 25% of the original cost...

    Science.gov (United States)

    2010-01-01

    12 CFR 714.5 (2010-01-01 edition). Title 12: Banks and Banking; National Credit Union Administration; Regulations Affecting Credit Unions; Leasing. § 714.5: What is required if you rely on an estimated residual value greater than 25% of the original cost of the leased property?

  2. Addressing the computational cost of large EIT solutions

    International Nuclear Information System (INIS)

    Boyle, Alistair; Adler, Andy; Borsic, Andrea

    2012-01-01

    Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, widespread adoption of three-dimensional simulations and correlation of other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection. (paper)
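
    The kind of sparse-solve profiling described here is easy to reproduce on a stand-in system: time a direct factorization against an iterative solver on a 2-D finite-difference Laplacian (a rough proxy for an EIT FEM matrix; this is not Meagre-Crowd itself):

      import time
      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      n = 200                                        # grid side; system has n*n unknowns
      T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
      A = (sp.kron(sp.identity(n), T) + sp.kron(T, sp.identity(n))).tocsc()
      b = np.ones(A.shape[0])

      t0 = time.perf_counter()
      x_direct = spla.splu(A).solve(b)               # direct sparse LU
      t1 = time.perf_counter()
      x_iter, info = spla.cg(A, b)                   # unpreconditioned conjugate gradient
      t2 = time.perf_counter()

      print(f"splu: {t1 - t0:.3f}s   cg: {t2 - t1:.3f}s   cg converged: {info == 0}")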

  3. Addressing the computational cost of large EIT solutions.

    Science.gov (United States)

    Boyle, Alistair; Borsic, Andrea; Adler, Andy

    2012-05-01

    Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, widespread adoption of three-dimensional simulations and correlation of other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection.

  4. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software

    International Nuclear Information System (INIS)

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review their decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning

  5. Scaling cost-sharing to wages: how employers can reduce health spending and provide greater economic security.

    Science.gov (United States)

    Robertson, Christopher T

    2014-01-01

In the employer-sponsored insurance market that covers most Americans, many workers are "underinsured." The evidence shows onerous out-of-pocket payments causing them to forgo needed care, miss work, and fall into bankruptcies and foreclosures. Nonetheless, many higher-paid workers are "overinsured": the evidence shows that in this domain, surplus insurance stimulates spending and price inflation without improving health. Employers can solve these problems together by scaling cost-sharing to wages. This reform would make insurance better protect against risk and guarantee access to care, while maintaining or even reducing insurance premiums. Yet, there are legal obstacles to scaled cost-sharing. The group-based nature of employer health insurance, reinforced by federal law, makes it difficult for scaling to be achieved through individual choices. The Affordable Care Act's (ACA) "essential coverage" mandate also caps cost-sharing even for wealthy workers who need no such cap. Additionally, there is a tax distortion in favor of highly paid workers purchasing healthcare through insurance rather than out-of-pocket. These problems are all surmountable. In particular, the ACA has expanded the applicability of an unenforced employee-benefits rule that prohibits "discrimination" in favor of highly compensated workers. A novel analysis shows that this statute gives the Internal Revenue Service the authority to require scaling and to thereby eliminate the current inequities and inefficiencies caused by the tax distortion. The promise is smarter insurance for over 150 million Americans.

  6. A computed tomography study in the location of greater palatine artery in South Indian population for maxillary osteotomy

    Directory of Open Access Journals (Sweden)

    I Packiaraj

    2016-01-01

Introduction: The greater palatine artery is one of the important feeding vessels of the maxilla. The surgeon should know the surgical anatomy of the greater palatine artery to avoid trauma to the maxilla, which leads to ischemic problems. Aim: CT evaluation of the distance between the pyriform aperture and the greater palatine foramen across ages in both sexes. Result: The distance varies according to sex and age, as measured by CT and standardised. Discussion: The lateral nasal osteotomy can be done up to a depth of 25 mm, instead of 20 mm. Conclusion: This study shows that the lateral nasal wall osteotomy can be performed without injury to the greater palatine artery.

  7. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    Science.gov (United States)

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.
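
    The cost-effectiveness argument reduces to a price-per-computation comparison. The sketch below shows only the shape of that calculation; the hourly prices and runtimes are invented placeholders, not the paper's measured PSI4 timings or Amazon's actual price list.

        # Hypothetical (price per hour, hours per single-point energy) pairs.
        platforms = {
            "cloud, 2 vCPU":  (0.10, 6.0),
            "cloud, 16 vCPU": (0.80, 1.0),
            "in-house node":  (0.35, 1.5),  # amortized hardware + power, assumed
        }

        for name, (usd_per_hour, hours_per_job) in platforms.items():
            print(f"{name:14s}: ${usd_per_hour * hours_per_job:.2f} per computation")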

  8. Computer Aided Design of a Low-Cost Painting Robot

    Directory of Open Access Journals (Sweden)

    SYEDA MARIA KHATOON ZAIDI

    2017-10-01

The application of robots or robotic systems for painting parts is becoming increasingly conventional, to improve reliability, productivity and consistency and to decrease waste. However, in Pakistan only high-end industries are able to afford the luxury of a robotic system for various purposes. In this study we propose an economical painting robot that a small-scale industry can install in their plant with ease. The importance of this robot is that, being cost effective, it can easily be replaced in small manufacturing industries and therefore eliminate health problems occurring to the individual in charge of painting parts on an everyday basis. To achieve this aim, the robot is made with local parts with only few exceptions, to cut costs; and the programming language is kept at a mediocre level. Image processing is used to establish object recognition and it can be programmed to paint various simple geometries. The robot is placed on a conveyer belt to maximize productivity. A four DoF (Degree of Freedom) arm increases the working envelope and accessibility of painting different shaped parts with ease. This robot is capable of painting up, front, back, left and right sides of the part with a single colour. Initially CAD (Computer Aided Design) models of the robot were developed, which were analyzed, modified and improved to withstand loading conditions and perform the task efficiently. After design selection, appropriate motors and materials were selected and the robot was developed. Throughout the development phase, minor problems and errors were fixed accordingly as they arose. Lastly the robot was integrated with the computer and image processing for autonomous control. The final results demonstrated that the robot is economical and reduces paint wastage.

  9. Computer aided design of a low-cost painting robot

    International Nuclear Information System (INIS)

    Zaidi, S.M.; Janejo, F.; Mujtaba, S.B.

    2017-01-01

The application of robots or robotic systems for painting parts is becoming increasingly conventional, to improve reliability, productivity and consistency and to decrease waste. However, in Pakistan only high-end industries are able to afford the luxury of a robotic system for various purposes. In this study we propose an economical painting robot that a small-scale industry can install in their plant with ease. The importance of this robot is that, being cost effective, it can easily be replaced in small manufacturing industries and therefore eliminate health problems occurring to the individual in charge of painting parts on an everyday basis. To achieve this aim, the robot is made with local parts with only few exceptions, to cut costs; and the programming language is kept at a mediocre level. Image processing is used to establish object recognition and it can be programmed to paint various simple geometries. The robot is placed on a conveyer belt to maximize productivity. A four DoF (Degree of Freedom) arm increases the working envelope and accessibility of painting different shaped parts with ease. This robot is capable of painting up, front, back, left and right sides of the part with a single colour. Initially CAD (Computer Aided Design) models of the robot were developed, which were analyzed, modified and improved to withstand loading conditions and perform the task efficiently. After design selection, appropriate motors and materials were selected and the robot was developed. Throughout the development phase, minor problems and errors were fixed accordingly as they arose. Lastly the robot was integrated with the computer and image processing for autonomous control. The final results demonstrated that the robot is economical and reduces paint wastage. (author)

  10. Decommissioning costing approach based on the standardised list of costing items. Lessons learnt by the OMEGA computer code

    International Nuclear Information System (INIS)

    Daniska, Vladimir; Rehak, Ivan; Vasko, Marek; Ondra, Frantisek; Bezak, Peter; Pritrsky, Jozef; Zachar, Matej; Necas, Vladimir

    2011-01-01

The document 'A Proposed Standardised List of Items for Costing Purposes' was issued in 1999 by the OECD/NEA, IAEA and European Commission (EC) to promote harmonisation in decommissioning costing. It is a systematic list of decommissioning activities classified in chapters 01 to 11 with three numbered levels. Four cost groups are defined for the cost at each level. The document constitutes a standardised matrix of decommissioning activities and cost groups with definitions of the content of each item. Knowing what is behind the items makes the comparison of costs for decommissioning projects transparent. Two approaches are identified for use of the standardised cost structure. The first approach converts cost data from existing specific cost structures into the standardised cost structure for the purpose of cost presentation. The second approach uses the standardised cost structure as the basis for the cost calculation structure; the calculated cost data are formatted in the standardised cost format directly, and several additional advantages may be identified in this approach. The paper presents the costing methodology based on the standardised cost structure and the lessons learnt from the last ten years of implementing the standardised cost structure as the cost calculation structure in the computer code OMEGA. The code also includes on-line management of decommissioning waste, decay of radioactivity, evaluation of exposure, and generation and optimisation of the Gantt chart of a decommissioning project, which makes the OMEGA code an effective tool for planning and optimisation of decommissioning processes. (author)

  11. Manual of phosphoric acid fuel cell power plant cost model and computer program

    Science.gov (United States)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

Cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimation of system capital costs, and an economic analysis which determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.

  12. Cost effective distributed computing for Monte Carlo radiation dosimetry

    International Nuclear Information System (INIS)

    Wise, K.N.; Webb, D.V.

    2000-01-01

An inexpensive computing facility has been established for performing repetitive Monte Carlo simulations with the BEAM and EGS4/EGSnrc codes of linear accelerator beams, for calculating effective dose from diagnostic imaging procedures and of ion chambers and phantoms used for the Australian high energy absorbed dose standards. The facility currently consists of three dual-processor 450 MHz PCs linked by a high speed LAN. The 3 PCs can be accessed either locally from a single keyboard/monitor/mouse combination using a SwitchView controller or remotely via a computer network from PCs with suitable communications software (e.g. Telnet, Kermit etc). All 3 PCs are identically configured to have the Red Hat Linux 6.0 operating system. A Fortran compiler and the BEAM and EGS4/EGSnrc codes are available on the 3 PCs. The preparation of sequences of jobs utilising the Monte Carlo codes is simplified using load-distributing software (enFuzion 6.0 marketed by TurboLinux Inc, formerly Cluster from Active Tools) which efficiently distributes the computing load amongst all 6 processors. We describe 3 applications of the system: (a) energy spectra from radiotherapy sources, (b) mean mass-energy absorption coefficients and stopping powers for absolute absorbed dose standards and (c) dosimetry for diagnostic procedures; (a) and (b) are based on the transport codes BEAM and FLURZnrc while (c) is a Fortran/EGS code developed at ARPANSA. Efficiency gains ranged from 3 for (c) to close to the theoretical maximum of 6 for (a) and (b), with the gain depending on the amount of 'bookkeeping' to begin each task and the time taken to complete a single task. We have found the use of a load-balancing batch processing system with many PCs to be an economical way of achieving greater productivity for Monte Carlo calculations or of any computer intensive task requiring many runs with different parameters. Copyright (2000) Australasian College of Physical Scientists and Engineers in Medicine

  13. (Re)engineering Earth System Models to Expose Greater Concurrency for Ultrascale Computing: Practice, Experience, and Musings

    Science.gov (United States)

    Mills, R. T.

    2014-12-01

    As the high performance computing (HPC) community pushes towards the exascale horizon, the importance and prevalence of fine-grained parallelism in new computer architectures is increasing. This is perhaps most apparent in the proliferation of so-called "accelerators" such as the Intel Xeon Phi or NVIDIA GPGPUs, but the trend also holds for CPUs, where serial performance has grown slowly and effective use of hardware threads and vector units are becoming increasingly important to realizing high performance. This has significant implications for weather, climate, and Earth system modeling codes, many of which display impressive scalability across MPI ranks but take relatively little advantage of threading and vector processing. In addition to increasing parallelism, next generation codes will also need to address increasingly deep hierarchies for data movement: NUMA/cache levels, on node vs. off node, local vs. wide neighborhoods on the interconnect, and even in the I/O system. We will discuss some approaches (grounded in experiences with the Intel Xeon Phi architecture) for restructuring Earth science codes to maximize concurrency across multiple levels (vectors, threads, MPI ranks), and also discuss some novel approaches for minimizing expensive data movement/communication.

  14. Development of computer software for pavement life cycle cost analysis.

    Science.gov (United States)

    1988-01-01

    The life cycle cost analysis program (LCCA) is designed to automate and standardize life cycle costing in Virginia. It allows the user to input information necessary for the analysis, and it then completes the calculations and produces a printed copy...

  15. Environmental damage costs from airborne pollution of industrial activities in the greater Athens, Greece area and the resulting benefits from the introduction of BAT

    International Nuclear Information System (INIS)

    Mirasgedis, S.; Hontou, V.; Georgopoulou, E.; Sarafidis, Y.; Gakis, N.; Lalas, D.P.; Loukatos, A.; Gargoulas, N.; Mentzis, A.; Economidis, D.; Triantafilopoulos, T.; Korizi, K.; Mavrotas, G.

    2008-01-01

Attributing costs to the environmental impacts associated with industrial activities can greatly assist in protecting human health and the natural environment as monetary values are capable of directly influencing technological and policy decisions without changing the rules of the market. This paper attempts to estimate the external cost attributable to the atmospheric pollution from 'medium and high environmental burden' industrial activities located in the greater Athens area and the benefits from Best Available Techniques (BAT) introduction. To this end a number of typical installations were defined to be used in conjunction with the Impact Pathway Approach developed in the context of the ExternE project to model all industrial sectors/sub-sectors located in the area of interest. Total environmental externalities due to air pollutants emitted by these industrial activities were found to reach 211 M Euro per year, associated mainly with human mortality and morbidity due to PM10 emissions, as well as with climate change impacts due to CO2 emissions, for which non-metallic minerals and oil processing industries are the main sources. The results obtained can be used as the basis for an integrated evaluation of potential BAT, taking into account not only private costs and benefits but also the environmental externalities, thus leading to policy decisions that maximize social welfare in each industrial sector/sub-sector.

  16. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data

  17. Assessment of the integrity of structural shielding of four computed tomography facilities in the greater Accra region of Ghana

    International Nuclear Information System (INIS)

    Nkansah, A.; Schandorf, C.; Boadu, M.; Fletcher, J. J.

    2013-01-01

The structural shielding thicknesses of the walls of four computed tomography (CT) facilities in Ghana were re-evaluated to verify the shielding integrity using the new shielding design methods recommended by the National Council on Radiation Protection and Measurements (NCRP). The shielding thickness obtained ranged from 120 to 155 mm using default DLP values proposed by the European Commission and 110 to 168 mm using derived DLP values from the four CT manufacturers. These values are within the accepted standard concrete wall thickness ranging from 102 to 152 mm prescribed by the NCRP. The ultrasonic pulse testing of all walls indicated that these are of good quality and free of voids, since the estimated pulse velocities were within the range of 3.496 ± 0.005 km/s. The average dose equivalent rate estimated for supervised areas is 3.4 ± 0.27 μSv/week and that for the controlled area is 18.0 ± 0.15 μSv/week, which are within acceptable values. (authors)

  18. Computed Tomography (CT) radiation dose in children: A survey to propose regional diagnostic reference levels in Greater Accra-Ghana

    International Nuclear Information System (INIS)

    Addo, Patience

    2016-07-01

The aim of this work was to assess the doses delivered to paediatric patients during computed tomography (CT) examinations of the head, chest and abdomen, and to establish regional diagnostic reference levels (RDRLs) for four age groups. The patient data, technique parameters and dose descriptors collected include: age, sex, tube voltage, tube current, rotation time, slice thickness, scan length, volume CT dose index (CTDIvol) and dose length product (DLP). Currently, paediatric CT examinations account for 11% of radiation exposure. For the paediatric age groups <1 year, 1-5 years, 6-10 years and 11-15 years, the proposed RDRLs for head examinations are 28, 38, 48 and 86 mGy in terms of CTDIvol and 395, 487, 601 and 1614 mGy cm in terms of DLP, respectively. For chest examinations, the proposed RDRLs are 1 and 5 mGy (CTDIvol) and 18 and 110 mGy cm (DLP) for the age groups <1 year and 1-5 years, respectively. For abdomino-pelvic examinations, the proposed RDRLs are 3, 3 and 10 mGy (CTDIvol) and 71, 120 and 494 mGy cm (DLP) for the age groups <1 year, 1-5 years and 6-10 years, respectively. For abdomen examinations, the proposed RDRLs are 3, 5 and 5 mGy (CTDIvol) and 83, 124 and 233 mGy cm (DLP) for the age groups <1 year, 1-5 years and 11-15 years, respectively. RDRLs have thus been proposed in terms of CTDIvol and DLP for head, chest, abdomen and abdomino-pelvic paediatric CT examinations. Optimisation is required for the 11-15 years age group, whose DLP values were higher than the corresponding international DRLs. For effective optimisation of patient protection, the trade-off between image quality and patient dose should be investigated. (author)
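
    The two dose descriptors reported above are linked through the irradiated scan length L via the standard relation

        \mathrm{DLP}\ [\mathrm{mGy\,cm}] = \mathrm{CTDI_{vol}}\ [\mathrm{mGy}] \times L\ [\mathrm{cm}]

    so, for instance, the proposed <1 year head values are mutually consistent for a scan length of roughly 14 cm (28 mGy x 14 cm = 392 mGy cm, close to the proposed 395 mGy cm).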

  19. Low-Budget Computer Programming in Your School (An Alternative to the Cost of Large Computers). Illinois Series on Educational Applications of Computers. No. 14.

    Science.gov (United States)

    Dennis, J. Richard; Thomson, David

    This paper is concerned with a low cost alternative for providing computer experience to secondary school students. The brief discussion covers the programmable calculator and its relevance for teaching the concepts and the rudiments of computer programming and for computer problem solving. A list of twenty-five programming activities related to…

  20. DECOST: computer routine for decommissioning cost and funding analysis

    International Nuclear Information System (INIS)

    Mingst, B.C.

    1979-12-01

    One of the major controversies surrounding the decommissioning of nuclear facilities is the lack of financial information on just what the eventual costs will be. The Nuclear Regulatory Commission has studies underway to analyze the costs of decommissioning of nuclear fuel cycle facilities and some other similar studies have also been done by other groups. These studies all deal only with the final cost outlays needed to finance decommissioning in an unchangeable set of circumstances. Funding methods and planning to reduce the costs and financial risks are usually not attempted. The DECOST program package is intended to fill this void and allow wide-ranging study of the various options available when planning for the decommissioning of nuclear facilities

  1. Incentive motivation deficits in schizophrenia reflect effort computation impairments during cost-benefit decision-making.

    Science.gov (United States)

    Fervaha, Gagan; Graff-Guerrero, Ariel; Zakzanis, Konstantine K; Foussias, George; Agid, Ofer; Remington, Gary

    2013-11-01

Motivational impairments are a core feature of schizophrenia and although there are numerous reports studying this feature using clinical rating scales, objective behavioural assessments are lacking. Here, we use a translational paradigm to measure incentive motivation in individuals with schizophrenia. Sixteen stable outpatients with schizophrenia and sixteen matched healthy controls completed a modified version of the Effort Expenditure for Rewards Task that accounts for differences in motoric ability. Briefly, subjects were presented with a series of trials in which they could choose to expend a greater amount of effort for a larger monetary reward versus less effort for a smaller reward. Additionally, the probability of receiving money for a given trial was varied at 12%, 50% and 88%. Clinical and other reward-related variables were also evaluated. Patients opted to expend greater effort significantly less than controls for trials of high, but uncertain (i.e. 50% and 88% probability) incentive value, which was related to amotivation and neurocognitive deficits. Other abnormalities were also noted but were related to different clinical variables such as impulsivity (low reward and 12% probability). These motivational deficits were not due to group differences in reward learning, reward valuation or hedonic capacity. Our findings offer novel support for incentive motivation deficits in schizophrenia. Clinical amotivation is associated with impairments in the computation of effort during cost-benefit decision-making. This objective translational paradigm may guide future investigations of the neural circuitry underlying these motivational impairments. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    Science.gov (United States)

    2011-01-01

Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. The cost (£, year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment results in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost effective alternative to double reading for mammography screening in the UK. This study
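
    The direction of the result follows from simple arithmetic: CAD is cost-increasing whenever the equipment, training and extra-assessment costs outweigh the saving from dropping the second reader. The figures below are invented placeholders chosen only to reproduce the sign and scale of the high-volume result, not the CADET II model inputs.

        # All figures in GBP per 1,000 women screened; hypothetical values.
        cad_equipment    = 150.0   # amortized CAD hardware/software share
        staff_training   = 40.0
        extra_assessment = 120.0   # extra recalls generated under CAD
        reading_saving   = -83.0   # second reader no longer needed (a saving)

        net = cad_equipment + staff_training + extra_assessment + reading_saving
        print(f"additional cost per 1,000 women: GBP {net:.0f}")  # positive => CAD costs more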

  3. Risk assessment and management of brucellosis in the southern greater Yellowstone area (II): Cost-benefit analysis of reducing elk brucellosis prevalence.

    Science.gov (United States)

    Boroff, Kari; Kauffman, Mandy; Peck, Dannele; Maichak, Eric; Scurlock, Brandon; Schumaker, Brant

    2016-11-01

    Recent cases of bovine brucellosis (Brucella abortus) in cattle (Bos taurus) and domestic bison (Bison bison) of the southern Greater Yellowstone Area (SGYA) have been traced back to free-ranging elk (Cervus elaphus). Several management activities have been implemented to reduce brucellosis seroprevalence in elk, including test-and-slaughter, low-density feeding at elk winter feedgrounds, and elk vaccination. It is unclear which of these activities are most cost-effective at reducing the risk of elk transmitting brucellosis to cattle. In a companion paper, a stochastic risk model was used to translate a reduction in elk seroprevalence to a reduction in the risk of transmission to cattle. Here, we use those results to estimate the expected economic benefits and costs of reducing seroprevalence in elk using three different management activities: vaccination of elk with Brucella strain 19 (S19), low-density feeding of elk, and elk test-and-slaughter. Results indicate that the three elk management activities yield negative expected net benefits, ranging from -$2983 per year for low-density feeding to -$595,471 per year for test-and-slaughter. Society's risk preferences will determine whether strategies that generate small negative net benefit, such as low-density feeding, are worth implementing. However, activities with large negative net benefits, such as test-and-slaughter and S19 vaccination, are unlikely to be economically worthwhile. Given uncertainty about various model parameters, we identify some circumstances in which individual management activities might generate positive expected net benefit. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Resource utilization and costs during the initial years of lung cancer screening with computed tomography in Canada.

    Science.gov (United States)

    Cressman, Sonya; Lam, Stephen; Tammemagi, Martin C; Evans, William K; Leighl, Natasha B; Regier, Dean A; Bolbocean, Corneliu; Shepherd, Frances A; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R; Mayo, John R; McWilliams, Annette; Couture, Christian; English, John C; Goffin, John; Hwang, David M; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J; Goss, Glenwood D; Nicholas, Garth; Seely, Jean M; Sekhon, Harmanjatinder S; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D; Tan, Wan C; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J

    2014-10-01

It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400-$505) for the initial 18 months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553-$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone ($47,792; 95% CI, $43,254-$52,200; p = 0.061). In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced stage lung cancer which infrequently results in a cure.

  5. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
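
    The paper's runtime model predicts job length from genome size and complexity before submission; one common way to exploit such predictions, sketched below with a placeholder runtime model, is to submit the longest jobs first so that stragglers do not leave billed instances idle at the end of the run.

        def predicted_runtime(a, b):
            # Placeholder model: runtime grows with the product of proteome sizes.
            return a["n_proteins"] * b["n_proteins"] * 1e-6  # hours, assumed

        def schedule(jobs):
            """Order genome-pair jobs longest-first (LPT scheduling)."""
            return sorted(jobs, key=lambda pair: predicted_runtime(*pair), reverse=True)

        genomes = [{"name": f"g{i}", "n_proteins": n}
                   for i, n in enumerate((4000, 20000, 9000))]
        jobs = [(a, b) for i, a in enumerate(genomes) for b in genomes[i + 1:]]
        for a, b in schedule(jobs):
            print(a["name"], b["name"], f"{predicted_runtime(a, b):.2f} h predicted")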

  6. Factors cost effectively improved using computer simulations of ...

    African Journals Online (AJOL)

    LPhidza

    effectively managed using computer simulations in semi-arid conditions pertinent to much of sub-Saharan Africa. ... small scale farmers to obtain optimal crop yields thus ensuring their food security and livelihood is ... those that simultaneously incorporate and simulate processes involved throughout the course of crop ...

  7. Adaptive Radar Signal Processing-The Problem of Exponential Computational Cost

    National Research Council Canada - National Science Library

    Rangaswamy, Muralidhar

    2003-01-01

    .... Extensions to handle the case of non-Gaussian clutter statistics are presented. Current challenges of limited training data support, computational cost, and severely heterogeneous clutter backgrounds are outlined...

  8. Computational cost of isogeometric multi-frontal solvers on parallel distributed memory machines

    KAUST Repository

    Woźniak, Maciej; Paszyński, Maciej R.; Pardo, D.; Dalcin, Lisandro; Calo, Victor M.

    2015-01-01

This paper derives theoretical estimates of the computational cost for an isogeometric multi-frontal direct solver executed on parallel distributed memory machines. We show theoretically that for the C^(p-1) global continuity of the isogeometric solution

  9. Computer code for the costing and sizing of TNS tokamaks

    International Nuclear Information System (INIS)

    Sink, D.A.; Iwinski, E.M.

    1977-01-01

A FORTRAN code for the COsting And Sizing of Tokamaks (COAST) is described. The code was written to conduct detailed analyses on the engineering features of the next tokamak fusion device following TFTR. The ORNL/Westinghouse study of TNS (The Next Step) has involved the investigation of a number of device options, each over a wide range of plasma sizes. A generalized description of TNS is incorporated in the code and includes refined modeling of over forty systems and subsystems. Considerable detailed design and analyses have provided the basis for the thermal, electrical, mechanical, nuclear, chemical, vacuum, and facility engineering of the various subsystems. Currently, the code provides a tool for the systematic comparison of four toroidal field (TF) coil technologies allowing both D-shaped and circular coils. The coil technologies are: (1) copper (both room temperature and liquid-nitrogen cooled), (2) superconducting NbTi, (3) superconducting Nb3Sn, and (4) a Cu/NbTi hybrid. For the poloidal field (PF) coil systems copper conductors are assumed. The ohmic heating (OH) coils are located within the machine bore and have an air core, while the shaping field (SF) coils are located either within or outside the TF coils. The PF coil self and mutual inductances are calculated from the geometry, and the PF coil power supplies are modeled to account for time-dependent profiles for voltages and currents as governed by input data. Plasma heating is assumed to be by neutral beams, and impurity control is either passive or by a poloidal divertor system. The size modeling allows considerable freedom in specifying physics assumptions, operating scenarios, TF operating margin, and component geometric and performance parameters. Cost relationships have been developed for both plant and capital equipment and for annual utility and fuel expenses. The code has been used successfully to reproduce the sizing and costing of TFTR in order to calibrate the various models.

  10. Plerixafor mobilization leads to a lower ratio of CD34+ cells to total nucleated cells which results in greater storage costs.

    Science.gov (United States)

    Tanhehco, Yvette C; Adamski, Jill; Sell, Mary; Cunningham, Kathleen; Eisenmann, Christa; Magee, Deborah; Stadtmauer, Edward A; O'Doherty, Una

    2010-01-01

Plerixafor (Mozobil, AMD3100) with granulocyte-colony stimulating factor (G-CSF) mobilizes more CD34+ cells/kg compared to G-CSF alone. Given that plerixafor enhances mobilization of multiple white blood cell lineages, we determined if more storage space is required for products collected from patients mobilized with plerixafor. A review of the medical records of 15 patients mobilized with chemotherapy and G-CSF (control) and 14 patients mobilized with plerixafor plus G-CSF (plerixafor) was performed. Data on demographics, baseline characteristics, CD34+ cells/kg, total nucleated cells, total mononuclear cells, total apheresis sessions, and total bags for storage were collected. Mean values were determined and compared using Student's t-test. We found that the proportion of CD34+ cells among total nucleated cells was less in the plerixafor group compared to the control group (P = 0.0427). More nucleated cells (10.7 x 10^10 vs. 7.1 x 10^10, P = 0.0452) and mononuclear cells (9.7 x 10^10 vs. 5.9 x 10^10, P = 0.0059) were mobilized with plerixafor plus G-CSF. However, there was no significant difference in CD34+ cells/kg, total CD34+ cells or the proportion of mononuclear cells among total nucleated cells between the two groups. More storage bags were required for the plerixafor group compared to the control group (15 vs. 9, P = 0.0299). Mobilization with plerixafor plus G-CSF resulted in a smaller proportion of CD34+ cells collected and a greater number of storage bags. An increase in the number of bags required for stem cell storage may be logistically problematic and will also lead to increased costs for storage of stem cells.

  11. Computing Cost Price for Cataract Surgery by the Activity Based Costing (ABC) Method at Hazrat-E-Zahra Hospital, Isfahan University of Medical Sciences, 2014

    Directory of Open Access Journals (Sweden)

    Masuod Ferdosi

    2016-10-01

Background: Hospital managers need accurate information about actual costs to make efficient and effective decisions. In the activity based costing method, activities are first recognized, and then direct and indirect costs are computed based on allocation methods. The aim of this study was to compute the cost price for cataract surgery by the Activity Based Costing (ABC) method at Hazrat-e-Zahra Hospital, Isfahan University of Medical Sciences. Methods: This was a cross-sectional study computing the costs of cataract surgery by the activity based costing technique at Hazrat-e-Zahra Hospital, Isfahan University of Medical Sciences, 2014. Data were collected through interview and direct observation and analyzed with Excel software. Results: According to the results of this study, the total cost of cataract surgery was 8,368,978 Rials. Personnel cost accounted for 62.2% (5,213,574 Rials) of the total cost of cataract surgery, the highest share of the surgery costs. The cost of consumables was 7.57% (1,992,852 Rials) of the surgery costs. Conclusion: Based on the results, there was a difference between the cost price of the services and the public tariff, which exposes the hospital to financial risk. It is therefore recommended to compute costs using appropriate methods such as Activity Based Costing. The cost price of cataract surgery can be reduced by strategies such as decreasing the cost of consumables.
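
    The roll-up behind an ABC cost price is a simple sum over costed activities. The sketch below reproduces the reported total and the ~62% personnel share from the figures quoted above; the 'other activities' line is the implied remainder, not a figure from the study.

        # Amounts in Rials, from the abstract; the remainder is inferred.
        costs = {
            "personnel":        5_213_574,
            "consumables":      1_992_852,
            "other activities": 8_368_978 - 5_213_574 - 1_992_852,
        }
        total = sum(costs.values())
        for activity, rials in costs.items():
            print(f"{activity:16s} {rials:>9,} Rials")
        print(f"cost price: {total:,} Rials; personnel share: {costs['personnel'] / total:.1%}")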

  12. Costs incurred by applying computer-aided design/computer-aided manufacturing techniques for the reconstruction of maxillofacial defects.

    Science.gov (United States)

    Rustemeyer, Jan; Melenberg, Alex; Sari-Rieger, Aynur

    2014-12-01

This study aims to evaluate the additional costs incurred by using a computer-aided design/computer-aided manufacturing (CAD/CAM) technique for reconstructing maxillofacial defects by analyzing typical cases. The medical charts of 11 consecutive patients who were subjected to the CAD/CAM technique were considered, and invoices from the companies providing the CAD/CAM devices were reviewed for every case. The number of devices used was significantly correlated with cost (r = 0.880; p < 0.05). Significant differences in costs were found between cases in which prebent reconstruction plates were used (€3346.00 ± €29.00) and cases in which they were not (€2534.22 ± €264.48; p < 0.05). The additional fees covered the costs of two, three and four devices, even when ignoring the cost of reconstruction plates. Additional fees provided by statutory health insurance covered a mean of 171.5% ± 25.6% of the cost of the CAD/CAM devices. Since the additional fees provide financial compensation, we believe that the CAD/CAM technique is suited for wide application and not restricted to complex cases. Where additional fees/funds are not available, the CAD/CAM technique might be unprofitable, so the decision whether or not to use it remains a case-to-case decision with respect to cost versus benefit. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  13. hPIN/hTAN: Low-Cost e-Banking Secure against Untrusted Computers

    Science.gov (United States)

    Li, Shujun; Sadeghi, Ahmad-Reza; Schmitz, Roland

We propose hPIN/hTAN, a low-cost token-based e-banking protection scheme for the setting in which the adversary has full control over the user's computer. Compared with existing hardware-based solutions, hPIN/hTAN depends on neither a second trusted channel, nor a secure keypad, nor a computationally expensive encryption module.

  14. Modelling the Intention to Adopt Cloud Computing Services: A Transaction Cost Theory Perspective

    Directory of Open Access Journals (Sweden)

    Ogan Yigitbasioglu

    2014-11-01

This paper uses transaction cost theory to study cloud computing adoption. A model is developed and tested with data from an Australian survey. According to the results, perceived vendor opportunism and perceived legislative uncertainty around cloud computing were significantly associated with perceived cloud computing security risk. There was also a significant negative relationship between perceived cloud computing security risk and the intention to adopt cloud services. This study also reports on adoption rates of cloud computing in terms of applications, as well as the types of services used.

  15. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1994-12-01

With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the U.S. Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  16. Plant process computer replacements - techniques to limit installation schedules and costs

    International Nuclear Information System (INIS)

    Baker, M.D.; Olson, J.L.

    1992-01-01

    Plant process computer systems, a standard fixture in all nuclear power plants, are used to monitor and display important plant process parameters. Scanning thousands of field sensors and alarming out-of-limit values, these computer systems are heavily relied on by control room operators. The original nuclear steam supply system (NSSS) vendor for the power plant often supplied the plant process computer. Designed using sixties and seventies technology, a plant's original process computer has been obsolete for some time. Driven by increased maintenance costs and new US Nuclear Regulatory Commission regulations such as NUREG-0737, Suppl. 1, many utilities have replaced their process computers with more modern computer systems. Given that computer systems are by their nature prone to rapid obsolescence, this replacement cycle will likely repeat. A process computer replacement project can be a significant capital expenditure and must be performed during a scheduled refueling outage. The object of the installation process is to install a working system on schedule. Experience gained by supervising several computer replacement installations has taught lessons that, if applied, will shorten the schedule and limit the risk of costly delays. Examples illustrating this technique are given. This paper and these examples deal only with the installation process and assume that the replacement computer system has been adequately designed, and development and factory tested

  17. Comparative cost analysis -- computed tomography vs. alternative diagnostic procedures, 1977-1980

    International Nuclear Information System (INIS)

    Gempel, P.A.; Harris, G.H.; Evans, R.G.

    1977-12-01

In comparing the total national cost of utilizing computed tomography (CT) for medically indicated diagnoses with that of conventional x-ray, ultrasonography, nuclear medicine, and exploratory surgery, this investigation concludes that there was little, if any, added net cost from CT use in 1977, nor will there be in 1980. Computed tomography, generally recognized as a reliable and useful diagnostic modality, has the potential to reduce net costs provided that an optimal number of units can be made available to physicians and patients to achieve projected reductions in alternative procedures. This study examines the actual cost impact of CT on both cranial and body diagnostic procedures. For abdominal and mediastinal disorders, CT scanning is just beginning to emerge as a diagnostic modality. As such, clinical experience is somewhat limited and the authors assume that no significant reduction in conventional procedures took place in 1977. It is estimated that the approximately 375,000 CT body procedures performed in 1977 represent only a 5 percent cost increase over use of other diagnostic modalities. It is projected that 2,400,000 CT body procedures will be performed in 1980 and, depending on assumptions used, total body diagnostic costs will increase only slightly or be reduced. Thirty-one tables appear throughout the text presenting cost data broken down by types of diagnostic procedures used and projections by years. Appendixes present technical cost components for diagnostic procedures, the comparative efficacy of CT as revealed in abstracts of published literature, selected medical diagnoses, and references.

  18. Costs of cloud computing for a biometry department. A case study.

    Science.gov (United States)

    Knaus, J; Hieke, S; Binder, H; Schwarzer, G

    2013-01-01

    "Cloud" computing providers, such as the Amazon Web Services (AWS), offer stable and scalable computational resources based on hardware virtualization, with short, usually hourly, billing periods. The idea of pay-as-you-use seems appealing for biometry research units which have only limited access to university or corporate data center resources or grids. This case study compares the costs of an existing heterogeneous on-site hardware pool in a Medical Biometry and Statistics department to a comparable AWS offer. The "total cost of ownership", including all direct costs, is determined for the on-site hardware, and hourly prices are derived, based on actual system utilization during the year 2011. Indirect costs, which are difficult to quantify are not included in this comparison, but nevertheless some rough guidance from our experience is given. To indicate the scale of costs for a methodological research project, a simulation study of a permutation-based statistical approach is performed using AWS and on-site hardware. In the presented case, with a system utilization of 25-30 percent and 3-5-year amortization, on-site hardware can result in smaller costs, compared to hourly rental in the cloud dependent on the instance chosen. Renting cloud instances with sufficient main memory is a deciding factor in this comparison. Costs for on-site hardware may vary, depending on the specific infrastructure at a research unit, but have only moderate impact on the overall comparison and subsequent decision for obtaining affordable scientific computing resources. Overall utilization has a much stronger impact as it determines the actual computing hours needed per year. Taking this into ac count, cloud computing might still be a viable option for projects with limited maturity, or as a supplement for short peaks in demand.

  19. The concept of computer software designed to identify and analyse logistics costs in agricultural enterprises

    Directory of Open Access Journals (Sweden)

    Karol Wajszczyk

    2009-01-01

The study comprised research, development and computer programming work concerning the development of a concept for an IT tool to be used in the identification and analysis of logistics costs in agricultural enterprises in terms of the process-based approach. As a result of the research and programming work, an overall functional and IT concept of software was developed for the identification and analysis of logistics costs in agricultural enterprises.

  20. A low-cost vector processor boosting compute-intensive image processing operations

    Science.gov (United States)

    Adorf, Hans-Martin

    1992-01-01

Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP-boards for standard workstations, complemented by mathematical/statistical libraries, is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation is presented of the standard Tarasko-Richardson-Lucy restoration algorithm on an Intel i860-based VP-board which is seamlessly interfaced to a commercial, interactive image processing system. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.
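
    For orientation, the restoration loop being accelerated is dominated by two convolutions per iteration, which is why FFT throughput matters. A minimal NumPy/SciPy sketch of the generic Richardson-Lucy update follows; it illustrates the algorithm only and is not the vectorized i860 implementation described above.

        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(observed, psf, n_iter=50):
            """Iteratively estimate the undistorted image from the observed
            image and the point-spread function (PSF), both 2D arrays."""
            estimate = np.full(observed.shape, observed.mean(), dtype=float)
            psf_mirror = psf[::-1, ::-1]
            for _ in range(n_iter):
                blurred = fftconvolve(estimate, psf, mode="same")
                ratio = observed / np.maximum(blurred, 1e-12)  # guard divide-by-zero
                estimate *= fftconvolve(ratio, psf_mirror, mode="same")
            return estimate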

  1. A participatory decision support tool to assess costs and benefits of tourism development scenarios: application of the ADAPTIVE model to Greater Giyani, South Africa

    NARCIS (Netherlands)

    Henkens, R.J.H.G.; Tassone, V.C.; Grafakos, S.; Groot, de R.S.; Luttik, J.

    2007-01-01

    The tourism industry represents a thriving business and offers many opportunities for tourism development all around the world. Each development will have its economic, socio-cultural and ecological costs and benefits. Many of these are difficult to assess and to value, which often leads to

  2. An Alternative Method for Computing Unit Costs and Productivity Ratios. AIR 1984 Annual Forum Paper.

    Science.gov (United States)

    Winstead, Wayland H.; And Others

An alternative measure for evaluating the performance of academic departments was studied. A comparison was made with the traditional manner of computing unit costs and productivity ratios: prorating the salary and effort of each faculty member to each course level based on the personal mix of courses taught. The alternative method used averaging…

  3. Effectiveness of Multimedia Elements in Computer Supported Instruction: Analysis of Personalization Effects, Students' Performances and Costs

    Science.gov (United States)

    Zaidel, Mark; Luo, XiaoHui

    2010-01-01

    This study investigates the efficiency of multimedia instruction at the college level by comparing the effectiveness of multimedia elements used in the computer supported learning with the cost of their preparation. Among the various technologies that advance learning, instructors and students generally identify interactive multimedia elements as…

  4. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  5. Low-cost addition-subtraction sequences for the final exponentiation computation in pairings

    DEFF Research Database (Denmark)

    Guzmán-Trampe, Juan E; Cruz-Cortéz, Nareli; Dominguez Perez, Luis

    2014-01-01

    In this paper, we address the problem of finding low cost addition–subtraction sequences for situations where a doubling step is significantly cheaper than a non-doubling one. One application of this setting appears in the computation of the final exponentiation step of the reduced Tate pairing d...
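
    A standard construction in exactly this setting is the non-adjacent form (NAF), which rewrites the exponent over digits {-1, 0, +1} so that cheap doublings replace some of the expensive additions/subtractions. The sketch below is a generic illustration with invented unit costs, not the sequences constructed in the paper.

        def naf(n):
            """Non-adjacent form of n: no two adjacent non-zero digits."""
            digits = []
            while n > 0:
                if n % 2:
                    d = 2 - (n % 4)  # digit is +1 or -1
                    n -= d
                else:
                    d = 0
                digits.append(d)
                n //= 2
            return digits[::-1]  # most significant digit first

        def chain_cost(digits, c_double=1.0, c_addsub=3.0):
            """Cost of a left-to-right double-and-add/subtract evaluation."""
            doublings = len(digits) - 1
            addsubs = sum(1 for d in digits if d) - 1  # first non-zero is free
            return doublings * c_double + addsubs * c_addsub

        e = 0b1011101111
        print("plain binary cost:", chain_cost([int(b) for b in bin(e)[2:]]))
        print("NAF cost:         ", chain_cost(naf(e)))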

  6. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    Science.gov (United States)

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site ( http://proteomics.mcw.edu/vipdac ).

  7. Low-cost computer mouse for the elderly or disabled in Taiwan.

    Science.gov (United States)

    Chen, C-C; Chen, W-L; Chen, B-N; Shih, Y-Y; Lai, J-S; Chen, Y-L

    2014-01-01

A mouse is an important communication interface between a human and a computer, but it is still difficult for the elderly or disabled to use. To develop a low-cost computer mouse auxiliary tool. The principal structure of the low-cost mouse auxiliary tool is the IR (infrared ray) array module and the Wii icon sensor module, combined with reflective tape and an SQL Server database. This has several benefits, including cheap hardware cost, fluent control, prompt response, adaptive adjustment and portability. It also carries a game module for training and evaluation, which is very helpful to the trainee for upgrading sensory sensitivity and concentration of attention. In the intervention and maintenance phases, clicking accuracy and time of use reached the level of significance (p < 0.05). The development of the low-cost adaptive computer mouse auxiliary tool was completed during the study and was also verified as having the characteristics of low cost, easy operation and adaptability. The mouse auxiliary tool is suitable for patients with physical disabilities provided they have independent control of some parts of their limbs, i.e. the user only needs to attach the reflective tape to an independently controlled part of the body to operate the mouse auxiliary tool.

  8. Development of a computer program for the cost analysis of spent fuel management

    International Nuclear Information System (INIS)

    Choi, Heui Joo; Lee, Jong Youl; Choi, Jong Won; Cha, Jeong Hun; Whang, Joo Ho

    2009-01-01

    So far, a substantial amount of spent fuel has been generated from the PWR and CANDU reactors. It is being temporarily stored at the nuclear power plant sites. It is expected that the temporary storage facilities will be full of spent fuel by around 2016. The government plans to solve the problem by constructing an interim storage facility soon. The radioactive waste management act was enacted in 2008 to manage spent fuel safely in Korea. According to the act, a radioactive waste management fund has been established, which will be used for the transportation, interim storage, and final disposal of spent fuel. The cost of spent fuel management is remarkably high and involves considerable uncertainty. KAERI and Kyunghee University have developed cost estimation tools to evaluate the cost of spent fuel management based on an engineering design and calculation. It is not easy to develop such a tool while the national policy on spent fuel management has not yet been fixed. Thus, the current version of the computer program is based on the current conceptual design of each management system. The main purpose of this paper is to introduce the computer program developed for the cost analysis of spent fuel management. To show an application of the program, a spent fuel management scenario is prepared and the cost for the scenario is estimated.

  9. PCR diagnosis of tick-borne pathogens in Maharashtra state, India indicates fitness cost associated with carrier infections is greater for crossbreed than native cattle breeds.

    Directory of Open Access Journals (Sweden)

    Sunil W Kolte

    Full Text Available Tick-borne pathogens (TBP) are responsible for significant economic losses to cattle production, globally. This is particularly true in countries like India where TBP constrain rearing of high yielding Bos taurus, as they show susceptibility to acute tick borne disease (TBD), most notably tropical theileriosis caused by Theileria annulata. This has led to a programme of cross breeding Bos taurus (Holstein-Friesian or Jersey) with native Bos indicus (numerous breeds) to generate cattle that are more resistant to disease. However, the cost to fitness of subclinical carrier infection in crossbreeds relative to native breeds is unknown, but could represent a significant hidden economic cost. In this study, a total of 1052 bovine blood samples, together with associated data on host type, sex and body score, were collected from apparently healthy animals in four different agro-climatic zones of Maharashtra state. Samples were screened by PCR for detection of five major TBPs: T. annulata, T. orientalis, B. bigemina, B. bovis and Anaplasma spp. The results demonstrated that single and co-infection with TBP are common, and although differences in pathogen spp. prevalence across the climatic zones were detected, simplistic regression models predicted that host type, sex and location are all likely to impact on prevalence of TBP. In order to remove issues with autocorrelation between variables, a subset of the dataset was modelled to assess any impact of TBP infection on body score of crossbreed versus native breed cattle (breed type). The model showed significant association between infection with TBP (particularly apicomplexan parasites) and poorer body condition for crossbreed animals. These findings indicate potential cost of TBP carrier infection on crossbreed productivity. Thus, there is a case for development of strategies for targeted breeding to combine productivity traits with disease resistance, or to prevent transmission of TBP in India for economic benefit.

  10. PCR diagnosis of tick-borne pathogens in Maharashtra state, India indicates fitness cost associated with carrier infections is greater for crossbreed than native cattle breeds.

    Science.gov (United States)

    Kolte, Sunil W; Larcombe, Stephen D; Jadhao, Suresh G; Magar, Swapnil P; Warthi, Ganesh; Kurkure, Nitin V; Glass, Elizabeth J; Shiels, Brian R

    2017-01-01

    Tick-borne pathogens (TBP) are responsible for significant economic losses to cattle production, globally. This is particularly true in countries like India where TBP constrain rearing of high yielding Bos taurus, as they show susceptibility to acute tick borne disease (TBD), most notably tropical theileriosis caused by Theileria annulata. This has led to a programme of cross breeding Bos taurus (Holstein-Friesian or Jersey) with native Bos indicus (numerous breeds) to generate cattle that are more resistant to disease. However, the cost to fitness of subclinical carrier infection in crossbreeds relative to native breeds is unknown, but could represent a significant hidden economic cost. In this study, a total of 1052 bovine blood samples, together with associated data on host type, sex and body score, were collected from apparently healthy animals in four different agro-climatic zones of Maharashtra state. Samples were screened by PCR for detection of five major TBPs: T. annulata, T. orientalis, B. bigemina, B. bovis and Anaplasma spp. The results demonstrated that single and co-infection with TBP are common, and although differences in pathogen spp. prevalence across the climatic zones were detected, simplistic regression models predicted that host type, sex and location are all likely to impact on prevalence of TBP. In order to remove issues with autocorrelation between variables, a subset of the dataset was modelled to assess any impact of TBP infection on body score of crossbreed versus native breed cattle (breed type). The model showed significant association between infection with TBP (particularly apicomplexan parasites) and poorer body condition for crossbreed animals. These findings indicate potential cost of TBP carrier infection on crossbreed productivity. Thus, there is a case for development of strategies for targeted breeding to combine productivity traits with disease resistance, or to prevent transmission of TBP in India for economic benefit.

  11. A low cost computer-controlled electrochemical measurement system for education and research

    International Nuclear Information System (INIS)

    Cottis, R.A.

    1989-01-01

    With the advent of low cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally, the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons, the user interface was designed to make the internal operation of the unit as obvious as possible, thereby minimizing the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs

  12. A low cost computer-controlled electrochemical measurement system for education and research

    Energy Technology Data Exchange (ETDEWEB)

    Cottis, R A [Manchester Univ. (UK). Inst. of Science and Technology

    1989-01-01

    With the advent of low cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally, the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons, the user interface was designed to make the internal operation of the unit as obvious as possible, thereby minimizing the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs.

  13. Hybrid Cloud Computing Architecture Optimization by Total Cost of Ownership Criterion

    Directory of Open Access Journals (Sweden)

    Elena Valeryevna Makarenko

    2014-12-01

    Full Text Available Achieving the goals of information security is a key factor in the decision to outsource information technology and, in particular, in the decision to migrate organizational data, applications, and other resources to an infrastructure based on cloud computing. The key issue in selecting an optimal architecture and subsequently migrating business applications and data into the cloud-based information environment of an organization is the total cost of ownership of the IT infrastructure. This paper focuses on solving the problem of minimizing the total cost of ownership of a hybrid cloud.

  14. Low cost phantom for computed radiology; Objeto de teste de baixo custo para radiologia computadorizada

    Energy Technology Data Exchange (ETDEWEB)

    Travassos, Paulo Cesar B.; Magalhaes, Luis Alexandre G., E-mail: pctravassos@ufrj.br [Universidade do Estado do Rio de Janeiro (IBRGA/UERJ), RJ (Brazil). Laboratorio de Ciencias Radiologicas; Augusto, Fernando M.; Sant' Yves, Thalis L.A.; Goncalves, Elicardo A.S. [Instituto Nacional de Cancer (INCA), Rio de Janeiro, RJ (Brazil); Botelho, Marina A. [Hospital Universitario Pedro Ernesto (UERJ), Rio de Janeiro, RJ (Brazil)

    2012-08-15

    This article presents the results obtained from a low cost phantom, used to analyze Computed Radiology (CR) equipment. The phantom was constructed to test a few parameters related to image quality, as described in [1-9]. Materials which can be easily purchased were used in the construction of the phantom, with a total cost of approximately US$100.00. A bar pattern was included only to verify the efficacy of the grids in determining spatial resolution, and was not included in the budget because the data were acquired from the grids. (author)

  15. Cost-effective computational method for radiation heat transfer in semi-crystalline polymers

    Science.gov (United States)

    Boztepe, Sinan; Gilblas, Rémi; de Almeida, Olivier; Le Maoult, Yannick; Schmidt, Fabrice

    2018-05-01

    This paper introduces a cost-effective numerical model for infrared (IR) heating of semi-crystalline polymers. For the numerical and experimental studies presented here, semi-crystalline polyethylene (PE) was used. The optical properties of PE were experimentally analyzed under varying temperature, and the results were used as input for the numerical studies. The model assumes an optically homogeneous medium while taking into account the strong variation of the thermo-optical properties of semi-crystalline PE during heating. Thus, the change in the amount of radiative energy absorbed by the PE medium, induced by its temperature-dependent thermo-optical properties, was introduced into the model. The computational study was carried out as an iterative closed loop: the absorbed radiation is computed using an in-house radiation heat transfer algorithm, RAYHEAT, and the result is transferred to the commercial software COMSOL Multiphysics, which solves the transient heat transfer problem to predict the temperature field. The predicted temperature field is then used to update the temperature-dependent thermo-optical properties of PE. To analyze the accuracy of the numerical model, experimental IR-thermographic measurements were carried out during the heating of a PE plate. The applicability of the model in terms of computational cost, number of numerical inputs and accuracy is highlighted.
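
    A minimal sketch of the closed-loop coupling described above, with made-up zero-dimensional physics standing in for RAYHEAT and COMSOL: the absorbed radiative power is recomputed from a temperature-dependent absorption coefficient at every step, so the optical properties and the temperature field iterate together. All names, values and the absorption law are assumptions.

        # Closed-loop sketch: temperature-dependent optical properties feed
        # back into the absorbed radiation at each time step (0-D stand-in
        # for the RAYHEAT/COMSOL coupling; all values are assumptions).

        def absorption_coeff(T):
            # assumed linear drop in absorptivity as crystallinity melts out
            return max(0.2, 0.9 - 0.002 * (T - 300.0))

        def heat_plate(T0=300.0, irradiance=5e3, thickness=1e-4,
                       rho_cp=2.0e6, dt=0.1, steps=200):
            """Explicit time stepping of a thin PE plate under IR heating."""
            T = T0
            for _ in range(steps):
                a = absorption_coeff(T)                    # update optical properties
                absorbed = a * irradiance                  # W/m^2 absorbed at current T
                T += dt * absorbed / (rho_cp * thickness)  # lumped-capacity update
            return T

        print(f"final temperature: {heat_plate():.1f} K")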

  16. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Science.gov (United States)

    Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055

  17. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Ginés D. Guerrero

    2014-01-01

    Full Text Available Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor.

  18. Low-cost autonomous perceptron neural network inspired by quantum computation

    Science.gov (United States)

    Zidan, Mohammed; Abdel-Aty, Abdel-Haleem; El-Sadek, Alaa; Zanaty, E. A.; Abdel-Aty, Mahmoud

    2017-11-01

    Achieving low-cost learning with reliable accuracy is an important goal for intelligent machines, in order to save time and energy and to allow learning on machines with limited computational resources. In this paper, we propose an efficient algorithm for a perceptron neural network inspired by quantum computing, composed of a single neuron, that classifies linearly separable applications after a single training iteration, O(1). The algorithm is applied to a real-world data set, and the results outperform the other state-of-the-art algorithms.
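
    The quantum-inspired single-iteration algorithm itself is not spelled out in the abstract; as a point of reference only, the sketch below implements the classical single-neuron perceptron baseline on a small linearly separable set (all data and settings are illustrative).

        # Classical single-neuron perceptron baseline (not the paper's
        # quantum-inspired method) on a linearly separable toy problem.
        import numpy as np

        def perceptron(X, y, epochs=10, lr=1.0):
            w = np.zeros(X.shape[1] + 1)
            Xb = np.hstack([X, np.ones((len(X), 1))])   # bias column
            for _ in range(epochs):
                for xi, yi in zip(Xb, y):
                    if yi * (w @ xi) <= 0:              # misclassified -> update
                        w += lr * yi * xi
            return w

        X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
        y = np.array([-1, -1, -1, 1])                   # AND-like labels, separable
        w = perceptron(X, y)
        print(np.sign(np.hstack([X, np.ones((4, 1))]) @ w))   # [-1. -1. -1.  1.]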

  19. The Optimal Pricing of Computer Software and Other Products with High Switching Costs

    OpenAIRE

    Pekka Ahtiala

    2004-01-01

    The paper studies the determinants of the optimum prices of computer programs and their upgrades. It is based on the notion that because of the human capital invested in the use of a computer program by its user, this product has high switching costs, and on the finding that pirates are responsible for generating over 80 per cent of new software sales. A model to maximize the present value of the program to the program house is constructed to determine the optimal prices of initial programs a...

  20. Computational cost for detecting inspiralling binaries using a network of laser interferometric detectors

    International Nuclear Information System (INIS)

    Pai, Archana; Bose, Sukanta; Dhurandhar, Sanjeev

    2002-01-01

    We extend a coherent network data-analysis strategy developed earlier for detecting Newtonian waveforms to the case of post-Newtonian (PN) waveforms. Since the PN waveform depends on the individual masses of the inspiralling binary, the parameter-space dimension increases by one from that of the Newtonian case. We obtain the number of templates and estimate the computational costs for PN waveforms: for a lower mass limit of 1 M⊙, for LIGO-I noise and with 3% maximum mismatch, the online computational speed requirement for a single detector is a few Gflops; for a two-detector network it is hundreds of Gflops and for a three-detector network it is tens of Tflops. Apart from idealistic networks, we obtain results for realistic networks comprising LIGO and VIRGO. Finally, we compare costs incurred in a coincidence detection strategy with those incurred in the coherent strategy detailed above.

  1. Computational cost for detecting inspiralling binaries using a network of laser interferometric detectors

    CERN Document Server

    Pai, A; Dhurandhar, S V

    2002-01-01

    We extend a coherent network data-analysis strategy developed earlier for detecting Newtonian waveforms to the case of post-Newtonian (PN) waveforms. Since the PN waveform depends on the individual masses of the inspiralling binary, the parameter-space dimension increases by one from that of the Newtonian case. We obtain the number of templates and estimate the computational costs for PN waveforms: for a lower mass limit of 1 M⊙, for LIGO-I noise and with 3% maximum mismatch, the online computational speed requirement for a single detector is a few Gflops; for a two-detector network it is hundreds of Gflops and for a three-detector network it is tens of Tflops. Apart from idealistic networks, we obtain results for realistic networks comprising LIGO and VIRGO. Finally, we compare costs incurred in a coincidence detection strategy with those incurred in the coherent strategy detailed above.

  2. POPCYCLE: a computer code for calculating nuclear and fossil plant levelized life-cycle power costs

    International Nuclear Information System (INIS)

    Hardie, R.W.

    1982-02-01

    POPCYCLE, a computer code designed to calculate levelized life-cycle power costs for nuclear and fossil electrical generating plants, is described. Included are (1) derivations of the equations and a discussion of the methodology used by POPCYCLE, (2) a description of the input required by the code, (3) a listing of the input for a sample case, and (4) the output for a sample case.
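
    POPCYCLE's actual equations are derived in the report itself; the sketch below shows only the generic levelization arithmetic such a code rests on (present-value costs divided by present-value generation), with made-up plant data.

        # Generic levelized life-cycle cost arithmetic (illustrative data,
        # not POPCYCLE's equations or inputs).

        def levelized_cost(annual_costs, annual_energy, discount_rate):
            """Present-value costs divided by present-value energy ($/kWh)."""
            pv_cost = sum(c / (1 + discount_rate) ** t
                          for t, c in enumerate(annual_costs, start=1))
            pv_energy = sum(e / (1 + discount_rate) ** t
                            for t, e in enumerate(annual_energy, start=1))
            return pv_cost / pv_energy

        costs = [5e8] + [8e7] * 29      # construction year, then O&M + fuel ($)
        energy = [0.0] + [7.0e9] * 29   # kWh generated per year
        print(f"{levelized_cost(costs, energy, 0.05):.4f} $/kWh")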

  3. Reduced computational cost in the calculation of worst case response time for real time systems

    OpenAIRE

    Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo

    2009-01-01

    Modern Real Time Operating Systems require low computational overheads even though microprocessors become more powerful each day. Real Time Operating Systems for embedded systems usually have advanced features to administer the resources of the applications that they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks ...
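
    For context (this is the standard analysis whose computational cost the paper aims to reduce, not the paper's improved method): the Worst Case Response Time of a fixed-priority task is the fixed point of R = C_i + sum over higher-priority tasks j of ceil(R/T_j)*C_j, computable by iteration as sketched below.

        # Standard WCRT fixed-point iteration for fixed-priority scheduling.
        import math

        def wcrt(tasks, i):
            """Worst Case Response Time of tasks[i]; tasks are (C, T) pairs
            sorted from highest to lowest priority, deadline assumed = period."""
            C, T = tasks[i]
            R = C
            while True:
                interference = sum(math.ceil(R / Tj) * Cj for Cj, Tj in tasks[:i])
                R_next = C + interference
                if R_next == R:
                    return R        # converged: task is schedulable
                if R_next > T:
                    return None     # exceeds its period: unschedulable
                R = R_next

        tasks = [(1, 4), (2, 6), (3, 13)]   # (computation time, period)
        print([wcrt(tasks, i) for i in range(len(tasks))])   # [1, 3, 10]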

  4. Greater oil investment opportunities

    International Nuclear Information System (INIS)

    Arenas, Ismael Enrique

    1997-01-01

    Geologically speaking, Colombia is a very attractive country for the world oil community. In line with this philosophy, new and important steps are being taken to reinforce the oil sector: expansion of the exploratory frontier to include a larger number of sedimentary areas, and the adoption of innovative contracting instruments. Colombia has to offer greater economic incentives for the exploration of new areas to expand the exploratory frontier, and stimulation of exploration in areas prospective for small fields. Companies may offer Ecopetrol a participation in production over and above royalties, without its participating in the investments and costs of these fields, as well as more favorable conditions for natural gas projects compared with the terms governing oil.

  5. A practical technique for benefit-cost analysis of computer-aided design and drafting systems

    International Nuclear Information System (INIS)

    Shah, R.R.; Yan, G.

    1979-03-01

    Analysis of benefits and costs associated with the operation of Computer-Aided Design and Drafting Systems (CADDS) is needed to derive economic justification for acquiring new systems, as well as to evaluate the performance of existing installations. In practice, however, such analyses are difficult to perform since most technical and economic advantages of CADDS are "irreducibles", i.e. cannot be readily translated into monetary terms. In this paper, a practical technique for economic analysis of CADDS in a drawing office environment is presented. A "worst case" approach is taken since increase in productivity of existing manpower is the only benefit considered, while all foreseen costs are taken into account. Methods of estimating benefits and costs are described. The procedure for performing the analysis is illustrated by a case study based on the drawing office activities at Atomic Energy of Canada Limited. (auth)

  6. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Science.gov (United States)

    2010-01-01

    ... Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions (a... loan cost rate for various transactions, as well as instructions, explanations, and examples for.... (2) Term of the transaction. For purposes of total annual loan cost disclosures, the term of a...

  7. COMPUTER SYSTEM FOR DETERMINATION OF COST DAILY SUGAR PRODUCTION AND INCIDENTS DECISIONS FOR COMPANIES SUGAR (SACODI

    Directory of Open Access Journals (Sweden)

    Alejandro Álvarez-Navarro

    2016-01-01

    Full Text Available The process of sugar production is complex; anything that affects this chain has direct repercussions on the cost of sugar production, the synthetic and decisive indicator for decision making. Currently, Cuban sugar factories determine this cost weekly, which hampers their decision-making process. Looking for solutions to this problem, the present work, part of a territorial project approved by CITMA, calculates the cost of production daily, weekly, monthly and cumulatively to a given date, following an adaptation of the methodology used by the National Costs System for sugarcane created by MINAZ. It is supported by a computer system denominated SACODI. This adaptation registers the physical and economic indicators of all direct and indirect expenses of sugarcane and, from this information, generates an economic-mathematical goal-programming model whose solution indicates the best short-term balance of sugar output across the entities of the sugar factory. The implementation of the system in the «Julio A. Mella» sugar factory in Santiago de Cuba during the 2008-09 sugarcane campaign produced an estimated cost decrease of up to 3.5% through better decision making.

  8. Cost-effectiveness of implementing computed tomography screening for lung cancer in Taiwan.

    Science.gov (United States)

    Yang, Szu-Chun; Lai, Wu-Wei; Lin, Chien-Chung; Su, Wu-Chou; Ku, Li-Jung; Hwang, Jing-Shiang; Wang, Jung-Der

    2017-06-01

    A screening program for lung cancer requires more empirical evidence. Based on the experience of the National Lung Screening Trial (NLST), we developed a method to adjust lead-time bias and quality-of-life changes for estimating the cost-effectiveness of implementing computed tomography (CT) screening in Taiwan. The target population was high-risk (≥30 pack-years) smokers between 55 and 75 years of age. From a nation-wide, 13-year follow-up cohort, we estimated quality-adjusted life expectancy (QALE), loss-of-QALE, and lifetime healthcare expenditures per case of lung cancer stratified by pathology and stage. Cumulative stage distributions for CT-screening and no-screening were assumed equal to those for CT-screening and radiography-screening in the NLST to estimate the savings of loss-of-QALE and additional costs of lifetime healthcare expenditures after CT screening. Costs attributable to screen-negative subjects, false-positive cases and radiation-induced lung cancer were included to obtain the incremental cost-effectiveness ratio from the public payer's perspective. The incremental costs were US$22,755 per person. After dividing this by savings of loss-of-QALE (1.16 quality-adjusted life year (QALY)), the incremental cost-effectiveness ratio was US$19,683 per QALY. This ratio would fall to US$10,947 per QALY if the stage distribution for CT-screening was the same as that of screen-detected cancers in the NELSON trial. Low-dose CT screening for lung cancer among high-risk smokers would be cost-effective in Taiwan. As only about 5% of our women are smokers, future research is necessary to identify the high-risk groups among non-smokers and increase the coverage. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
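
    Restating the ratio above as the standard incremental cost-effectiveness computation (the quoted US$19,683 per QALY presumably uses the unrounded QALY savings; with the rounded 1.16 QALY figure the arithmetic gives approximately):

        \[
        \mathrm{ICER} \;=\; \frac{\Delta C}{\Delta E}
        \;=\; \frac{\text{US}\$22{,}755}{1.16\ \text{QALY}}
        \;\approx\; \text{US}\$19{,}600 \text{ per QALY}
        \]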

  9. MONITOR: A computer model for estimating the costs of an integral monitored retrievable storage facility

    International Nuclear Information System (INIS)

    Reimus, P.W.; Sevigny, N.L.; Schutz, M.E.; Heller, R.A.

    1986-12-01

    The MONITOR model is a FORTRAN 77 based computer code that provides parametric life-cycle cost estimates for a monitored retrievable storage (MRS) facility. MONITOR is very flexible in that it can estimate the costs of an MRS facility operating under almost any conceivable nuclear waste logistics scenario. The model can also accommodate input data of varying degrees of complexity and detail (ranging from very simple to more complex), which makes it ideal for use in the MRS program, where new designs and new cost data are frequently offered for consideration. MONITOR can be run as an independent program, or it can be interfaced with the Waste System Transportation and Economic Simulation (WASTES) model, a program that simulates the movement of waste through a complete nuclear waste disposal system. The WASTES model drives the MONITOR model by providing it with the annual quantities of waste that are received, stored, and shipped at the MRS facility. Three runs of MONITOR are documented in this report. Two of the runs are for Version 1 of the MONITOR code, a simulation that uses the costs developed by the Ralph M. Parsons Company in the 2A (backup) version of the MRS cost estimate. In one of these runs MONITOR was run as an independent model, and in the other run MONITOR was run using an input file generated by the WASTES model. The two runs correspond to identical cases, and the fact that they gave identical results verified that the code performed the same calculations in both modes of operation. The third run was made for Version 2 of the MONITOR code, a simulation that uses the costs developed by the Ralph M. Parsons Company in the 2B (integral) version of the MRS cost estimate. This run was made with MONITOR being run as an independent model. The results of several cases have been verified by hand calculations.

  10. Cost-Effectiveness of Computed Tomographic Colonography: A Prospective Comparison with Colonoscopy

    International Nuclear Information System (INIS)

    Arnesen, R.B.; Ginnerup-Pedersen, B.; Poulsen, P.B.; Benzon, K. von; Adamsen, S.; Laurberg, S.; Hart-Hansen, O.

    2007-01-01

    Purpose: To estimate the cost-effectiveness of detecting colorectal polyps with computed tomographic colonography (CTC) and subsequent polypectomy with primary colonoscopy (CC), using CC as the alternative strategy. Material and Methods: A marginal analysis was performed regarding 103 patients who had had CTC prior to same-day CC at two hospitals, H-I (n = 53) and H-II (n = 50). The patients were randomly chosen from surveillance and symptomatic study populations (148 at H-I and 231 at H-II). Populations, organizations, and procedures were compared. Cost data on time consumption, medication, and minor equipment were collected prospectively, while data on salaries and major equipment were collected retrospectively. The effect was the (previously published) sensitivities of CTC and CC for detection of colorectal polyps ≥6 mm (H-I, n = 148) or ≥5 mm (H-II, n = 231). Results: Thirteen patients at each center had at least one colorectal polyp ≥6 mm or ≥5 mm. CTC was the cost-effective alternative at H-I (Euro 187 vs. Euro 211), while CC was the cost-effective alternative at H-II (Euro 239 vs. Euro 192). The cost-effectiveness (costs per finding) mainly depended on the sensitivity of CTC and CC, but the depreciation of equipment and the staff's use of time were highly influential as well. Conclusion: Detection of colorectal polyps ≥6 mm or ≥5 mm with CTC, followed by polypectomy by CC, can be performed cost-effectively at some institutions with the appropriate hardware and organization.

  11. Comparison of different strategies in prenatal screening for Down's syndrome: cost effectiveness analysis of computer simulation.

    Science.gov (United States)

    Gekas, Jean; Gagné, Geneviève; Bujold, Emmanuel; Douillard, Daniel; Forest, Jean-Claude; Reinharz, Daniel; Rousseau, François

    2009-02-13

    To assess and compare the cost effectiveness of three different strategies for prenatal screening for Down's syndrome (integrated test, sequential screening, and contingent screening) and to determine the most useful cut-off values for risk. Computer simulations to study integrated, sequential, and contingent screening strategies with various cut-offs leading to 19 potential screening algorithms. The computer simulation was populated with data from the Serum Urine and Ultrasound Screening Study (SURUSS), real unit costs for healthcare interventions, and a population of 110 948 pregnancies from the province of Québec for the year 2001. Cost effectiveness ratios, incremental cost effectiveness ratios, and screening options' outcomes. The contingent screening strategy dominated all other screening options: it had the best cost effectiveness ratio ($C26,833 per case of Down's syndrome) with fewer procedure related euploid miscarriages and unnecessary terminations (respectively, 6 and 16 per 100,000 pregnancies). It also outperformed serum screening at the second trimester. In terms of the incremental cost effectiveness ratio, contingent screening was still dominant: compared with screening based on maternal age alone, the savings were $C30,963 per additional birth with Down's syndrome averted. Contingent screening was the only screening strategy that offered early reassurance to the majority of women (77.81%) in the first trimester and minimised costs by limiting retesting during the second trimester (21.05%). For the contingent and sequential screening strategies, the choice of cut-off value for risk in the first trimester test significantly affected the cost effectiveness ratios (respectively, from $C26,833 to $C37,260 and from $C35,215 to $C45,314 per case of Down's syndrome), the number of procedure related euploid miscarriages (from 6 to 46 and from 6 to 45 per 100,000 pregnancies), and the number of unnecessary terminations (from 16 to 26 and from 16 to 25 per 100,000 pregnancies).

  12. Computer-Aided Surgical Simulation in Head and Neck Reconstruction: A Cost Comparison among Traditional, In-House, and Commercial Options.

    Science.gov (United States)

    Li, Sean S; Copeland-Halperin, Libby R; Kaminsky, Alexander J; Li, Jihui; Lodhi, Fahad K; Miraliakbari, Reza

    2018-06-01

    Computer-aided surgical simulation (CASS) has redefined surgery, improved precision and reduced the reliance on intraoperative trial-and-error manipulations. CASS is provided by third-party services; however, it may be cost-effective for some hospitals to develop in-house programs. This study provides the first cost analysis comparison among traditional (no CASS), commercial CASS, and in-house CASS for head and neck reconstruction.  The costs of three-dimensional (3D) pre-operative planning for mandibular and maxillary reconstructions were obtained from an in-house CASS program at our large tertiary care hospital in Northern Virginia, as well as a commercial provider (Synthes, Paoli, PA). A cost comparison was performed among these modalities and extrapolated in-house CASS costs were derived. The calculations were based on estimated CASS use with cost structures similar to our institution's, and sunk costs were amortized over 10 years.  Average operating room time was estimated at 10 hours, with an average of 2 hours saved with CASS. The hourly cost to the hospital for the operating room (including anesthesia and other ancillary costs) was estimated at $4,614/hour. Per case, traditional cases were $46,140, commercial CASS cases were $40,951, and in-house CASS cases were $38,212. Annual in-house CASS costs were $39,590.  CASS reduced operating room time, likely due to improved efficiency and accuracy. Our data demonstrate that hospitals with a cost structure similar to ours that perform more than 27 cases of 3D head and neck reconstruction per year can see a financial benefit from developing an in-house CASS program.
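
    As a consistency check on the figures above (derived arithmetic, not stated in the paper): ten operating-room hours at US$4,614/hour give the traditional per-case cost, the two hours saved leave eight hours with CASS, and the quoted per-case totals then imply roughly US$4,039 of commercial-service cost and US$1,300 of in-house cost per case:

        \[
        10 \times 4{,}614 = 46{,}140, \qquad 8 \times 4{,}614 = 36{,}912,
        \]
        \[
        40{,}951 - 36{,}912 = 4{,}039, \qquad 38{,}212 - 36{,}912 = 1{,}300.
        \]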

  13. Cost-effectiveness of routine computed tomography in the evaluation of idiopathic unilateral vocal fold paralysis.

    Science.gov (United States)

    Hojjat, Houmehr; Svider, Peter F; Folbe, Adam J; Raza, Syed N; Carron, Michael A; Shkoukani, Mahdi A; Merati, Albert L; Mayerhoff, Ross M

    2017-02-01

    To evaluate the cost-effectiveness of routine computed tomography (CT) in individuals with unilateral vocal fold paralysis (UVFP). STUDY DESIGN: Health economics decision tree analysis. METHODS: A decision tree was constructed to determine the incremental cost-effectiveness ratio (ICER) of CT imaging in UVFP patients. Univariate sensitivity analysis was utilized to calculate what the probability of having an etiology of the paralysis discovered would have to be to make CT with contrast more cost-effective than no imaging. We used two studies examining findings in UVFP patients. The decision pathways were CT of the neck with intravenous contrast after diagnostic laryngoscopy versus laryngoscopy alone. The probability of detecting an etiology for UVFP and the associated costs were extracted to construct the decision tree. The only incorrect diagnosis was missing a mass in the no-imaging decision branch, which rendered an effectiveness of 0. The ICER of using CT was $3,306, below most acceptable willingness-to-pay (WTP) thresholds. Additionally, univariate sensitivity analysis indicated that at the WTP threshold of $30,000, obtaining CT imaging was the most cost-effective choice when the probability of having a lesion was above 1.7%. Multivariate probabilistic sensitivity analysis with Monte Carlo simulations also showed that at the WTP of $30,000, CT scanning is more cost-effective, with 99.5% certainty. Particularly in the current healthcare environment, characterized by increasing attention to utilization and defensive medicine, economic evaluations represent evidence-based findings that can be employed to facilitate appropriate decision making and enhance physician-patient communication. This economic evaluation strongly supports obtaining CT imaging in patients with newly diagnosed UVFP. 2c. Laryngoscope, 2016 127:440-444, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
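
    A minimal decision-tree sketch of the comparison structure described above (probabilities and costs are placeholders, not the study's inputs; as in the study, the only incorrect diagnosis is a missed mass, which scores an effectiveness of 0):

        # Toy two-strategy decision tree: CT always finds a causative mass,
        # no-imaging misses it with probability p_lesion. All numbers are
        # placeholders, not the study's inputs.

        def icer(cost_a, eff_a, cost_b, eff_b):
            """Incremental cost-effectiveness ratio of strategy A over B."""
            return (cost_a - cost_b) / (eff_a - eff_b)

        p_lesion = 0.05                  # assumed probability of a causative mass
        cost_ct, cost_none = 400.0, 0.0  # assumed strategy costs
        eff_ct = 1.0                     # CT never misses in this toy model
        eff_none = 1.0 - p_lesion        # a missed mass scores effectiveness 0

        ratio = icer(cost_ct, eff_ct, cost_none, eff_none)
        print(ratio)                     # 8000.0 per correct diagnosis
        print(ratio < 30_000)            # below the WTP threshold: CT preferred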

  14. Social incidence and economic costs of carbon limits; A computable general equilibrium analysis for Switzerland

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, G.; Van Nieuwkoop, R.; Wiedmer, T. (Institute for Applied Microeconomics, Univ. of Bern (Switzerland))

    1992-01-01

    Both the distributional and allocational effects of limiting carbon dioxide emissions in a small and open economy are discussed. The paper starts from the assumption that Switzerland attempts to stabilize its greenhouse gas emissions over the next 25 years, and evaluates the costs and benefits of the respective reduction programme. From a methodological viewpoint, it illustrates how a computable general equilibrium approach can be adopted to identify the economic effects of cutting greenhouse gas emissions at the national level. From a political economy point of view, it considers the social incidence of a greenhouse policy. It shows in particular that public acceptance can be increased and the economic costs of greenhouse policies reduced if carbon taxes are accompanied by revenue redistribution. 8 tabs., 1 app., 17 refs.

  15. On the role of cost-sensitive learning in multi-class brain-computer interfaces.

    Science.gov (United States)

    Devlaminck, Dieter; Waegeman, Willem; Wyns, Bart; Otte, Georges; Santens, Patrick

    2010-06-01

    Brain-computer interfaces (BCIs) present an alternative way of communication for people with severe disabilities. One of the shortcomings in current BCI systems, recently put forward in the fourth BCI competition, is the asynchronous detection of motor imagery versus resting state. We investigated this extension to the three-class case, in which the resting state is considered virtually lying between two motor classes, resulting in a large penalty when one motor task is misclassified into the other motor class. We particularly focus on the behavior of different machine-learning techniques and on the role of multi-class cost-sensitive learning in such a context. To this end, four different kernel methods are empirically compared, namely pairwise multi-class support vector machines (SVMs), two cost-sensitive multi-class SVMs and kernel-based ordinal regression. The experimental results illustrate that ordinal regression performs better than the other three approaches when a cost-sensitive performance measure such as the mean-squared error is considered. By contrast, multi-class cost-sensitive learning enables us to control the number of large errors made between two motor tasks.
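
    A minimal sketch of the cost structure described above, assuming a hypothetical class coding 0 = left-hand motor imagery, 1 = resting state, 2 = right-hand motor imagery: with rest placed between the motor classes, mean squared error on the class indices penalizes a motor-to-motor confusion (squared error 4) more than a motor-to-rest confusion (squared error 1), which plain accuracy cannot express.

        # Ordinal coding puts "rest" between the two motor-imagery classes,
        # so MSE acts as the cost-sensitive performance measure.
        import numpy as np

        y_true = np.array([0, 0, 1, 1, 2, 2])
        y_pred = np.array([0, 2, 1, 0, 2, 1])   # one motor->motor, two one-step errors

        print(np.mean((y_true - y_pred) ** 2))  # MSE = 1.0
        print(np.mean(y_true == y_pred))        # accuracy ~ 0.67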

  16. Computational Comparison of Several Greedy Algorithms for the Minimum Cost Perfect Matching Problem on Large Graphs

    DEFF Research Database (Denmark)

    Wøhlk, Sanne; Laporte, Gilbert

    2017-01-01

    The aim of this paper is to computationally compare several algorithms for the Minimum Cost Perfect Matching Problem on an undirected complete graph. Our work is motivated by the need to solve large instances of the Capacitated Arc Routing Problem (CARP) arising in the optimization of garbage collection in Denmark. Common heuristics for the CARP involve the optimal matching of the odd-degree nodes of a graph. The algorithms used in the comparison include the CPLEX solution of an exact formulation, the LEDA matching algorithm, a recent implementation of the Blossom algorithm, as well as six...
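
    For illustration (a generic greedy variant, not necessarily one of the six implementations compared in the paper): repeatedly matching the cheapest remaining pair gives a simple O(N² log N) heuristic for minimum-cost perfect matching on a complete graph.

        # Greedy heuristic: sort all pairs by cost, then match greedily.
        import itertools

        def greedy_matching(nodes, dist):
            """nodes: even-sized list; dist(u, v): edge cost."""
            edges = sorted(itertools.combinations(nodes, 2),
                           key=lambda e: dist(*e))
            matched, matching = set(), []
            for u, v in edges:
                if u not in matched and v not in matched:
                    matched.update((u, v))
                    matching.append((u, v))
            return matching

        pts = [(0, 0), (1, 0), (5, 5), (5, 6)]
        dist = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
        print(greedy_matching(pts, dist))   # [((0, 0), (1, 0)), ((5, 5), (5, 6))]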

  17. Expedited Holonomic Quantum Computation via Net Zero-Energy-Cost Control in Decoherence-Free Subspace.

    Science.gov (United States)

    Pyshkin, P V; Luo, Da-Wei; Jing, Jun; You, J Q; Wu, Lian-Ao

    2016-11-25

    Holonomic quantum computation (HQC) may not show its full potential in quantum speedup due to the prerequisite of a long coherent runtime imposed by the adiabatic condition. Here we show that the conventional HQC can be dramatically accelerated by using external control fields, of which the effectiveness is exclusively determined by the integral of the control fields in the time domain. This control scheme can be realized with net zero energy cost and it is fault-tolerant against fluctuation and noise, significantly relaxing the experimental constraints. We demonstrate how to realize the scheme via decoherence-free subspaces. In this way we unify quantum robustness merits of this fault-tolerant control scheme, the conventional HQC and decoherence-free subspace, and propose an expedited holonomic quantum computation protocol.

  18. Cost-effective computations with boundary interface operators in elliptic problems

    International Nuclear Information System (INIS)

    Khoromskij, B.N.; Mazurkevich, G.E.; Nikonov, E.G.

    1993-01-01

    The numerical algorithm for fast computations with interface operators associated with elliptic boundary value problems (BVP) defined on step-type domains is presented. The algorithm is based on the asymptotically almost optimal technique developed for the treatment of the discrete Poincare-Steklov (PS) operators associated with the finite-difference Laplacian on rectangles when using a uniform grid with a 'displacement by h/2'. The approach can be regarded as an extension of the method proposed for the partial solution of the finite-difference Laplace equation to the case of displaced grids and mixed boundary conditions. It is shown that the action of the PS operator for the Dirichlet problem and mixed BVP can be computed with expenses of the order of O(N log₂ N) both for arithmetical operations and computer memory needs, where N is the number of unknowns on the rectangle boundary. The single domain algorithm is applied to solving multidomain elliptic interface problems with piecewise constant coefficients. The numerical experiments presented confirm almost linear growth of the computational costs and memory needs with respect to the dimension of the discrete interface problem. 14 refs., 3 figs., 4 tabs

  19. Direct costs and cost-effectiveness of dual-source computed tomography and invasive coronary angiography in patients with an intermediate pretest likelihood for coronary artery disease.

    Science.gov (United States)

    Dorenkamp, Marc; Bonaventura, Klaus; Sohns, Christian; Becker, Christoph R; Leber, Alexander W

    2012-03-01

    The study aims to determine the direct costs and comparative cost-effectiveness of latest-generation dual-source computed tomography (DSCT) and invasive coronary angiography for diagnosing coronary artery disease (CAD) in patients suspected of having this disease. The study was based on a previously elaborated cohort with an intermediate pretest likelihood for CAD and on complementary clinical data. Cost calculations were based on a detailed analysis of direct costs, and generally accepted accounting principles were applied. Based on Bayes' theorem, a mathematical model was used to compare the cost-effectiveness of both diagnostic approaches. Total costs included direct costs, induced costs and costs of complications. Effectiveness was defined as the ability of a diagnostic test to accurately identify a patient with CAD. Direct costs amounted to €98.60 for DSCT and to €317.75 for invasive coronary angiography. Analysis of model calculations indicated that cost-effectiveness grew hyperbolically with increasing prevalence of CAD. Given the prevalence of CAD in the study cohort (24%), DSCT was found to be more cost-effective than invasive coronary angiography (€970 vs €1354 for one patient correctly diagnosed as having CAD). At a disease prevalence of 49%, DSCT and invasive angiography were equally effective with costs of €633. Above a threshold value of disease prevalence of 55%, proceeding directly to invasive coronary angiography was more cost-effective than DSCT. With proper patient selection and consideration of disease prevalence, DSCT coronary angiography is cost-effective for diagnosing CAD in patients with an intermediate pretest likelihood for it. However, the range of eligible patients may be smaller than previously reported.
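
    A sketch of the prevalence dependence described above, with the DSCT pathway modeled as CT followed by invasive confirmation of positives. The sensitivity/specificity values and the omission of induced and complication costs are assumptions, so the crossover point will not exactly match the study's 49-55% range; only the direct costs are taken from the abstract.

        # Cost per correctly identified CAD patient vs. disease prevalence.
        # Direct costs from the abstract; sens/spec are assumed placeholders.

        def dsct_pathway(prev, sens=0.96, spec=0.90, c_ct=98.60, c_inv=317.75):
            p_pos = prev * sens + (1 - prev) * (1 - spec)  # positives go on to angiography
            return (c_ct + p_pos * c_inv) / (prev * sens)

        def invasive_pathway(prev, c_inv=317.75):
            return c_inv / prev            # invasive angiography identifies all CAD

        for prev in (0.24, 0.49, 0.70):    # cost-effectiveness grows ~1/prevalence
            print(prev, round(dsct_pathway(prev)), round(invasive_pathway(prev)))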

  20. Cost-effectiveness of computer-assisted training in cognitive-behavioral therapy as an adjunct to standard care for addiction.

    Science.gov (United States)

    Olmstead, Todd A; Ostrow, Cary D; Carroll, Kathleen M

    2010-08-01

    To determine the cost-effectiveness, from clinic and patient perspectives, of a computer-based version of cognitive-behavioral therapy (CBT4CBT) as an addition to regular clinical practice for substance dependence. PARTICIPANTS, DESIGN AND MEASUREMENTS: This cost-effectiveness study is based on a randomized clinical trial in which 77 individuals seeking treatment for substance dependence at an outpatient community setting were randomly assigned to treatment as usual (TAU) or TAU plus biweekly access to computer-based training in CBT (TAU plus CBT4CBT). The primary patient outcome measure was the total number of drug-free specimens provided during treatment. Incremental cost-effectiveness ratios (ICERs) and cost-effectiveness acceptability curves (CEACs) were used to determine the cost-effectiveness of TAU plus CBT4CBT relative to TAU alone. Results are presented from both the clinic and patient perspectives and are shown to be robust to (i) sensitivity analyses and (ii) a secondary objective patient outcome measure. The per patient cost of adding CBT4CBT to standard care was $39 ($27) from the clinic (patient) perspective. From the clinic (patient) perspective, TAU plus CBT4CBT is likely to be cost-effective when the threshold value to decision makers of an additional drug-free specimen is greater than approximately $21 ($15), and TAU alone is likely to be cost-effective when the threshold value is less than approximately $21 ($15). The ICERs for TAU plus CBT4CBT also compare favorably to ICERs reported elsewhere for other empirically validated therapies, including contingency management. TAU plus CBT4CBT appears to be a good value from both the clinic and patient perspectives. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  1. Feasibility Study and Cost Benefit Analysis of Thin-Client Computer System Implementation Onboard United States Navy Ships

    National Research Council Canada - National Science Library

    Arbulu, Timothy D; Vosberg, Brian J

    2007-01-01

    The purpose of this MBA project was to conduct a feasibility study and a cost benefit analysis of using thin-client computer systems instead of traditional networks onboard United States Navy ships...

  2. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej

    2014-06-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one dimensional problems, O(N p^2) for two dimensional problems, and O(N^(4/3) p^2) for three dimensional problems, where N is the number of degrees of freedom, and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those corresponding to the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one dimensional case, O(N^1.5 p^3) for the two dimensional case, and O(N^2 p^3) for the three dimensional case. The shared memory version significantly reduces the cost in terms of both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions. © 2014 Elsevier Ltd. All rights reserved.
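
    The quoted estimates can be tabulated directly; the sketch below compares the sequential and ideal shared-memory parallel scalings (constants are omitted, so only the ratios are meaningful):

        # Asymptotic cost ratios: sequential / ideal shared-memory parallel.
        from math import log

        def parallel_cost(N, p, dim):
            return {1: p**2 * log(N / p),
                    2: N * p**2,
                    3: N**(4/3) * p**2}[dim]

        def sequential_cost(N, p, dim):
            return {1: N * p**2,
                    2: N**1.5 * p**3,
                    3: N**2 * p**3}[dim]

        N, p = 10**6, 3
        for dim in (1, 2, 3):
            print(dim, sequential_cost(N, p, dim) / parallel_cost(N, p, dim))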

  3. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej; Kuźnik, Krzysztof M.; Paszyński, Maciej R.; Calo, Victor M.; Pardo, D.

    2014-01-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one dimensional problems, O(N p^2) for two dimensional problems, and O(N^(4/3) p^2) for three dimensional problems, where N is the number of degrees of freedom, and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those corresponding to the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one dimensional case, O(N^1.5 p^3) for the two dimensional case, and O(N^2 p^3) for the three dimensional case. The shared memory version significantly reduces the cost in terms of both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions. © 2014 Elsevier Ltd. All rights reserved.

  4. Computational sensing of herpes simplex virus using a cost-effective on-chip microscope

    KAUST Repository

    Ray, Aniruddha

    2017-07-03

    Caused by the herpes simplex virus (HSV), herpes is a viral infection that is one of the most widespread diseases worldwide. Here we present a computational sensing technique for specific detection of HSV using both viral immuno-specificity and the physical size range of the viruses. This label-free approach involves a compact and cost-effective holographic on-chip microscope and a surface-functionalized glass substrate prepared to specifically capture the target viruses. To enhance the optical signatures of individual viruses and increase their signal-to-noise ratio, self-assembled polyethylene glycol based nanolenses are rapidly formed around each virus particle captured on the substrate using a portable interface. Holographic shadows of specifically captured viruses that are surrounded by these self-assembled nanolenses are then reconstructed, and the phase image is used for automated quantification of the size of each particle within our large field-of-view, ~30 mm². The combination of viral immuno-specificity due to surface functionalization and the physical size measurements enabled by holographic imaging is used to sensitively detect and enumerate HSV particles using our compact and cost-effective platform. This computational sensing technique can find numerous uses in global health related applications in resource-limited environments.

  5. Cost Savings Associated with the Adoption of a Cloud Computing Data Transfer System for Trauma Patients.

    Science.gov (United States)

    Feeney, James M; Montgomery, Stephanie C; Wolf, Laura; Jayaraman, Vijay; Twohig, Michael

    2016-09-01

    Among transferred trauma patients, challenges with the transfer of radiographic studies include problems loading or viewing the studies at the receiving hospitals, and problems manipulating, reconstructing, or evaluating the transferred images. Cloud-based image transfer systems may address some of these problems. We reviewed the charts of patients transferred during one year surrounding the adoption of a cloud computing data transfer system. We compared the rates of repeat imaging before (precloud) and after (postcloud) the adoption of the cloud-based data transfer system. During the precloud period, 28 out of 100 patients required 90 repeat studies. With the cloud computing transfer system in place, three out of 134 patients required seven repeat films. There was a statistically significant decrease in the proportion of patients requiring repeat films (28% to 2.2%, P < .0001). Based on an annualized volume of 200 trauma patient transfers, the cost savings, estimated using three methods of cost analysis, are between $30,272 and $192,453.

  6. Greater accordance with the Dietary Approaches to Stop Hypertension dietary pattern is associated with lower diet-related greenhouse gas production but higher dietary costs in the United Kingdom

    Science.gov (United States)

    Monsivais, Pablo; Scarborough, Peter; Lloyd, Tina; Mizdrak, Anja; Luben, Robert; Mulligan, Angela A; Wareham, Nicholas J; Woodcock, James

    2015-01-01

    Background: The Dietary Approaches to Stop Hypertension (DASH) diet is a proven way to prevent and control hypertension and other chronic disease. Because the DASH diet emphasizes plant-based foods, including vegetables and grains, adhering to this diet might also bring about environmental benefits, including lower associated production of greenhouse gases (GHGs). Objective: The objective was to examine the interrelation between dietary accordance with the DASH diet and associated GHGs. A secondary aim was to examine the retail cost of diets by level of DASH accordance. Design: In this cross-sectional study of adults aged 39–79 y from the European Prospective Investigation into Cancer and Nutrition–Norfolk, United Kingdom cohort (n = 24,293), dietary intakes estimated from food-frequency questionnaires were analyzed for their accordance with the 8 DASH food and nutrient-based targets. Associations between DASH accordance, GHGs, and dietary costs were evaluated in regression analyses. Dietary GHGs were estimated with United Kingdom-specific data on carbon dioxide equivalents associated with commodities and foods. Dietary costs were estimated by using national food prices from a United Kingdom–based supermarket comparison website. Results: Greater accordance with the DASH dietary targets was associated with lower GHGs. Diets in the highest quintile of accordance had a GHG impact of 5.60 compared with 6.71 kg carbon dioxide equivalents/d for least-accordant diets. Greater accordance was also associated with higher dietary costs, with the mean cost of diets in the top quintile of DASH scores 18% higher than that of diets in the lowest quintile (P < 0.0001). Conclusions: Promoting wider uptake of the DASH diet in the United Kingdom may improve population health and reduce diet-related GHGs. However, to make the DASH diet more accessible, food affordability, particularly for lower income groups, will have to be addressed. PMID:25926505

  7. Greater accordance with the Dietary Approaches to Stop Hypertension dietary pattern is associated with lower diet-related greenhouse gas production but higher dietary costs in the United Kingdom.

    Science.gov (United States)

    Monsivais, Pablo; Scarborough, Peter; Lloyd, Tina; Mizdrak, Anja; Luben, Robert; Mulligan, Angela A; Wareham, Nicholas J; Woodcock, James

    2015-07-01

    The Dietary Approaches to Stop Hypertension (DASH) diet is a proven way to prevent and control hypertension and other chronic disease. Because the DASH diet emphasizes plant-based foods, including vegetables and grains, adhering to this diet might also bring about environmental benefits, including lower associated production of greenhouse gases (GHGs). The objective was to examine the interrelation between dietary accordance with the DASH diet and associated GHGs. A secondary aim was to examine the retail cost of diets by level of DASH accordance. In this cross-sectional study of adults aged 39-79 y from the European Prospective Investigation into Cancer and Nutrition-Norfolk, United Kingdom cohort (n = 24,293), dietary intakes estimated from food-frequency questionnaires were analyzed for their accordance with the 8 DASH food and nutrient-based targets. Associations between DASH accordance, GHGs, and dietary costs were evaluated in regression analyses. Dietary GHGs were estimated with United Kingdom-specific data on carbon dioxide equivalents associated with commodities and foods. Dietary costs were estimated by using national food prices from a United Kingdom-based supermarket comparison website. Greater accordance with the DASH dietary targets was associated with lower GHGs. Diets in the highest quintile of accordance had a GHG impact of 5.60 compared with 6.71 kg carbon dioxide equivalents/d for least-accordant diets. Greater accordance was also associated with higher dietary costs, with the mean cost of diets in the top quintile of DASH scores 18% higher than that of diets in the lowest quintile (P < 0.0001). Promoting wider uptake of the DASH diet in the United Kingdom may improve population health and reduce diet-related GHGs. However, to make the DASH diet more accessible, food affordability, particularly for lower income groups, will have to be addressed.

  8. Computational cost of isogeometric multi-frontal solvers on parallel distributed memory machines

    KAUST Repository

    Woźniak, Maciej

    2015-02-01

    This paper derives theoretical estimates of the computational cost for an isogeometric multi-frontal direct solver executed on parallel distributed memory machines. We show theoretically that for C^(p-1) global continuity of the isogeometric solution, both the computational cost and the communication cost of a direct solver are of order O(log(N)p²) for the one-dimensional (1D) case, O(Np²) for the two-dimensional (2D) case, and O(N^(4/3)p²) for the three-dimensional (3D) case, where N is the number of degrees of freedom and p is the polynomial order of the B-spline basis functions. The theoretical estimates are verified by numerical experiments performed with three parallel multi-frontal direct solvers: MUMPS, PaStiX and SuperLU, available through the PETIGA toolkit built on top of PETSc. Numerical results confirm these theoretical estimates both in terms of p and N. For a given problem size, the strong efficiency rapidly decreases as the number of processors increases, becoming about 20% for 256 processors for a 3D example with 128³ unknowns and linear B-splines with C⁰ global continuity, and 15% for a 3D example with 64³ unknowns and quartic B-splines with C³ global continuity. At the same time, one cannot arbitrarily increase the problem size, since the memory required by higher order continuity spaces is large, quickly consuming all the available memory resources even in the parallel distributed memory version. Numerical results also suggest that the use of distributed parallel machines is highly beneficial when solving higher order continuity spaces, although the number of processors that one can efficiently employ is somewhat limited.
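
    To make the reported scalings concrete, the short Python sketch below (our own illustration, not code from the paper; the function name and sample values are assumptions) evaluates the three asymptotic cost estimates and shows how the 3D cost grows when the problem size doubles:

        import math

        # Order-of-magnitude cost estimates from the abstract: O(log(N) p^2) in 1D,
        # O(N p^2) in 2D and O(N^(4/3) p^2) in 3D (constant factors omitted).
        def multifrontal_cost_estimate(N, p, dim):
            if dim == 1:
                return math.log(N) * p**2
            if dim == 2:
                return N * p**2
            if dim == 3:
                return N**(4.0 / 3.0) * p**2
            raise ValueError("dim must be 1, 2 or 3")

        # Doubling N in 3D multiplies the estimated cost by 2^(4/3), about 2.52:
        print(multifrontal_cost_estimate(2 * 64**3, 4, 3)
              / multifrontal_cost_estimate(64**3, 4, 3))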

  9. A novel cost based model for energy consumption in cloud computing.

    Science.gov (United States)

    Horri, A; Dastghaibyfard, Gh

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers need to minimize cloud infrastructure energy consumption while maintaining the required QoS. In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated in different scenarios. The proposed model takes cache interference costs into account; these costs depend on the size of the data. The proposed model was implemented in the CloudSim simulator, and the related simulation results indicate that the energy consumption may be considerable and that it can vary with different parameters such as the quantum parameter, data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment.

  10. Omniscopes: Large area telescope arrays with only NlogN computational cost

    International Nuclear Information System (INIS)

    Tegmark, Max; Zaldarriaga, Matias

    2010-01-01

    We show that the class of antenna layouts for telescope arrays allowing cheap analysis hardware (with correlator cost scaling as N log N rather than N² with the number of antennas N) is encouragingly large, including not only previously discussed rectangular grids but also arbitrary hierarchies of such grids, with arbitrary rotations and shears at each level. We show that all correlations for such a 2D array with an n-level hierarchy can be efficiently computed via a fast Fourier transform in not two but 2n dimensions. This can allow major correlator cost reductions for science applications requiring exquisite sensitivity at widely separated angular scales, for example, 21 cm tomography (where short baselines are needed to probe the cosmological signal and long baselines are needed for point source removal), helping enable future 21 cm experiments with thousands or millions of cheap dipole-like antennas. Such hierarchical grids combine the angular resolution advantage of traditional array layouts with the cost advantage of a rectangular fast Fourier transform telescope. We also describe an algorithm for how a subclass of hierarchical arrays can efficiently use rotation synthesis to produce global sky maps with minimal noise and a well-characterized synthesized beam.
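
    To illustrate why the correlator scaling matters (the figures below are our own back-of-the-envelope numbers, not the paper's), this Python sketch compares the order-of-magnitude operation counts of a pairwise N² correlator and an FFT-based N log N correlator:

        import math

        # Rough operation counts: a conventional correlator scales as N^2
        # (one product per antenna pair); an FFT-based omniscope as N log N.
        def pairwise_ops(n):
            return n**2

        def fft_ops(n):
            return n * math.log2(n)

        for n in (1_000, 1_000_000):
            print(f"N = {n:>9,}: N^2 needs ~{pairwise_ops(n) / fft_ops(n):,.0f}x more ops")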

  11. A Comprehensive and Cost-Effective Computer Infrastructure for K-12 Schools

    Science.gov (United States)

    Warren, G. P.; Seaton, J. M.

    1996-01-01

    Since 1993, NASA Langley Research Center has been developing and implementing a low-cost Internet connection model, including system architecture, training, and support, to provide Internet access for an entire network of computers. This infrastructure allows local area networks which exceed 50 machines per school to independently access the complete functionality of the Internet by connecting to a central site, using state-of-the-art commercial modem technology, through a single standard telephone line. By locating high-cost resources at this central site and sharing these resources and their costs among the school districts throughout a region, a practical, efficient, and affordable infrastructure for providing scalable Internet connectivity has been developed. As the demand for faster Internet access grows, the model has a simple expansion path that eliminates the need to replace major system components and re-train personnel. Observations of typical Internet usage within an environment, particularly school classrooms, have shown that after an initial period of 'surfing,' the Internet traffic becomes repetitive. By automatically storing requested Internet information on a high-capacity networked disk drive at the local site (network-based disk caching), then updating this information only when it changes, well over 80 percent of the Internet traffic that leaves a location can be eliminated by retrieving the information from the local disk cache.
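
    The bandwidth saving claimed above is simple arithmetic; the toy Python sketch below (request counts are assumed, with only the 80 percent figure taken from the text) shows the residual traffic left once the local disk cache absorbs repeated requests:

        # Outbound traffic remaining once repeated requests are served locally.
        def residual_traffic(total_requests, cache_hit_rate):
            return total_requests * (1.0 - cache_hit_rate)

        # With the >80% elimination reported above, only about 1 in 5 requests
        # still needs the single shared telephone line:
        print(residual_traffic(total_requests=10_000, cache_hit_rate=0.8))  # -> 2000.0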

  12. Scilab software as an alternative low-cost computing in solving the linear equations problem

    Science.gov (United States)

    Agus, Fahrul; Haviluddin

    2017-02-01

    Numerical computation packages are widely used both in teaching and research. These packages include licensed (proprietary) and open-source (non-proprietary) software. One reason to use such a package is the complexity of mathematical functions (i.e., linear problems); in addition, the number of variables in linear and non-linear functions has increased. The aim of this paper was to reflect on key aspects related to method, didactics and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative for low-cost computational programming. In this paper, Scilab was used to develop activities related to the mathematical models. In this experiment, four numerical methods, namely Gaussian Elimination, Gauss-Jordan, Inverse Matrix and Lower-Upper (LU) Decomposition, were implemented. The results of this study show that routines for these numerical methods were created and explored using Scilab procedures, and that these routines can be used as teaching material for a course.
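
    Although the paper implements its routines in Scilab, the same exercise can be sketched in Python; the snippet below (our own minimal example, using SciPy rather than Scilab) solves a small simultaneous system with LU decomposition, one of the four methods listed above:

        import numpy as np
        from scipy.linalg import lu_factor, lu_solve

        # 2x + y - z = 8, -3x - y + 2z = -11, -2x + y + 2z = -3
        A = np.array([[ 2.0,  1.0, -1.0],
                      [-3.0, -1.0,  2.0],
                      [-2.0,  1.0,  2.0]])
        b = np.array([8.0, -11.0, -3.0])

        lu, piv = lu_factor(A)        # factor A = P L U
        x = lu_solve((lu, piv), b)    # forward and back substitution
        print(x)                      # -> [ 2.  3. -1.]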

  13. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    Science.gov (United States)

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
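
    A minimal sketch of the kind of computation the article describes might look as follows; all cost categories, staffing levels and dollar figures here are hypothetical placeholders, not values from the article:

        # All cost categories and figures below are invented placeholders.
        def distribution_cost(staff_hours, hourly_rates, space_cost,
                              utilities, public_relations):
            labor = sum(hours * hourly_rates[level]
                        for level, hours in staff_hours.items())
            return labor + space_cost + utilities + public_relations

        total = distribution_cost(
            staff_hours={"librarian": 40, "clerk": 120},
            hourly_rates={"librarian": 28.0, "clerk": 15.0},
            space_cost=300.0, utilities=75.0, public_relations=50.0)
        print(f"${total:,.2f}")  # -> $3,345.00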

  14. Use of several Cloud Computing approaches for climate modelling: performance, costs and opportunities

    Science.gov (United States)

    Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.

    2017-04-01

    Cloud Computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing environments from three different vendors: Amazon, Google and Microsoft. Also, we have evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies and issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN), which uses BOINC, over infrastructure from two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN on Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our simulation experiments consisted of running WACCM on different VMs on the Google Compute Engine (GCE) and comparing it with the supercomputer (SC) Finisterrae1 from the Centro de Supercomputacion de Galicia. It was shown that GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput shows clearly how the SC performance is better beyond approximately 100 cores (related to network speed and latency differences). From a cost point of view, Cloud Computing moves researchers from a traditional approach, where experiments were limited by the available hardware resources, to one limited by monetary resources (how many resources can be afforded). As there is an increasing movement and recommendation for budgeting HPC projects on this technology (budgets can be calculated in a more realistic way) we could see a shift on

  15. User Experience May be Producing Greater Heart Rate Variability than Motor Imagery Related Control Tasks during the User-System Adaptation in Brain-Computer Interfaces

    Science.gov (United States)

    Alonso-Valerdi, Luz M.; Gutiérrez-Begovich, David A.; Argüello-García, Janet; Sepulveda, Francisco; Ramírez-Mendoza, Ricardo A.

    2016-01-01

    Brain-computer interface (BCI) technology is developing fast, but it remains inaccurate, unreliable and slow due to the difficulty of obtaining precise information from the brain. Consequently, the involvement of other biosignals to decode the user control tasks has risen in importance. A traditional way to operate a BCI system is via motor imagery (MI) tasks. As imaginary movements activate similar cortical structures and vegetative mechanisms as a voluntary movement does, heart rate variability (HRV) has been proposed as a parameter to improve the detection of MI-related control tasks. However, HR is very susceptible to body needs and environmental demands, and as BCI systems require high levels of attention, perceptual processing and mental workload, it is important to assess the practical effectiveness of HRV. The present study aimed to determine whether brain and heart electrical signals (HRV) are modulated by the MI activity used to control a BCI system, or whether HRV is modulated by the user perceptions and responses that result from the operation of a BCI system (i.e., user experience). For this purpose, a database of 11 participants who were exposed to eight different situations was used. The sensory-cognitive load (intake and rejection tasks) was controlled in those situations. Two electrophysiological signals were utilized: electroencephalography and electrocardiography. From those biosignals, event-related (de-)synchronization maps and event-related HR changes were respectively estimated. The maps and the HR changes were cross-correlated in order to verify whether both biosignals were modulated by MI activity. The results suggest that HR varies according to the experience undergone by the user in a BCI working environment, and not because of the MI activity used to operate the system. PMID:27458384

  16. Computed tomography findings in young children with minor head injury presenting to the emergency department greater than 24h post injury.

    Science.gov (United States)

    Gelernter, Renana; Weiser, Giora; Kozer, Eran

    2018-01-01

    Large studies which developed decision rules for the use of computed tomography (CT) in children with minor head trauma excluded children with late presentation (more than 24 h). To assess the prevalence of significant traumatic brain injury (TBI) on CT in infants with head trauma presenting to the emergency department (ED) more than 24 h from the injury, a retrospective chart review of infants less than 24 months old referred for head CT because of traumatic brain injury from January 2004 to December 2014 in Assaf-Harofeh medical center was conducted. We used the PECARN definitions of TBI on CT to define significant CT findings. 344 cases were analyzed, 68 with late presentation. There was no significant difference in age between children with late and early presentation (mean 11.4 (SD 5.6) months vs. 10.5 (SD 7.0) months, P=0.27). There was no significant difference between the groups in the incidence of significant TBI (22% vs. 19%, p=0.61). Any TBI on CT (e.g. fracture) was found in 43 (63%) patients with late presentation compared with 116 (42%) patients with early presentation (p=0.002, OR 2.37, 95% CI 1.37-4.1). A similar rate of CT-identified traumatic brain injury was detected in both groups. There was no significant difference in the incidence of significant TBI on CT between the groups. Young children presenting to the ED more than 24 h after the injury may have abnormal findings on CT. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Patient-specific radiation dose and cancer risk in computed tomography examinations in some selected CT facilities in the Greater Accra Region of Ghana

    International Nuclear Information System (INIS)

    Osei, R. K.

    2012-01-01

    The effective dose and cancer risk were determined for patients undergoing seven different types of CT examinations in two CT facilities in the Greater Accra region of Ghana. The two facilities, namely the Diagnostic Centre Ltd and Cocoa Clinic, were chosen because of their significant patient throughput. The effective dose was estimated from patient data, namely age, sex, height and weight, and technique factors, namely scan length, kVp (kilovolts peak), mAs (milliampere-seconds) and CTDIv from the control console of the CT machines. The effective dose was also estimated using the dose-length product (DLP) and k coefficients, which are the anatomic-region-specific conversion factors. The cancer risk for each patient for a particular examination was determined from the effective dose, age and sex of each patient with the help of BEIR VII. In all, a total of 800 adult patients, 400 from each of the two CT facilities, were compiled. From the Diagnostic Centre Ltd, the average effective dose was 5.61 mSv (range 1.41 mSv to 13.34 mSv) with an average BMI of 26.19 kg/m² (range 16.90 kg/m² to 48.28 kg/m²) for all types of examinations. The average cancer risk was 0.0458 Sv⁻¹ for 400 patients (range 0.0001 Sv⁻¹ to 0.3036 Sv⁻¹), compared with a population of 900 patients undergoing CT examination per year. From Cocoa Clinic, the average effective dose was 3.91 mSv (range 0.54 mSv to 27.32 mSv) with an average BMI of 25.59 kg/m² (range 17.18 kg/m² to 35.34 kg/m²), and the average cancer risk was 0.0371 Sv⁻¹ (range 0.0001 Sv⁻¹ to 0.7125 Sv⁻¹). Some of the values were within the range of typical effective doses for CT examinations reported by the ICRP. It was evident from this study that variations in scanning parameters had a significant impact on the effective doses to patients for similar CT examinations between the two facilities. (au)
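
    The DLP-based estimate mentioned above follows the standard relation E = k × DLP. The sketch below is our own illustration of that relation; the k coefficients are commonly cited adult values in mSv/(mGy·cm) and should be treated as assumptions, not the study's own figures:

        # E = k * DLP; k values below are commonly cited adult coefficients
        # and are used here as illustrative assumptions.
        K_COEFFICIENTS = {"head": 0.0021, "chest": 0.014, "abdomen_pelvis": 0.015}

        def effective_dose_mSv(dlp_mGy_cm, region):
            return K_COEFFICIENTS[region] * dlp_mGy_cm

        print(effective_dose_mSv(400.0, "chest"))  # -> 5.6 (mSv)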

  18. Costs and role of ultrasound follow-up of polytrauma patients after initial computed tomography

    International Nuclear Information System (INIS)

    Maurer, M.H.; Winkler, A.; Powerski, M.J.; Elgeti, F.; Huppertz, A.; Roettgen, R.; Marnitz, T.; Wichlas, F.

    2012-01-01

    Purpose: To assess the costs and diagnostic gain of abdominal ultrasound follow-up of polytrauma patients initially examined by whole-body computed tomography (CT). Materials and Methods: A total of 176 patients with suspected multiple trauma (126 men, 50 women; age 43.5 ± 17.4 years) were retrospectively analyzed with regard to supplementary and new findings obtained by ultrasound follow-up compared with the results of exploratory FAST (focused assessment with sonography for trauma) at admission and the findings of whole-body CT. A process model was used to document the staff, materials, and total costs of the ultrasound follow-up examinations. Results: FAST yielded 26 abdominal findings (organ injury and/or free intra-abdominal fluid) in 19 patients, while the abdominal scan of whole-body CT revealed 32 findings in 25 patients. FAST had 81 % sensitivity and 100 % specificity. Follow-up ultrasound examinations revealed new findings in 2 of the 25 patients with abdominal injuries detected with initial CT. In the 151 patients without abdominal injuries in the initial CT scan, ultrasound follow-up did not yield any supplementary or new findings. The total costs of an ultrasound follow-up examination were EUR 28.93. The total costs of all follow-up ultrasound examinations performed in the study population were EUR 5658.23. Conclusion: Follow-up abdominal ultrasound yields only a low overall diagnostic gain in polytrauma patients in whom initial CT fails to detect any abdominal injuries but incurs high personnel expenses for radiological departments. (orig.)

  19. Server Operation and Virtualization to Save Energy and Cost in Future Sustainable Computing

    Directory of Open Access Journals (Sweden)

    Jun-Ho Huh

    2018-06-01

    Full Text Available Since the introduction of the LTE (Long Term Evolution) service, we have lived in a time of expanding amounts of data. The amount of data produced has increased every year, particularly with the growth in smartphone distribution. Telecommunication service providers have to struggle to secure sufficient network capacity in order to maintain quick access to necessary data by consumers. Nonetheless, maintaining the maximum capacity and bandwidth at all times requires considerable cost and excessive equipment. Therefore, to solve such a problem, telecommunication service providers need to maintain an appropriate level of network capacity and to provide sustainable service to customers through quick network expansion in case of shortage. So far, telecommunication service providers have bought and used network equipment produced directly by network equipment manufacturers such as Ericsson, Nokia, Cisco, and Samsung. Since the equipment is specialized for networking and has satisfied consumers with its excellent performance, it is very costly, being developed with advanced technologies. Moreover, the purchase process, wherein the telecommunication service providers place an order and the manufacturer produces and delivers, takes much time. Additionally, there are cases that require signaling and two-way data traffic as well as capacity because of the diversity of IoT devices. For these purposes, the need for NFV (Network Function Virtualization) is raised. Equipment virtualization is performed so that it operates on an x86-based compatible server instead of on the network equipment manufacturer's dedicated hardware. By operating on commodity compatible servers, it can reduce the wastage of hardware and cope with change thanks to rapid hardware development. This study proposed an efficient system for reducing cost in network server operation using such NFV technology and found that the cost was reduced by 24

  20. A precise goniometer/tensiometer using a low cost single-board computer

    Science.gov (United States)

    Favier, Benoit; Chamakos, Nikolaos T.; Papathanasiou, Athanasios G.

    2017-12-01

    Measuring the surface tension and the Young contact angle of a droplet is extremely important for many industrial applications. Here, considering the booming interest in small and cheap but precise experimental instruments, we have constructed a low-cost contact angle goniometer/tensiometer based on a single-board computer (Raspberry Pi). The device runs an axisymmetric drop shape analysis (ADSA) algorithm written in Python. The code, here named DropToolKit, was developed in-house. We initially present the mathematical framework of our algorithm and then we validate our software tool against other well-established ADSA packages, including the commercial ramé-hart DROPimage Advanced as well as the DropAnalysis plugin in ImageJ. After successfully testing for various combinations of liquids and solid surfaces, we concluded that our prototype device would be highly beneficial for industrial applications as well as for scientific research in wetting phenomena compared with the commercial solutions.

  1. Unenhanced computed tomography in acute renal colic reduces cost outside radiology department

    DEFF Research Database (Denmark)

    Lauritsen, J.; Andersen, J.R.; Nordling, J.

    2008-01-01

    BACKGROUND: Unenhanced multidetector computed tomography (UMDCT) is well established as the procedure of choice for radiologic evaluation of patients with renal colic. The procedure has both clinical and financial consequences for departments of surgery and radiology. However, the financial effect...... outside the radiology department is poorly elucidated. PURPOSE: To evaluate the financial consequences outside of the radiology department, a retrospective study comparing the ward occupation of patients examined with UMDCT to that of intravenous urography (IVU) was performed. MATERIAL AND METHODS......) saved the hospital USD 265,000 every 6 months compared to the use of IVU. CONCLUSION: Use of UMDCT compared to IVU in patients with renal colic leads to cost savings outside the radiology department. Publication date: 2008/12...

  2. Low-cost, high-performance and efficiency computational photometer design

    Science.gov (United States)

    Siewert, Sam B.; Shihadeh, Jeries; Myers, Randall; Khandhar, Jay; Ivanov, Vitaly

    2014-05-01

    Researchers at the University of Alaska Anchorage and University of Colorado Boulder have built a low-cost, high-performance and efficiency drop-in-place Computational Photometer (CP) to test in field applications ranging from port security and safety monitoring to environmental compliance monitoring and surveying. The CP integrates off-the-shelf visible spectrum cameras with near to long wavelength infrared detectors and high resolution digital snapshots in a single device. The proof of concept combines three or more detectors into a single multichannel imaging system that can time correlate read-out, capture, and image process all of the channels concurrently with high performance and energy efficiency. The dual-channel continuous read-out is combined with a third high definition digital snapshot capability and has been designed using an FPGA (Field Programmable Gate Array) to capture, decimate, down-convert, re-encode, and transform images from two standard definition CCD (Charge Coupled Device) cameras at 30 Hz. The continuous stereo vision can be time correlated to megapixel high definition snapshots. This proof of concept has been fabricated as a four-layer PCB (Printed Circuit Board) suitable for use in education and research for low cost high efficiency field monitoring applications that need multispectral and three dimensional imaging capabilities. Initial testing is in progress and includes field testing in ports, potential test flights in unmanned aerial systems, and future planned missions to image harsh environments in the arctic including volcanic plumes, ice formation, and arctic marine life.

  3. Computational Sensing Using Low-Cost and Mobile Plasmonic Readers Designed by Machine Learning

    KAUST Repository

    Ballard, Zachary S.

    2017-01-27

    Plasmonic sensors have been used for a wide range of biological and chemical sensing applications. Emerging nanofabrication techniques have enabled these sensors to be cost-effectively mass manufactured onto various types of substrates. To accompany these advances, major improvements in sensor read-out devices must also be achieved to fully realize the broad impact of plasmonic nanosensors. Here, we propose a machine learning framework which can be used to design low-cost and mobile multispectral plasmonic readers that do not use traditionally employed bulky and expensive stabilized light sources or high-resolution spectrometers. By training a feature selection model over a large set of fabricated plasmonic nanosensors, we select the optimal set of illumination light-emitting diodes needed to create a minimum-error refractive index prediction model, which statistically takes into account the varied spectral responses and fabrication-induced variability of a given sensor design. This computational sensing approach was experimentally validated using a modular mobile plasmonic reader. We tested different plasmonic sensors with hexagonal and square periodicity nanohole arrays and revealed that the optimal illumination bands differ from those that are “intuitively” selected based on the spectral features of the sensor, e.g., transmission peaks or valleys. This framework provides a universal tool for the plasmonics community to design low-cost and mobile multispectral readers, helping the translation of nanosensing technologies to various emerging applications such as wearable sensing, personalized medicine, and point-of-care diagnostics. Beyond plasmonics, other types of sensors that operate based on spectral changes can broadly benefit from this approach, including e.g., aptamer-enabled nanoparticle assays and graphene-based sensors, among others.

  4. Performance Assessment of a Custom, Portable, and Low-Cost Brain-Computer Interface Platform.

    Science.gov (United States)

    McCrimmon, Colin M; Fu, Jonathan Lee; Wang, Ming; Lopes, Lucas Silva; Wang, Po T; Karimi-Bidhendi, Alireza; Liu, Charles Y; Heydari, Payam; Nenadic, Zoran; Do, An Hong

    2017-10-01

    Conventional brain-computer interfaces (BCIs) are often expensive, complex to operate, and lack portability, which confines their use to laboratory settings. Portable, inexpensive BCIs can mitigate these problems, but it remains unclear whether their low-cost design compromises their performance. Therefore, we developed a portable, low-cost BCI and compared its performance to that of a conventional BCI. The BCI was assembled by integrating a custom electroencephalogram (EEG) amplifier with an open-source microcontroller and a touchscreen. The function of the amplifier was first validated against a commercial bioamplifier, followed by a head-to-head comparison between the custom BCI (using four EEG channels) and a conventional 32-channel BCI. Specifically, five able-bodied subjects were cued to alternate between hand opening/closing and remaining motionless while the BCI decoded their movement state in real time and provided visual feedback through a light emitting diode. Subjects repeated the above task for a total of 10 trials, and were unaware of which system was being used. The performance in each trial was defined as the temporal correlation between the cues and the decoded states. The EEG data simultaneously acquired with the custom and commercial amplifiers were visually similar and highly correlated ( ρ = 0.79). The decoding performances of the custom and conventional BCIs averaged across trials and subjects were 0.70 ± 0.12 and 0.68 ± 0.10, respectively, and were not significantly different. The performance of our portable, low-cost BCI is comparable to that of the conventional BCIs. Platforms, such as the one developed here, are suitable for BCI applications outside of a laboratory.
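
    The per-trial performance metric described above is a plain temporal correlation; a minimal Python sketch with invented cue and decoder sequences (not data from the study) might look like this:

        import numpy as np

        cues    = np.array([0, 0, 1, 1, 1, 0, 0, 1, 1, 0])  # 0 = rest, 1 = move
        decoded = np.array([0, 0, 1, 1, 0, 0, 0, 1, 1, 0])  # BCI output per epoch

        # Trial performance = temporal correlation between cues and decoded states.
        performance = np.corrcoef(cues, decoded)[0, 1]
        print(f"trial performance: {performance:.2f}")  # -> 0.82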

  5. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Bach, Matthias

    2014-07-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus, despite the complexity of LQCD applications, it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D, the most compute-intensive kernel in LQCD simulations, is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  6. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    International Nuclear Information System (INIS)

    Bach, Matthias

    2014-01-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus, despite the complexity of LQCD applications, it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D, the most compute-intensive kernel in LQCD simulations, is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  7. The Cost-Effectiveness of Dual Mobility Implants for Primary Total Hip Arthroplasty: A Computer-Based Cost-Utility Model.

    Science.gov (United States)

    Barlow, Brian T; McLawhorn, Alexander S; Westrich, Geoffrey H

    2017-05-03

    Dislocation remains a clinically important problem following primary total hip arthroplasty, and it is a common reason for revision total hip arthroplasty. Dual mobility (DM) implants decrease the risk of dislocation but can be more expensive than conventional implants and have idiosyncratic failure mechanisms. The purpose of this study was to investigate the cost-effectiveness of DM implants compared with conventional bearings for primary total hip arthroplasty. Markov model analysis was conducted from the societal perspective with use of direct and indirect costs. Costs, expressed in 2013 U.S. dollars, were derived from the literature, the National Inpatient Sample, and the Centers for Medicare & Medicaid Services. Effectiveness was expressed in quality-adjusted life years (QALYs). The model was populated with health state utilities and state transition probabilities derived from previously published literature. The analysis was performed for a patient's lifetime, and costs and effectiveness were discounted at 3% annually. The principal outcome was the incremental cost-effectiveness ratio (ICER), with a willingness-to-pay threshold of $100,000/QALY. Sensitivity analyses were performed to explore relevant uncertainty. In the base case, DM total hip arthroplasty showed absolute dominance over conventional total hip arthroplasty, with lower accrued costs ($39,008 versus $40,031 U.S. dollars) and higher accrued utility (13.18 versus 13.13 QALYs) indicating cost-savings. DM total hip arthroplasty ceased being cost-saving when its implant costs exceeded those of conventional total hip arthroplasty by $1,023, and the cost-effectiveness threshold for DM implants was $5,287 greater than that for conventional implants. DM was not cost-effective when the annualized incremental probability of revision from any unforeseen failure mechanism or mechanisms exceeded 0.29%. The probability of intraprosthetic dislocation exerted the most influence on model results. This model
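
    The base-case comparison above can be reproduced in a few lines; the sketch below (our own, using the abstract's base-case costs and QALYs and the stated $100,000/QALY threshold) applies the standard dominance/ICER decision rule:

        # Base-case values from the abstract; the decision rule itself is standard.
        def compare(cost_new, cost_old, qaly_new, qaly_old, wtp=100_000.0):
            d_cost, d_eff = cost_new - cost_old, qaly_new - qaly_old
            if d_cost <= 0 and d_eff >= 0:
                return "new strategy dominates (cheaper and more effective)"
            icer = d_cost / d_eff
            verdict = "cost-effective" if icer <= wtp else "not cost-effective"
            return f"ICER = ${icer:,.0f}/QALY ({verdict})"

        print(compare(39_008, 40_031, 13.18, 13.13))  # DM vs conventional THA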

  8. Low Computational-Cost Footprint Deformities Diagnosis Sensor through Angles, Dimensions Analysis and Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    J. Rodolfo Maestre-Rendon

    2017-11-01

    Full Text Available Manual measurements of foot anthropometry can lead to errors since this task involves the experience of the specialist who performs them, resulting in different subjective measures from the same footprint. Moreover, some of the diagnoses that are given to classify a footprint deformity are based on a qualitative interpretation by the physician; there is no quantitative interpretation of the footprint. The importance of providing a correct and accurate diagnosis lies in the need to ensure that an appropriate treatment is provided for the improvement of the patient without risking his or her health. Therefore, this article presents a smart sensor that integrates the capture of the footprint, a low computational-cost analysis of the image and the interpretation of the results through a quantitative evaluation. The smart sensor implemented required the use of a camera (Logitech C920) connected to a Raspberry Pi 3, where a graphical interface was made for the capture and processing of the image, and it was adapted to a podoscope conventionally used by specialists such as orthopedists, physiotherapists and podiatrists. The footprint diagnosis smart sensor (FPDSS) has proven to be robust to different types of deformity, precise, sensitive and correlated at 0.99 with measurements from the digitalized image of the ink mat.

  9. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    Science.gov (United States)

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…

  10. Greater autonomy at work

    NARCIS (Netherlands)

    Houtman, I.L.D.

    2004-01-01

    In the past 10 years, workers in the Netherlands increasingly report more decision-making power in their work. This is important for an economy in recession and where workers face greater work demands. It makes work more interesting, creates a healthier work environment, and provides opportunities

  11. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan

    2009-01-01

    This paper reports the latest results of the comprehensive program of experimental and computational analysis of strength and reliability of wooden parts of low cost wind turbines. The possibilities of prediction of strength and reliability of different types of wood are studied in the series of experiments and computational investigations. Low cost testing machines have been designed, and employed for the systematic analysis of different sorts of Nepali wood, to be used for the wind turbine construction. At the same time, computational micromechanical models of deformation and strength of wood are developed, which should provide the basis for microstructure-based correlating of observable and service properties of wood. Some correlations between microstructure, strength and service properties of wood have been established.

  12. Cost-effectiveness modeling of colorectal cancer: Computed tomography colonography vs colonoscopy or fecal occult blood tests

    International Nuclear Information System (INIS)

    Lucidarme, Olivier; Cadi, Mehdi; Berger, Genevieve; Taieb, Julien; Poynard, Thierry; Grenier, Philippe; Beresniak, Ariel

    2012-01-01

    Objectives: To assess the cost-effectiveness of three colorectal-cancer (CRC) screening strategies in France: fecal-occult-blood tests (FOBT), computed-tomography-colonography (CTC) and optical-colonoscopy (OC). Methods: Ten-year simulation modeling was used to assess a virtual asymptomatic, average-risk population 50–74 years old. Negative OC was repeated 10 years later, and OC positive for advanced or non-advanced adenoma 3 or 5 years later, respectively. FOBT was repeated biennially. Negative CTC was repeated 5 years later. Positive CTC and FOBT led to triennial OC. Total cost and CRC rate after 10 years for each screening strategy and 0–100% adherence rates with 10% increments were computed. Transition probabilities were programmed using distribution ranges to account for uncertainty parameters. Direct medical costs were estimated using the French national health insurance prices. Probabilistic sensitivity analyses used 5000 Monte Carlo simulations generating model outcomes and standard deviations. Results: For a given adherence rate, CTC screening was always the most effective but not the most cost-effective. FOBT was the least effective but most cost-effective strategy. OC was of intermediate efficacy and the least cost-effective strategy. Without screening, treatment of 123 CRC per 10,000 individuals would cost €3,444,000. For 60% adherence, the respective costs of preventing and treating, respectively 49 and 74 FOBT-detected, 73 and 50 CTC-detected and 63 and 60 OC-detected CRC would be €2,810,000, €6,450,000 and €9,340,000. Conclusion: Simulation modeling helped to identify what would be the most effective (CTC) and cost-effective screening (FOBT) strategy in the setting of mass CRC screening in France.

  13. Computational Stimulation of the Basal Ganglia Neurons with Cost Effective Delayed Gaussian Waveforms.

    Science.gov (United States)

    Daneshzand, Mohammad; Faezipour, Miad; Barkana, Buket D

    2017-01-01

    Deep brain stimulation (DBS) has compelling results in the desynchronization of the basal ganglia neuronal activities and thus, is used in treating the motor symptoms of Parkinson's disease (PD). Accurate definition of DBS waveform parameters could avert tissue or electrode damage, increase the neuronal activity and reduce energy cost which will prolong the battery life, hence avoiding device replacement surgeries. This study considers the use of a charge balanced Gaussian waveform pattern as a method to disrupt the firing patterns of neuronal cell activity. A computational model was created to simulate ganglia cells and their interactions with thalamic neurons. From the model, we investigated the effects of modified DBS pulse shapes and proposed a delay period between the cathodic and anodic parts of the charge balanced Gaussian waveform to desynchronize the firing patterns of the GPe and GPi cells. The results of the proposed Gaussian waveform with delay outperformed that of rectangular DBS waveforms used in in-vivo experiments. The Gaussian Delay Gaussian (GDG) waveforms achieved lower number of misses in eliciting action potential while having a lower amplitude and shorter length of delay compared to numerous different pulse shapes. The amount of energy consumed in the basal ganglia network due to GDG waveforms was dropped by 22% in comparison with charge balanced Gaussian waveforms without any delay between the cathodic and anodic parts and was also 60% lower than a rectangular charged balanced pulse with a delay between the cathodic and anodic parts of the waveform. Furthermore, by defining a Synchronization Level metric, we observed that the GDG waveform was able to reduce the synchronization of GPi neurons more effectively than any other waveform. The promising results of GDG waveforms in terms of eliciting action potential, desynchronization of the basal ganglia neurons and reduction of energy consumption can potentially enhance the performance of DBS

  14. Computational Stimulation of the Basal Ganglia Neurons with Cost Effective Delayed Gaussian Waveforms

    Directory of Open Access Journals (Sweden)

    Mohammad Daneshzand

    2017-08-01

    Full Text Available Deep brain stimulation (DBS) has compelling results in the desynchronization of the basal ganglia neuronal activities and thus, is used in treating the motor symptoms of Parkinson's disease (PD). Accurate definition of DBS waveform parameters could avert tissue or electrode damage, increase the neuronal activity and reduce energy cost which will prolong the battery life, hence avoiding device replacement surgeries. This study considers the use of a charge balanced Gaussian waveform pattern as a method to disrupt the firing patterns of neuronal cell activity. A computational model was created to simulate ganglia cells and their interactions with thalamic neurons. From the model, we investigated the effects of modified DBS pulse shapes and proposed a delay period between the cathodic and anodic parts of the charge balanced Gaussian waveform to desynchronize the firing patterns of the GPe and GPi cells. The results of the proposed Gaussian waveform with delay outperformed that of rectangular DBS waveforms used in in-vivo experiments. The Gaussian Delay Gaussian (GDG) waveforms achieved lower number of misses in eliciting action potential while having a lower amplitude and shorter length of delay compared to numerous different pulse shapes. The amount of energy consumed in the basal ganglia network due to GDG waveforms was dropped by 22% in comparison with charge balanced Gaussian waveforms without any delay between the cathodic and anodic parts and was also 60% lower than a rectangular charged balanced pulse with a delay between the cathodic and anodic parts of the waveform. Furthermore, by defining a Synchronization Level metric, we observed that the GDG waveform was able to reduce the synchronization of GPi neurons more effectively than any other waveform. The promising results of GDG waveforms in terms of eliciting action potential, desynchronization of the basal ganglia neurons and reduction of energy consumption can potentially enhance the
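
    To illustrate the waveform itself, the Python sketch below generates a charge-balanced Gaussian pulse with a delay inserted between the cathodic and anodic phases; all amplitudes and timings are invented for illustration and are not the paper's optimized parameters:

        import numpy as np

        def gdg_pulse(amplitude=1.0, sigma_ms=0.05, delay_ms=0.2, dt_ms=0.01, pad_ms=0.3):
            t = np.arange(-pad_ms, pad_ms + dt_ms, dt_ms)
            gauss = np.exp(-t**2 / (2.0 * sigma_ms**2))
            cathodic = -amplitude * gauss                  # negative phase first
            gap = np.zeros(int(round(delay_ms / dt_ms)))   # inter-phase delay
            anodic = amplitude * gauss                     # equal-area positive phase
            return np.concatenate([cathodic, gap, anodic])

        pulse = gdg_pulse()
        print(f"net charge ~ {pulse.sum():.1e}")  # ~0: charge balanced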

  15. Effects on costs of frontline diagnostic evaluation in patients suspected of angina: coronary computed tomography angiography vs. conventional ischaemia testing

    DEFF Research Database (Denmark)

    Nielsen, Lene H; Olsen, Jens; Markenvard, John

    2013-01-01

    AIMS: The aim of this study was to investigate in patients with stable angina the effects on costs of frontline diagnostics by exercise-stress testing (ex-test) vs. coronary computed tomography angiography (CTA). METHODS AND RESULTS: In two coronary units at Lillebaelt Hospital, Denmark, 498 patients were identified in whom either ex-test (n = 247) or CTA (n = 251) were applied as the frontline diagnostic strategy in symptomatic patients with a low-intermediate pre-test probability of coronary artery disease (CAD). During 12 months of follow-up, death, myocardial infarction and costs...... group. The mean (SD) total costs per patient at the end of the follow-up were 14% lower in the CTA group than in the ex-test group, €1510 (3474) vs. €1777 (3746) (P = 0.03). CONCLUSION: Diagnostic assessment of symptomatic patients with a low-intermediate probability of CAD by CTA incurred lower costs......

  16. Incremental cost of department-wide implementation of a picture archiving and communication system and computed radiography.

    Science.gov (United States)

    Pratt, H M; Langlotz, C P; Feingold, E R; Schwartz, J S; Kundel, H L

    1998-01-01

    To determine the incremental cash flows associated with department-wide implementation of a picture archiving and communication system (PACS) and computed radiography (CR) at a large academic medical center. The authors determined all capital and operational costs associated with PACS implementation during an 8-year time horizon. Economic effects were identified, adjusted for time value, and used to calculate net present values (NPVs) for each section of the department of radiology and for the department as a whole. The chest-bone section used the most resources. Changes in cost assumptions for the chest-bone section had a dominant effect on the department-wide NPV. The base-case NPV (i.e., that determined by using the initial assumptions) was negative, indicating that additional net costs are incurred by the radiology department from PACS implementation. PACS and CR provide cost savings only when a 12-year hardware life span is assumed, when CR equipment is removed from the analysis, or when digitized long-term archives are compressed at a rate of 10:1. Full PACS-CR implementation would not provide cost savings for a large, subspecialized department. However, institutions that are committed to CR implementation (for whom CR implementation would represent a sunk cost) or institutions that are able to archive images by using image compression will experience cost savings from PACS.
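
    The NPV calculation underlying the analysis is standard discounted cash flow; a minimal sketch follows, with the cash flows and discount rate invented and only the 8-year horizon taken from the study:

        # Discounted cash-flow sketch; year-0 capital outlay, then yearly savings.
        def npv(rate, cash_flows):
            """cash_flows[0] is year 0; later entries are discounted."""
            return sum(cf / (1.0 + rate)**year for year, cf in enumerate(cash_flows))

        flows = [-500_000.0] + [70_000.0] * 8    # 8-year horizon, invented figures
        print(f"NPV = {npv(0.05, flows):,.0f}")  # negative -> net cost, as in the base case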

  17. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.

  18. Greater-confinement disposal

    International Nuclear Information System (INIS)

    Trevorrow, L.E.; Schubert, J.P.

    1989-01-01

    Greater-confinement disposal (GCD) is a general term for low-level waste (LLW) disposal technologies that employ natural and/or engineered barriers and provide a degree of confinement greater than that of shallow-land burial (SLB) but possibly less than that of a geologic repository. Thus GCD is associated with lower risk/hazard ratios than SLB. Although any number of disposal technologies might satisfy the definition of GCD, eight have been selected for consideration in this discussion. These technologies include: (1) earth-covered tumuli, (2) concrete structures, both above and below grade, (3) deep trenches, (4) augered shafts, (5) rock cavities, (6) abandoned mines, (7) high-integrity containers, and (8) hydrofracture. Each of these technologies employs several operations that are mature; however, some are at more advanced stages of development and demonstration than others. Each is defined and further described by information on design, advantages and disadvantages, special equipment requirements, and characteristic operations such as construction, waste emplacement, and closure

  19. The application of cloud computing to scientific workflows: a study of cost and performance.

    Science.gov (United States)

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.

  20. Thoracoabdominal computed tomography in trauma patients: a cost-consequences analysis

    NARCIS (Netherlands)

    Vugt, R. van; Kool, D.R.; Brink, M.; Dekker, H.M.; Deunk, J.; Edwards, M.J.R.

    2014-01-01

    BACKGROUND: CT is increasingly used during the initial evaluation of blunt trauma patients. In this era of increasing cost-awareness, the pros and cons of CT have to be assessed. OBJECTIVES: This study was performed to evaluate cost-consequences of different diagnostic algorithms that use

  1. Computer software to estimate timber harvesting system production, cost, and revenue

    Science.gov (United States)

    Dr. John E. Baumgras; Dr. Chris B. LeDoux

    1992-01-01

    Large variations in timber harvesting cost and revenue can result from the differences between harvesting systems, the variable attributes of harvesting sites and timber stands, or changing product markets. Consequently, system and site specific estimates of production rates and costs are required to improve estimates of harvesting revenue. This paper describes...

  2. Computation of piecewise affine terminal cost functions for model predictive control

    NARCIS (Netherlands)

    Brunner, F.D.; Lazar, M.; Allgöwer, F.; Fränzle, Martin; Lygeros, John

    2014-01-01

    This paper proposes a method for the construction of piecewise affine terminal cost functions for model predictive control (MPC). The terminal cost function is constructed on a predefined partition by solving a linear program for a given piecewise affine system, a stabilizing piecewise affine

  3. More features, greater connectivity.

    Science.gov (United States)

    Hunt, Sarah

    2015-09-01

    Changes in our political infrastructure, the continuing frailties of our economy, and a stark growth in population, have greatly impacted upon the perceived stability of the NHS. Healthcare teams have had to adapt to these changes, and so too have the technologies upon which they rely to deliver first-class patient care. Here Sarah Hunt, marketing co-ordinator at Aid Call, assesses how the changing healthcare environment has affected one of its fundamental technologies, the nurse call system, argues the case for such wireless systems in terms of what the company claims is greater adaptability to changing needs, and considers the ever-wider range of features and functions available from today's nurse call equipment, particularly via connectivity with both mobile devices, and ancillaries ranging from enuresis sensors to staff attack alert 'badges'.

  4. Cost-effectiveness of computed tomography colonography in colorectal cancer screening: a systematic review.

    Science.gov (United States)

    Hanly, Paul; Skally, Mairead; Fenlon, Helen; Sharp, Linda

    2012-10-01

    The European Code Against Cancer recommends individuals aged ≥ 50 should participate in colorectal cancer screening. CT-colonography (CTC) is one of several screening tests available. We systematically reviewed evidence on, and identified key factors influencing, cost-effectiveness of CTC screening. PubMed, Medline, and the Cochrane library were searched for cost-effectiveness or cost-utility analyses of CTC-based screening, published in English, January 1999 to July 2010. Data was abstracted on setting, model type and horizon, screening scenario(s), comparator(s), participants, uptake, CTC performance and cost, effectiveness, ICERs, and whether extra-colonic findings and medical complications were considered. Sixteen studies were identified from the United States (n = 11), Canada (n = 2), and France, Italy, and the United Kingdom (1 each). Markov state-transition (n = 14) or microsimulation (n = 2) models were used. Eleven considered direct medical costs only; five included indirect costs. Fourteen compared CTC with no screening; fourteen compared CTC with colonoscopy-based screening; fewer compared CTC with sigmoidoscopy (8) or fecal tests (4). Outcomes assessed were life-years gained/saved (13), QALYs (2), or both (1). Three considered extra-colonic findings; seven considered complications. CTC appeared cost-effective versus no screening and, in general, flexible sigmoidoscopy and fecal occult blood testing. Results were mixed comparing CTC to colonoscopy. Parameters most influencing cost-effectiveness included: CTC costs, screening uptake, threshold for polyp referral, and extra-colonic findings. Evidence on cost-effectiveness of CTC screening is heterogeneous, due largely to between-study differences in comparators and parameter values. Future studies should: compare CTC with currently favored tests, especially fecal immunochemical tests; consider extra-colonic findings; and conduct comprehensive sensitivity analyses.

  5. A nearly-linear computational-cost scheme for the forward dynamics of an N-body pendulum

    Science.gov (United States)

    Chou, Jack C. K.

    1989-01-01

    The dynamic equations of motion of an n-body pendulum with spherical joints are derived as a mixed system of differential and algebraic equations (DAEs). The DAEs are kept in implicit form to save arithmetic and preserve the sparsity of the system, and are solved by a robust implicit integration method. At each solution point, the predicted solution is corrected to its exact solution within a given tolerance using Newton's iterative method. For each iteration, a linear system of the form J delta X = E has to be solved. The computational cost of solving this linear system directly by LU factorization is O(n^3), but it can be reduced significantly by exploiting the structure of J. It is shown that by recognizing the recursive patterns and exploiting the sparsity of the system, the multiplicative and additive computational costs for solving J delta X = E are O(n) and O(n^2), respectively. The formulation and solution method for an n-body pendulum is presented. The computational cost is shown to be nearly linearly proportional to the number of bodies.
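
    The cost saving comes from never forming or factoring the dense Jacobian. The sketch below illustrates the idea on a generic banded nonlinear system; the residual F and its tridiagonal Jacobian are invented stand-ins for the pendulum DAEs, solved in O(n) per Newton step with SciPy's banded solver.

```python
import numpy as np
from scipy.linalg import solve_banded

def newton_banded(F, J_banded, x0, tol=1e-10, max_iter=20):
    """Newton's method where J dx = -F(x) is solved in O(n) per
    iteration because the Jacobian is banded (here: tridiagonal)."""
    x = x0.copy()
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r, np.inf) < tol:
            break
        # Band storage ab[u + i - j, j] = J[i, j]; solve_banded costs
        # O(n) for a fixed bandwidth, not O(n^3) like dense LU.
        dx = solve_banded((1, 1), J_banded(x), -r)
        x += dx
    return x

# Hypothetical stand-in system: a discretized nonlinear chain.
n = 1000
def F(x):
    r = 2.0 * x + np.sin(x)
    r[1:] -= x[:-1]
    r[:-1] -= x[1:]
    return r - 1.0

def J_banded(x):
    ab = np.zeros((3, n))
    ab[0, 1:] = -1.0             # superdiagonal
    ab[1, :] = 2.0 + np.cos(x)   # main diagonal
    ab[2, :-1] = -1.0            # subdiagonal
    return ab

x = newton_banded(F, J_banded, np.zeros(n))
```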

  6. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    International Nuclear Information System (INIS)

    Capone, V; Esposito, R; Pardi, S; Taurino, F; Tortone, G

    2012-01-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of 2 main subsystems: a clustered storage solution, built on top of disk servers running GlusterFS file system, and a virtual machines execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, providing this way live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.
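
    The "automated restart in case of hypervisor failures" the authors mention can be approximated with the libvirt Python bindings. The sketch below is a hypothetical stand-in, not the INFN scripts: the host URIs are invented, and only the failure-detection half of the loop is shown.

```python
import libvirt  # libvirt-python bindings

# Hypothetical hypervisor pool; the INFN management scripts themselves
# are not published, so this only illustrates the failure check.
HYPERVISORS = ["qemu+ssh://hv01.example/system",
               "qemu+ssh://hv02.example/system"]

def active_domains(uri):
    """Names of running VMs on one hypervisor, or None if the host
    is unreachable (treated here as a hypervisor failure)."""
    try:
        conn = libvirt.open(uri)
    except libvirt.libvirtError:
        return None
    try:
        return [dom.name() for dom in conn.listAllDomains() if dom.isActive()]
    finally:
        conn.close()

for uri in HYPERVISORS:
    vms = active_domains(uri)
    if vms is None:
        print(f"{uri} down -- its VMs should be restarted elsewhere")
    else:
        print(f"{uri}: {len(vms)} running domains")
```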

  7. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    Science.gov (United States)

    Capone, V.; Esposito, R.; Pardi, S.; Taurino, F.; Tortone, G.

    2012-12-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of 2 main subsystems: a clustered storage solution, built on top of disk servers running GlusterFS file system, and a virtual machines execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, providing this way live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.

  8. A computational model for determining the minimal cost expansion alternatives in transmission systems planning

    International Nuclear Information System (INIS)

    Pinto, L.M.V.G.; Pereira, M.V.F.; Nunes, A.

    1989-01-01

    A computational model for determining an economical transmission expansion plan, based on decomposition techniques, is presented. The algorithm was applied to the Brazilian South System and was able to find an optimal solution with modest computational resources. Two extensions of this methodology are being investigated: a probabilistic version and expansion planning under financial constraints. (C.G.C.). 4 refs, 7 figs
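
    At its core, the expansion problem chooses new transmission capacity at minimum build cost subject to serving demand. The toy linear relaxation below (solved with SciPy's linprog) uses invented corridor data and a single-bus balance; the paper's decomposition over a full network is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

# Toy expansion problem: choose capacity f_i on three candidate
# corridors to serve 500 MW of demand at minimum build cost.
build_cost = np.array([120.0, 95.0, 150.0])   # $/MW per corridor (invented)
max_cap    = np.array([300.0, 250.0, 400.0])  # MW buildable (invented)
demand     = 500.0

res = linprog(
    c=build_cost,
    A_ub=-np.ones((1, 3)), b_ub=[-demand],    # f1 + f2 + f3 >= demand
    bounds=list(zip([0.0] * 3, max_cap)),
    method="highs",
)
print(res.x, res.fun)  # e.g. [250., 250., 0.] -- cheapest corridors first
```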

  9. Paper Circuits: A Tangible, Low Threshold, Low Cost Entry to Computational Thinking

    Science.gov (United States)

    Lee, Victor R.; Recker, Mimi

    2018-01-01

    In this paper, we propose that paper circuitry provides a productive space for exploring aspects of computational thinking, an increasingly critical 21st-century skill for all students. We argue that the creation and operation of paper circuits involve learning about computational concepts such as rule-based constraints, operations, and defined…

  10. Computed tomographic colonography to screen for colorectal cancer, extracolonic cancer, and aortic aneurysm: model simulation with cost-effectiveness analysis.

    Science.gov (United States)

    Hassan, Cesare; Pickhardt, Perry J; Laghi, Andrea; Kim, Daniel H; Zullo, Angelo; Iafrate, Franco; Di Giulio, Lorenzo; Morini, Sergio

    2008-04-14

    In addition to detecting colorectal neoplasia, abdominal computed tomography (CT) with colonography technique (CTC) can also detect unsuspected extracolonic cancers and abdominal aortic aneurysms (AAA). The efficacy and cost-effectiveness of this combined abdominal CT screening strategy are unknown. A computerized Markov model was constructed to simulate the occurrence of colorectal neoplasia, extracolonic malignant neoplasm, and AAA in a hypothetical cohort of 100,000 subjects from the United States who were 50 years of age. Simulated screening with CTC, using a 6-mm polyp size threshold for reporting, was compared with a competing model of optical colonoscopy (OC), both without and with abdominal ultrasonography for AAA detection (OC-US strategy). In the simulated population, CTC was the dominant screening strategy, gaining an additional 1458 and 462 life-years compared with the OC and OC-US strategies and being less costly, with a savings of $266 and $449 per person, respectively. The additional gains for CTC were largely due to a decrease in AAA-related deaths, whereas the modeled benefit from extracolonic cancer downstaging was a relatively minor factor. At sensitivity analysis, OC-US became more cost-effective only when the CTC sensitivity for large polyps dropped to 61% or when broad variations of costs were simulated, such as an increase in CTC cost from $814 to $1300 or a decrease in OC cost from $1100 to $500. With the OC-US approach, suboptimal compliance had a strong negative influence on efficacy and cost-effectiveness. The estimated mortality from CT-induced cancer was less than estimated colonoscopy-related mortality (8 vs 22 deaths), both of which were minor compared with the positive benefit from screening. When detection of extracolonic findings such as AAA and extracolonic cancer are considered in addition to colorectal neoplasia in our model simulation, CT colonography is a dominant screening strategy (ie, more clinically effective and more cost
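
    A Markov state-transition model of this kind advances a state-occupancy vector through a transition matrix each cycle, accumulating discounted costs and life-years. The sketch below shows the mechanics only; the states, transition probabilities and costs are invented placeholders, not the study's calibrated values.

```python
import numpy as np

# States: 0 well, 1 colorectal neoplasia, 2 cancer, 3 dead.
# Annual transition matrix (rows sum to 1; values are illustrative).
P = np.array([
    [0.97, 0.02, 0.00, 0.01],
    [0.00, 0.94, 0.04, 0.02],
    [0.00, 0.00, 0.85, 0.15],
    [0.00, 0.00, 0.00, 1.00],
])
state_cost = np.array([100.0, 800.0, 25_000.0, 0.0])  # $/year, invented

cohort = np.array([100_000.0, 0.0, 0.0, 0.0])  # everyone starts well
disc, total_cost, life_years = 0.03, 0.0, 0.0
for year in range(50):
    d = 1.0 / (1.0 + disc) ** year
    total_cost += d * cohort @ state_cost      # discounted cycle cost
    life_years += d * cohort[:3].sum()         # discounted person-years alive
    cohort = cohort @ P                        # advance the cohort one cycle

print(total_cost / 100_000, life_years / 100_000)  # per-person averages
```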

  11. Near DC eddy current measurement of aluminum multilayers using MR sensors and commodity low-cost computer technology

    Science.gov (United States)

    Perry, Alexander R.

    2002-06-01

    Low Frequency Eddy Current (EC) probes are capable of measurement from 5 MHz down to DC through the use of Magnetoresistive (MR) sensors. Choosing components with appropriate electrical specifications allows them to be matched to the power and impedance characteristics of standard computer connectors. This permits direct attachment of the probe to inexpensive computers, thereby eliminating external power supplies, amplifiers and modulators that have heretofore precluded very low system purchase prices. Such price reduction is key to increased market penetration in General Aviation maintenance and consequent reduction in recurring costs. This paper examines our computer software CANDETECT, which implements this approach and permits effective probe operation. Results are presented to show the intrinsic sensitivity of the software and demonstrate its practical performance when seeking cracks in the underside of a thick aluminum multilayer structure. The majority of the General Aviation light aircraft fleet uses rivets and screws to attach sheet aluminum skin to the airframe, resulting in similar multilayer lap joints.

  12. Minimizing total costs of forest roads with computer-aided design ...

    Indian Academy of Sciences (India)

    …minimum total road costs, while conforming to design specifications, environmental quality requirements, and enhancing fish and wildlife habitat … Soil, Water and Timber Management: Forest Engineering Solutions in Response to …

  13. A probabilistic approach to the computation of the levelized cost of electricity

    International Nuclear Information System (INIS)

    Geissmann, Thomas

    2017-01-01

    This paper sets forth a novel approach to calculate the levelized cost of electricity (LCOE) using a probabilistic model that accounts for endogenous input parameters. The approach is applied to the example of a nuclear and gas power project. Monte Carlo simulation results show that a correlation between input parameters has a significant effect on the model outcome. By controlling for endogeneity, a statistically significant difference in the mean LCOE estimate and a change in the order of input leverages is observed. Moreover, the paper discusses the role of discounting options and external costs in detail. In contrast to the gas power project, the economic viability of the nuclear project is considerably weaker. - Highlights: • First model of levelized cost of electricity accounting for uncertainty and endogeneities in input parameters. • Allowance for endogeneities significantly affects results. • Role of discounting options and external costs is discussed and modelled.
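
    The LCOE is the cost per unit of energy that equates discounted lifetime costs with discounted lifetime generation; a probabilistic estimate simply repeats that calculation over sampled, possibly correlated, inputs. A minimal sketch with invented distributions follows; the paper's treatment of endogenous parameters is considerably richer.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims, years, r = 10_000, 40, 0.05
disc = 1.0 / (1.0 + r) ** np.arange(1, years + 1)

# Correlated capex and O&M draws (illustrative, not the paper's data).
mean = [4000.0, 80.0]                       # $/kW capex, $/kW-yr O&M
cov = [[400.0**2, 0.6 * 400 * 10],
       [0.6 * 400 * 10, 10.0**2]]
capex, om = rng.multivariate_normal(mean, cov, size=n_sims).T

mwh_per_kw = 8.76 * 0.85                    # assumed 85% capacity factor
# LCOE = (capex + discounted O&M) / discounted lifetime generation
lcoe = (capex + om * disc.sum()) / (mwh_per_kw * disc.sum())
print(lcoe.mean(), np.percentile(lcoe, [5, 95]))   # $/MWh
```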

  14. Model implementation for dynamic computation of system cost for advanced life support

    Science.gov (United States)

    Levri, J. A.; Vaccari, D. A.

    2004-01-01

    Life support system designs for long-duration space missions have a multitude of requirements drivers, such as mission objectives, political considerations, cost, crew wellness, inherent mission attributes, as well as many other influences. Evaluation of requirements satisfaction can be difficult, particularly at an early stage of mission design. Because launch cost is a critical factor and relatively easy to quantify, it is a point of focus in early mission design. The method used to determine launch cost influences the accuracy of the estimate. This paper discusses the appropriateness of dynamic mission simulation in estimating the launch cost of a life support system. This paper also provides an abbreviated example of a dynamic simulation life support model and possible ways in which such a model might be utilized for design improvement. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  15. The greater picture

    CERN Multimedia

    Staff Association

    2013-01-01

    Strike of the European civil servants. Two representatives of the CERN Staff Association, who attended a conference of Staff Committees of European Agencies in Brussels, joined the strike of the European civil servants on 8 November. Indeed, more than 500 people demonstrated in front of the Commission against the attacks on their working conditions, and our representatives joined them in solidarity. The governments of the richest countries want large cuts in the European Union budget, especially in administration costs, of up to 15 billion; so far the European Parliament has not followed suit. This could result in pension fund reforms, moving from a 1/3-2/3 split of contributions to 1/2-1/2, which is unacceptable, especially for low salaries. In addition, staff reductions of 5%, or even 15%, are being seriously considered. The year 2004 already brought a deterioration of working conditions at the European Union; will 2013 do even more damage? The AASC (Assembly of Agency Staff Committe...

  16. Computation of spot prices and congestion costs in large interconnected power systems

    International Nuclear Information System (INIS)

    Mukerji, R.; Jordan, G.A.; Clayton, R.; Haringa, G.E.

    1995-01-01

    Foremost among the new paradigms for the US utility industry is the 'poolco' concept proposed by Prof. William W. Hogan of Harvard University. This concept uses a central pool or power exchange in which physical power is traded based on spot prices or market clearing prices. The rapid and accurate calculation of these 'spot' prices and associated congestion costs for large interconnected power systems is the central tenet upon which the poolco concept is based. The market clearing price would be the same throughout the system if there were no system losses and transmission limitations did not exist. System losses cause small differences in market clearing prices as the cost of supplying a MW at various load buses includes the cost of losses. Transmission limits may cause large differences in market clearing prices between regions as low cost generation is blocked by the transmission constraints from serving certain loads. In models currently in use in the electric power industry, spot price calculations range from 'bubble diagram' contract-path models to full electrical representations such as GE-MAPS. The modeling aspects of the full electrical representation are included in the Appendix. The problem with the bubble diagram representation is that these models are liable to produce unacceptably large errors in the calculation of spot prices and congestion costs. The subtleties of the calculation of spot prices and congestion costs are illustrated in this paper
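
    The price separation described here can be reproduced on a toy two-bus system: minimize generation cost subject to a power balance at each bus and a flow limit on the tie line, then read bus prices off the duals of the balance constraints. All network data below are invented; the example assumes SciPy >= 1.7, whose HiGHS backend exposes equality-constraint duals.

```python
import numpy as np
from scipy.optimize import linprog

# Variables: [g1, g2, flow] -- cheap generator at bus 1, expensive at
# bus 2, tie line shipping power from bus 1 to bus 2 (80 MW limit).
cost = np.array([20.0, 50.0, 0.0])           # $/MWh (invented)
load = np.array([50.0, 150.0])               # MW at bus 1 and bus 2

# Nodal balance: g1 - flow = load1 ;  g2 + flow = load2
A_eq = np.array([[1.0, 0.0, -1.0],
                 [0.0, 1.0,  1.0]])
res = linprog(cost, A_eq=A_eq, b_eq=load,
              bounds=[(0, 200), (0, 200), (-80, 80)],
              method="highs")

print(res.x)                # [130., 70., 80.] -- the line hits its limit
# Duals of the nodal balances act as locational prices; roughly
# [20., 50.] under SciPy's documented df/db_eq sign convention.
print(res.eqlin.marginals)
```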

  17. Planning for greater confinement disposal

    International Nuclear Information System (INIS)

    Gilbert, T.L.; Luner, C.; Meshkov, N.K.; Trevorrow, L.E.; Yu, C.

    1985-01-01

    A report that provides guidance for planning for greater-confinement disposal (GCD) of low-level radioactive waste is being prepared. The report addresses procedures for selecting a GCD technology and provides information for implementing these procedures. The focus is on GCD; planning aspects common to GCD and shallow-land burial are covered by reference. Planning procedure topics covered include regulatory requirements, waste characterization, benefit-cost-risk assessment and pathway analysis methodologies, determination of need, waste-acceptance criteria, performance objectives, and comparative assessment of attributes that support these objectives. The major technologies covered include augered shafts, deep trenches, engineered structures, hydrofracture, improved waste forms, and high-integrity containers. Descriptive information is provided, and attributes that are relevant for risk assessment and operational requirements are given. 10 refs., 3 figs., 2 tabs

  18. Computer-assisted cognitive remediation therapy in schizophrenia: Durability of the effects and cost-utility analysis.

    Science.gov (United States)

    Garrido, Gemma; Penadés, Rafael; Barrios, Maite; Aragay, Núria; Ramos, Irene; Vallès, Vicenç; Faixa, Carlota; Vendrell, Josep M

    2017-08-01

    The durability of computer-assisted cognitive remediation (CACR) therapy over time and the cost-effectiveness of treatment remain unclear. The aim of the current study is to investigate the effectiveness of CACR and to examine the use and cost of acute psychiatric admissions before and after CACR. Sixty-seven participants were initially recruited. For the follow-up study a total of 33 participants were enrolled, 20 in the CACR condition group and 13 in the active control condition group. All participants were assessed at baseline, post-therapy and 12 months post-therapy on neuropsychology, QoL and self-esteem measurements. The use and cost of acute psychiatric admissions were collected retrospectively at four assessment points: baseline, 12 months post-therapy, 24 months post-therapy, and 36 months post-therapy. The results indicated that treatment effectiveness persisted in the CACR group one year post-therapy on neuropsychological and well-being outcomes. The CACR group showed a clear decrease in the use of acute psychiatric admissions at 12, 24 and 36 months post-therapy, which lowered the global costs of acute psychiatric admissions at 12, 24 and 36 months post-therapy. The effects of CACR are durable over at least a 12-month period, and CACR may help to reduce health care costs for schizophrenia patients. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  19. Personal computers in high energy physics

    International Nuclear Information System (INIS)

    Quarrie, D.R.

    1987-01-01

    The role of personal computers within HEP is expanding as their capabilities increase and their cost decreases. Already they offer greater flexibility than many low-cost graphics terminals for a comparable cost and in addition they can significantly increase the productivity of physicists and programmers. This talk will discuss existing uses for personal computers and explore possible future directions for their integration into the overall computing environment. (orig.)

  20. Chest Computed Tomographic Image Screening for Cystic Lung Diseases in Patients with Spontaneous Pneumothorax Is Cost Effective.

    Science.gov (United States)

    Gupta, Nishant; Langenderfer, Dale; McCormack, Francis X; Schauer, Daniel P; Eckman, Mark H

    2017-01-01

    Patients without a known history of lung disease presenting with a spontaneous pneumothorax are generally diagnosed as having primary spontaneous pneumothorax. However, occult diffuse cystic lung diseases such as Birt-Hogg-Dubé syndrome (BHD), lymphangioleiomyomatosis (LAM), and pulmonary Langerhans cell histiocytosis (PLCH) can also first present with a spontaneous pneumothorax, and their early identification by high-resolution computed tomographic (HRCT) chest imaging has implications for subsequent management. The objective of our study was to evaluate the cost-effectiveness of HRCT chest imaging to facilitate early diagnosis of LAM, BHD, and PLCH. We constructed a Markov state-transition model to assess the cost-effectiveness of screening HRCT to facilitate early diagnosis of diffuse cystic lung diseases in patients presenting with an apparent primary spontaneous pneumothorax. Baseline data for prevalence of BHD, LAM, and PLCH and rates of recurrent pneumothoraces in each of these diseases were derived from the literature. Costs were extracted from 2014 Medicare data. We compared a strategy of HRCT screening followed by pleurodesis in patients with LAM, BHD, or PLCH versus conventional management with no HRCT screening. In our base case analysis, screening for the presence of BHD, LAM, or PLCH in patients presenting with a spontaneous pneumothorax was cost effective, with a marginal cost-effectiveness ratio of $1,427 per quality-adjusted life-year gained. Sensitivity analysis showed that screening HRCT remained cost effective for a diffuse cystic lung disease prevalence as low as 0.01%. HRCT image screening for BHD, LAM, and PLCH in patients with apparent primary spontaneous pneumothorax is cost effective. Clinicians should consider performing a screening HRCT in patients presenting with apparent primary spontaneous pneumothorax.

  1. Computational Intelligence in Software Cost Estimation: Evolving Conditional Sets of Effort Value Ranges

    OpenAIRE

    Papatheocharous, Efi; Andreou, Andreas S.

    2008-01-01

    In this approach we aimed at addressing the problem of large variances found in available historical data that are used in software cost estimation. Project data is expensive to collect, manage and maintain. Therefore, if we wish to lower the dependence of the estimation to

  2. Robotic lower limb prosthesis design through simultaneous computer optimizations of human and prosthesis costs

    Science.gov (United States)

    Handford, Matthew L.; Srinivasan, Manoj

    2016-02-01

    Robotic lower limb prostheses can improve the quality of life for amputees. Development of such devices, currently dominated by long prototyping periods, could be sped up by predictive simulations. In contrast to some amputee simulations which track experimentally determined non-amputee walking kinematics, here, we explicitly model the human-prosthesis interaction to produce a prediction of the user’s walking kinematics. We obtain simulations of an amputee using an ankle-foot prosthesis by simultaneously optimizing human movements and prosthesis actuation, minimizing a weighted sum of human metabolic and prosthesis costs. The resulting Pareto optimal solutions predict that increasing prosthesis energy cost, decreasing prosthesis mass, and allowing asymmetric gaits all decrease human metabolic rate for a given speed and alter human kinematics. The metabolic rates increase monotonically with speed. Remarkably, by performing an analogous optimization for a non-amputee human, we predict that an amputee walking with an appropriately optimized robotic prosthesis can have a lower metabolic cost - even lower than assuming that the non-amputee’s ankle torques are cost-free.
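
    Weighted-sum scanning is the simplest way to trace such a Pareto front: repeatedly minimize w*J_human + (1-w)*J_prosthesis over a sweep of weights w. The two cost functions below are invented convex surrogates, not the paper's metabolic and actuator models.

```python
import numpy as np
from scipy.optimize import minimize

# x = [step_length, prosthesis_peak_torque] -- illustrative variables.
def human_metabolic(x):     # invented surrogate: more torque helps the human
    return (x[0] - 0.8) ** 2 + 2.0 / (0.1 + x[1])

def prosthesis_cost(x):     # invented surrogate: torque costs the actuator
    return 0.5 * x[1] ** 2

front = []
for w in np.linspace(0.05, 0.95, 10):
    obj = lambda x, w=w: w * human_metabolic(x) + (1 - w) * prosthesis_cost(x)
    res = minimize(obj, x0=[0.7, 1.0], bounds=[(0.4, 1.2), (0.0, 3.0)])
    front.append((human_metabolic(res.x), prosthesis_cost(res.x)))

for h, p in front:
    print(f"human {h:.3f}  prosthesis {p:.3f}")   # one Pareto point per weight
```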

  3. Quantitative evaluation of low-cost frame-grabber boards for personal computers.

    Science.gov (United States)

    Kofler, J M; Gray, J E; Fuelberth, J T; Taubel, J P

    1995-11-01

    Nine moderately priced frame-grabber boards for both Macintosh (Apple Computers, Cupertino, CA) and IBM-compatible computers were evaluated using a Society of Motion Pictures and Television Engineers (SMPTE) pattern and a video signal generator for dynamic range, gray-scale reproducibility, and spatial integrity of the captured image. The degradation of the video information ranged from minor to severe. Some boards are of reasonable quality for applications in diagnostic imaging and education. However, price and quality are not necessarily directly related.

  4. Cost-effectiveness of computed tomographic colonography screening for colorectal cancer in the medicare population

    NARCIS (Netherlands)

    A.B. Knudsen (Amy); I. Lansdorp-Vogelaar (Iris); C.M. Rutter (Carolyn); J.E. Savarino (James); M. van Ballegooijen (Marjolein); K.M. Kuntz (Karen); A. Zauber (Ann)

    2010-01-01

    textabstractBackground The Centers for Medicare and Medicaid Services (CMS) considered whether to reimburse computed tomographic colonography (CTC) for colorectal cancer screening of Medicare enrollees. To help inform its decision, we evaluated the reimbursement rate at which CTC screening could be

  5. Cost analysis of hash collisions : will quantum computers make SHARCS obsolete?

    NARCIS (Netherlands)

    Bernstein, D.J.

    2009-01-01

    Current proposals for special-purpose factorization hardware will become obsolete if large quantum computers are built: the number-field sieve scales much more poorly than Shor's quantum algorithm for factorization. Will all special-purpose cryptanalytic hardware become obsolete in a post-quantum

  6. Low cost SCR lamp driver indicates contents of digital computer registers

    Science.gov (United States)

    Cliff, R. A.

    1967-01-01

    Silicon Controlled Rectifier /SCR/ lamp driver is adapted for use in integrated circuit digital computers where it indicates the contents of the various registers. The threshold voltage at which visual indication begins is very sharply defined and can be adjusted to suit particular system requirements.

  7. The use of 3D CADD (Computer Aided Design and Drafting) models in operation and maintenance cost reduction

    International Nuclear Information System (INIS)

    Didsbury, R.; Bains, N.; Cho, U.Y.

    1998-01-01

    The use of three-dimensional (3D) computer-aided design and drafting (CADD) models, and the associated information technology and databases, in the engineering and construction phases of large projects is well established and yielding significant improvements in project cost, schedule and quality. The information contained in these models can also be extremely valuable to operating plants, particularly when the visual and spatial information contained in the 3D models is interfaced to other plant information databases. Indeed many plant owners and operators in the process and power industries are already using this technology to assist with such activities as plant configuration management, staff training, work planning and radiation protection. This paper will explore the application of 3D models and the associated databases in an operating plant environment and describe the resulting operational and cost reduction benefits. Several industrial experience case studies will be presented along with suggestions for future applications. (author). 4 refs., 1 tab., 8 figs

  8. User's guide to SERICPAC: A computer program for calculating electric-utility avoided costs rates

    Energy Technology Data Exchange (ETDEWEB)

    Wirtshafter, R.; Abrash, M.; Koved, M.; Feldman, S.

    1982-05-01

    SERICPAC is a computer program developed to calculate average avoided-cost rates for decentralized power producers and cogenerators that sell electricity to electric utilities. SERICPAC works in tandem with SERICOST, a program to calculate avoided costs, and determines the appropriate rates for the buying and selling of electricity between electric utilities and qualifying facilities (QF), as stipulated under Section 210 of PURPA. SERICPAC contains simulation models for eight technologies, including wind, hydro, biogas, and cogeneration. The simulations are converted into a diversified utility production profile, which can be either gross production or net production; the latter accounts for internal electricity usage by the QF. The program allows the production to be adjusted for scheduled and forced outages. The final output of the model is a technology-specific average annual rate. The report contains a description of the technologies and the simulations as well as a complete user's guide to SERICPAC.

  9. Costs associated with implementation of computer-assisted clinical decision support system for antenatal and delivery care: case study of Kassena-Nankana district of northern Ghana.

    Science.gov (United States)

    Dalaba, Maxwell Ayindenaba; Akweongo, Patricia; Williams, John; Saronga, Happiness Pius; Tonchev, Pencho; Sauerborn, Rainer; Mensah, Nathan; Blank, Antje; Kaltschmidt, Jens; Loukanova, Svetla

    2014-01-01

    This study analyzed the cost of implementing a computer-assisted Clinical Decision Support System (CDSS) in selected health care centres in Ghana. A descriptive cross-sectional study was conducted in the Kassena-Nankana district (KND). CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention) were collected for the period 2009-2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, training, overheads (recurrent costs) and equipment costs (capital costs). We calculated cost without annualizing capital costs to represent financial cost, and cost with annualized capital costs to represent economic cost. Twenty-two trained CDSS users (at least 2 users per health centre) participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64%) and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death). The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272) was pre-intervention cost and 52% (US$12,044) intervention cost. Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917). When economic cost was considered, the total cost of implementation was US$17,128, 26.5% lower than the financial cost. The study provides useful information on the implementation of CDSS at health facilities to enhance health workers' adherence to practice guidelines and taking accurate decisions to improve
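
    The financial/economic distinction above hinges on annualizing capital outlays with the standard annuity factor. In the sketch below, the useful life and discount rate are assumptions; only the equipment figure is taken from the abstract.

```python
def annualized_capital(capital, rate=0.03, useful_life_years=5):
    """Equivalent annual cost of a capital outlay: the annuity that
    repays `capital` over its useful life at the given discount rate."""
    annuity_factor = (1 - (1 + rate) ** -useful_life_years) / rate
    return capital / annuity_factor

# Equipment cost from the abstract, with an assumed life and rate:
print(round(annualized_capital(7917, 0.03, 5)))  # ~1729 USD per year
```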

  10. Development and implementation of a low cost micro computer system for LANDSAT analysis and geographic data base applications

    Science.gov (United States)

    Faust, N.; Jordon, L.

    1981-01-01

    Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970s, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Programs designed to process LANDSAT data for use as one element in a geographic data base were employed once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer. Programs for training-field selection, supervised and unsupervised classification, and image enhancement were added. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color-infrared format. The basic microcomputer hardware needed to run NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.

  11. The Department of Defense and the Power of Cloud Computing: Weighing Acceptable Cost Versus Acceptable Risk

    Science.gov (United States)

    2016-04-01

    [Fragmentary record; the extracted front matter lists figures 'PAAS component stack and scope of control' and 'SAAS component stack and scope of control'.] …platform as a service (PAAS), or software as a service (SAAS). Regardless of the model or service selected, the process of implementing a cloud-computing environment… Together, these build on each other, providing more service to the customer while limiting customers' abilities to operate, maintain, and…

  12. A low-cost, low-energy tangible programming system for computer illiterates in developing regions

    CSIR Research Space (South Africa)

    Smith, Andrew C

    2008-07-01

    Full Text Available The approach is to first develop, in the illiterate population, the cognitive process of logical thinking required in the IT field. Having developed this ability, the illiterate person has a tool for potentially controlling a number of objects… functionality. Because of these tangible and visual properties, the cognitive burden on the user is reduced compared with text-only input systems. We therefore hypothesise that our input devices are well suited for computer-illiterate people.

  13. Cloud Computing and Security Issues

    OpenAIRE

    Rohan Jathanna; Dhanamma Jagli

    2017-01-01

    Cloud computing has become one of the most interesting topics in the IT world today. The cloud model of computing has changed the landscape of computing, as its promises of greater reliability, massive scalability, and decreased costs have attracted businesses and individuals alike. It adds capabilities to information technology. Over the last few years, cloud computing has grown considerably in information technology. As more and more information of individuals and compan...

  14. Cost-optimized configuration of computing instances for large sized cloud systems

    Directory of Open Access Journals (Sweden)

    Woo-Chan Kim

    2017-09-01

    Full Text Available Cloud computing services are becoming more popular for various reasons, which include 'having no need for capital expenditure' and 'the ability to quickly meet business demands'. However, what seems to be an attractive option may become a substantial expenditure as more projects are moved into the cloud. Cloud service companies provide different pricing options to their customers that can potentially lower the customers' spending on the cloud. Choosing the right combination of pricing options can be formulated as a mixed-integer linear programming problem, which can be solved with standard optimization solvers.
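
    A toy version of that formulation: cover a known hourly demand profile with reserved instances (an integer count, paid for every hour) plus on-demand capacity per hour, minimizing total spend. The prices and demand profile are invented; the example assumes SciPy >= 1.9 for scipy.optimize.milp.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

hours = 24                                    # one illustrative day
demand = 8 + 6 * (np.arange(hours) >= 9) * (np.arange(hours) <= 18)

r_price, od_price = 0.06, 0.10                # $/instance-hour (invented)
# Variables: x[0] = reserved count (integer), x[1:] = on-demand per hour.
c = np.concatenate(([r_price * hours], od_price * np.ones(hours)))

# Coverage: reserved + on_demand[h] >= demand[h] for every hour.
A = np.hstack((np.ones((hours, 1)), np.eye(hours)))
cons = LinearConstraint(A, lb=demand)
integrality = np.concatenate(([1], np.zeros(hours)))  # only x[0] integer

res = milp(c, constraints=cons, integrality=integrality,
           bounds=Bounds(lb=0))
print(res.x[0], res.fun)   # reserved count (8 here) and total daily cost
```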

  15. Computed radiography and the workstation in a study of the cervical spine. Technical and cost implications

    International Nuclear Information System (INIS)

    Garcia, J. M.; Lopez-Galiacho, N.; Martinez, M.

    1999-01-01

    To demonstrate the advantages of computed radiography and the workstation in assessing the images acquired in a study of the cervical spine. Lateral projections of the cervical spine obtained using a computed radiography system in 63 ambulatory patients were studied in a workstation. Images of the tip of the odontoid process, C1-C2, basion-opisthion and C7 were visualized prior to and after their transmission and processing, and the overall improvement in their diagnostic quality was assessed. The rate of detection of the tip of the odontoid process, C1-C2, the foramen magnum and C7 increased by 17, 6, 11 and 14 percentage points, respectively. Image processing improved the diagnostic quality in over 75% of cases. Image processing in a workstation improved the visualization of the anatomical points being studied and the diagnostic quality of the images. These advantages as well as the possibility of transferring the images to a picture archiving and communication system (PACS) are convincing reasons for using digital radiography. (Author) 7 refs

  16. Clinical and cost effectiveness of computer treatment for aphasia post stroke (Big CACTUS): study protocol for a randomised controlled trial.

    Science.gov (United States)

    Palmer, Rebecca; Cooper, Cindy; Enderby, Pam; Brady, Marian; Julious, Steven; Bowen, Audrey; Latimer, Nicholas

    2015-01-27

    Aphasia affects the ability to speak, comprehend spoken language, read and write. One third of stroke survivors experience aphasia. Evidence suggests that aphasia can continue to improve after the first few months with intensive speech and language therapy, which is frequently beyond what resources allow. The development of computer software for language practice provides an opportunity for self-managed therapy. This pragmatic randomised controlled trial will investigate the clinical and cost effectiveness of a computerised approach to long-term aphasia therapy post stroke. A total of 285 adults with aphasia at least four months post stroke will be randomly allocated to either usual care, computerised intervention in addition to usual care or attention and activity control in addition to usual care. Those in the intervention group will receive six months of self-managed word finding practice on their home computer with monthly face-to-face support from a volunteer/assistant. Those in the attention control group will receive puzzle activities, supplemented by monthly telephone calls. Study delivery will be coordinated by 20 speech and language therapy departments across the United Kingdom. Outcome measures will be made at baseline, six, nine and 12 months after randomisation by blinded speech and language therapist assessors. Primary outcomes are the change in number of words (of personal relevance) named correctly at six months and improvement in functional conversation. Primary outcomes will be analysed using a Hochberg testing procedure. Significance will be declared if differences in both word retrieval and functional conversation at six months are significant at the 5% level, or if either comparison is significant at 2.5%. A cost utility analysis will be undertaken from the NHS and personal social service perspective. Differences between costs and quality-adjusted life years in the three groups will be described and the incremental cost effectiveness ratio
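
    The gatekeeping rule quoted above is the two-hypothesis Hochberg step-up procedure. A sketch of the decision rule alone, using the alpha levels stated in the protocol:

```python
def hochberg_two(p_word, p_conv, alpha=0.05):
    """Hochberg step-up for two primary outcomes: declare both
    significant if the larger p-value is below alpha; otherwise
    declare only the smaller one significant if it is below alpha/2."""
    p_large, p_small = max(p_word, p_conv), min(p_word, p_conv)
    if p_large < alpha:
        return {"word_retrieval": True, "conversation": True}
    return {
        "word_retrieval": p_word == p_small and p_small < alpha / 2,
        "conversation":   p_conv == p_small and p_small < alpha / 2,
    }

print(hochberg_two(0.04, 0.03))   # both pass at the 5% level
print(hochberg_two(0.20, 0.02))   # only the smaller passes at 2.5%
```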

  17. Operational technology for greater confinement disposal

    International Nuclear Information System (INIS)

    Dickman, P.T.; Vollmer, A.T.; Hunter, P.H.

    1984-12-01

    Procedures and methods for the design and operation of a greater confinement disposal facility using large-diameter boreholes are discussed. It is assumed that the facility would be located at an operating low-level waste disposal site and that only a small portion of the wastes received at the site would require greater confinement disposal. The document is organized into sections addressing: facility planning process; facility construction; waste loading and handling; radiological safety planning; operations procedures; and engineering cost studies. While primarily written for low-level waste management site operators and managers, a detailed economic assessment section is included that should assist planners in performing cost analyses. Economic assessments for both commercial and US government greater confinement disposal facilities are included. The estimated disposal costs range from $27 to $104 per cubic foot for a commercial facility and from $17 to $60 per cubic foot for a government facility. These costs are based on average site preparation, construction, and waste loading costs for both contact- and remote-handled wastes. 14 figures, 22 tables

  18. Potentially Low Cost Solution to Extend Use of Early Generation Computed Tomography

    Directory of Open Access Journals (Sweden)

    Tonna, Joseph E

    2010-12-01

    Full Text Available In preparing a case report on Brown-Séquard syndrome for publication, we made the incidental finding that the inexpensive, commercially available three-dimensional (3D) rendering software we were using could produce high quality 3D spinal cord reconstructions from any series of two-dimensional (2D) computed tomography (CT) images. This finding raises the possibility that spinal cord imaging capabilities can be expanded where bundled 2D multi-planar reformats and 3D reconstruction software for CT are not available and in situations where magnetic resonance imaging (MRI) is either not available or appropriate (e.g. metallic implants). Given the worldwide burden of trauma and considering the limited availability of MRI and advanced generation CT scanners, we propose an alternative, potentially useful approach to imaging the spinal cord that might be useful in areas where technical capabilities and support are limited. [West J Emerg Med. 2010;11(5):463-466.]
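
    The underlying operation, stacking 2D slices into a volume and extracting an isosurface, is also available in open-source tooling. The sketch below uses scikit-image's marching cubes; the file layout and the intensity threshold are hypothetical.

```python
import glob
import numpy as np
import imageio.v3 as iio
from skimage import measure

# Stack 2D CT slices (hypothetical file layout) into a 3D volume.
slices = [iio.imread(f) for f in sorted(glob.glob("ct_slices/*.png"))]
volume = np.stack(slices).astype(np.float32)

# Extract an isosurface at a hypothetical threshold for 8-bit slice data.
verts, faces, normals, values = measure.marching_cubes(volume, level=128.0)
print(verts.shape, faces.shape)   # triangle mesh, ready for any 3D viewer
```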

  19. Computing Cost Price by Using Activity Based Costing (ABC Method in Dialysis Ward of Shahid Rajaei Medical & Education Center, in Alborz University of Medical Sciences Karaj in 2015

    Directory of Open Access Journals (Sweden)

    H. Derafshi

    2016-08-01

    Full Text Available Background: Analysis of hospital costs is a key input for resource allocation, and activity-based costing is an applicable tool for determining them accurately. The aim of this study is to use the activity-based costing method to estimate the cost of the dialysis unit of Shahid Rajaei hospital, Alborz University of Medical Sciences, in 2015. Methods: This is an applied, cross-sectional descriptive study. The required data were collected from the dialysis unit, the accounting unit, discharge records, and the medical equipment inventory of Shahid Rajaei hospital for the first six months of 2015, and costs were calculated in Excel. Results and Conclusion: In a typical month, an average of 1,238 patients were accepted for dialysis services at Shahid Rajaei hospital. Consumable materials accounted for 47.6% of allocated costs, the largest share; the smallest, about 2.27%, related to insurance deductions. After calculating the various costs of dialysis services, we find that personnel costs cover only 32% of the total, and other ongoing overhead costs are about 11.94%. Each dialysis session therefore costs 2,017,131 rials, whereas the tariff per session is 1,838,871 rials, so the centre loses 178,260 rials per session. The results show that the cost of providing dialysis services exceeds the revenue in Shahid Rajaei hospital. Reforming consumables procurement and revising chronic dialysis tariffs, especially for filter sets and consumable materials, together with controlling human resource costs, could decrease the unit's costs; in light of the results, using the capacity of the private sector is recommended.
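
    Activity-based costing allocates indirect costs to services through activity cost drivers rather than a flat overhead rate. The sketch below shows only the mechanics; the activity pools, driver volumes and amounts are invented, not the study's figures.

```python
# Two activity pools and their annual cost drivers (invented figures).
pools = {
    "nursing_care": {"cost": 400_000.0, "driver": "nurse_hours"},
    "machine_time": {"cost": 250_000.0, "driver": "machine_hours"},
}
# Annual driver consumption per service line (also invented).
usage = {
    "chronic_dialysis": {"nurse_hours": 18_000, "machine_hours": 24_000, "sessions": 12_000},
    "acute_dialysis":   {"nurse_hours":  6_000, "machine_hours":  4_000, "sessions":  2_800},
}

# Rate per driver unit = pool cost / total driver volume across services.
totals = {p["driver"]: sum(u[p["driver"]] for u in usage.values()) for p in pools.values()}
rates = {name: p["cost"] / totals[p["driver"]] for name, p in pools.items()}

for service, u in usage.items():
    overhead = sum(rates[name] * u[pools[name]["driver"]] for name in pools)
    print(service, round(overhead / u["sessions"], 2), "overhead per session")
```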

  20. Leveraging Cloud Computing to Improve Storage Durability, Availability, and Cost for MER Maestro

    Science.gov (United States)

    Chang, George W.; Powell, Mark W.; Callas, John L.; Torres, Recaredo J.; Shams, Khawaja S.

    2012-01-01

    The Maestro for MER (Mars Exploration Rover) software is the premiere operation and activity planning software for the Mars rovers, and it is required to deliver all of the processed image products to scientists on demand. These data span multiple storage arrays sized at 2 TB, and a backup scheme ensures data is not lost. In a catastrophe, these data would currently recover at 20 GB/hour, taking several days for a restoration. A seamless solution provides access to highly durable, highly available, scalable, and cost-effective storage capabilities. This approach also employs a novel technique that enables storage of the majority of data on the cloud and some data locally. This feature is used to store the most recent data locally in order to guarantee utmost reliability in case of an outage or disconnect from the Internet. This also obviates any changes to the software that generates the most recent data set as it still has the same interface to the file system as it did before updates

  1. Providing a Spanish interpreter using low-cost videoconferencing in a community study computers

    Directory of Open Access Journals (Sweden)

    James L Wofford

    2013-03-01

    Full Text Available Background The advent of more mobile, more reliable, and more affordable videoconferencing technology finally makes it realistic to offer remote foreign language interpretation in the office setting. Still, such technologies deserve proof of acceptability to clinicians and patients before there is widespread acceptance and routine use.Objective We sought to examine: (1 the audio and video technical fidelity of iPad/Facetime software, (2 the acceptability of videoconferencing to patients and clinicians.Methods The convenience sample included Spanish-speaking adult patients at a community health care medicine clinic in 2011. Videoconferencing was conducted using two iPads connecting patient/physician located in the clinic examination room, and the interpreter in a remote/separate office in the same building. A five-item survey was used to solicit opinions on overall quality of the videoconferencing device, audio/video integrity/fidelity, perception of encounter duration, and attitude toward future use.Results Twenty-five patients, 18 clinicians and 5 interpreters participated in the project. Most patients (24/25 rated overall quality of videoconferencing as good/excellent with only 1 ‘fair’ rating. Eleven patients rated the amount of time as no longer than in-person, and nine reported it as shorter than in person. Most patients, 94.0% (24/25, favoured using videoconferencing during future visits. For the 18 clinicians, the results were similar.Conclusions Based on our experience at a single-site community health centre, the videoconferencing technology appeared to be flawless, and both patients and clinicians were satisfied. Expansion of videoconferencing to other off-site healthcare professionals should be considered in the search for more cost-effective healthcare.

  2. Modeling the economic costs of disasters and recovery: analysis using a dynamic computable general equilibrium model

    Science.gov (United States)

    Xie, W.; Li, N.; Wu, J.-D.; Hao, X.-L.

    2014-04-01

    Disaster damages have negative effects on the economy, whereas reconstruction investment has positive effects. The aim of this study is to model the economic costs of disasters and recovery, including the positive effects of reconstruction activities. The computable general equilibrium (CGE) model is a promising approach because it can incorporate these two kinds of shocks into a unified framework and, furthermore, avoid the double-counting problem. In order to factor both shocks into the CGE model, direct loss is set as the amount of capital stock reduced on the supply side of the economy; a portion of investments restores the capital stock in a given period; an investment-driven dynamic model is formulated according to available reconstruction data, and the rest of the country's saving is set as an endogenous variable to balance the fixed investment. The 2008 Wenchuan Earthquake is selected as a case study to illustrate the model, and three scenarios are constructed: S0 (no disaster occurs), S1 (disaster occurs with reconstruction investment) and S2 (disaster occurs without reconstruction investment). S0 is taken as business as usual, and the differences between S1 and S0 and between S2 and S0 can be interpreted as economic losses including and excluding reconstruction, respectively. Output from S1 is found to be closer to real data than that from S2, and economic loss under S2 is roughly 1.5 times that under S1. The gap in the economic aggregate between S1 and S0 is reduced to 3% at the end of the government-led reconstruction activity, a level that should take another four years to achieve under S2.

  3. Radioimmunoassay evaluation and quality control by use of a simple computer program for a low cost desk top calculator

    International Nuclear Information System (INIS)

    Schwarz, S.

    1980-01-01

    A simple computer program for the data processing and quality control of radioimmunoassays is presented. It is written for a low-cost programmable desktop calculator (Hewlett-Packard 97), which can be afforded by smaller laboratories. The untreated counts from the scintillation spectrometer are entered manually; the printout gives the following results: initial data, logit-log transformed calibration points, parameters of goodness of fit and of the position of the standard curve, and control and unknown sample dose estimates (mean value from single-dose interpolations and scatter of replicates), together with the automatic calculation of within-assay variance and, by use of magnetic cards holding the control parameters of all previous assays, between-assay variance. (orig.)
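
    The logit-log transform mentioned here linearizes the sigmoid standard curve: logit(B/B0) is regressed on log(dose), and unknowns are read off by inverting the fit. A sketch with invented calibration counts follows; B0 and the non-specific binding are assumed values.

```python
import numpy as np

# Invented standard curve: dose (ng/mL) and bound counts; B0 = counts
# at zero dose, nsb = non-specific binding.
dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
counts = np.array([9200, 7600, 5100, 2700, 1300], dtype=float)
B0, nsb = 10_000.0, 400.0

y = (counts - nsb) / (B0 - nsb)              # normalized bound fraction
logit = np.log(y / (1 - y))
slope, intercept = np.polyfit(np.log(dose), logit, 1)

def interpolate(count):
    """Dose estimate for an unknown sample from its counts,
    by inverting the fitted logit-log line."""
    y = (count - nsb) / (B0 - nsb)
    return np.exp((np.log(y / (1 - y)) - intercept) / slope)

print(interpolate(4000.0))   # dose estimate for an unknown sample
```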

  4. TOWARDS A LOW-COST, REAL-TIME PHOTOGRAMMETRIC LANDSLIDE MONITORING SYSTEM UTILISING MOBILE AND CLOUD COMPUTING TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    P. Chidburee

    2016-06-01

    Full Text Available Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard

  5. Towards a Low-Cost Real-Time Photogrammetric Landslide Monitoring System Utilising Mobile and Cloud Computing Technology

    Science.gov (United States)

    Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.

    2016-06-01

    Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.

  6. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    International Nuclear Information System (INIS)

    Wattson, Daniel A.; Hunink, M.G. Myriam; DiPiro, Pamela J.; Das, Prajnan; Hodgson, David C.; Mauch, Peter M.; Ng, Andrea K.

    2014-01-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  7. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wattson, Daniel A., E-mail: dwattson@partners.org [Harvard Radiation Oncology Program, Boston, Massachusetts (United States); Hunink, M.G. Myriam [Departments of Radiology and Epidemiology, Erasmus Medical Center, Rotterdam, the Netherlands and Center for Health Decision Science, Harvard School of Public Health, Boston, Massachusetts (United States); DiPiro, Pamela J. [Department of Imaging, Dana-Farber Cancer Institute, Boston, Massachusetts (United States); Das, Prajnan [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Hodgson, David C. [Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); Mauch, Peter M.; Ng, Andrea K. [Department of Radiation Oncology, Brigham and Women's Hospital and Dana-Farber Cancer Institute, Boston, Massachusetts (United States)

    2014-10-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  8. Low-dose chest computed tomography for lung cancer screening among Hodgkin lymphoma survivors: a cost-effectiveness analysis.

    Science.gov (United States)

    Wattson, Daniel A; Hunink, M G Myriam; DiPiro, Pamela J; Das, Prajnan; Hodgson, David C; Mauch, Peter M; Ng, Andrea K

    2014-10-01

    Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening may be cost effective for all smokers but possibly not
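
    The decision rule in this kind of analysis is compact enough to sketch. The snippet below shows, under entirely illustrative inputs (the cost and utility streams are placeholders, not the parameters of the model above), how discounted incremental costs and QALYs combine into an ICER that is compared against the $50,000/QALY willingness-to-pay threshold and 3% discount rate quoted in the abstract:

```python
# Minimal sketch of the decision rule: discounted incremental cost and
# QALYs of screening vs no screening, compared against a willingness-to-pay
# threshold. All cost/utility streams below are illustrative placeholders,
# not the parameters of the model described in the abstract.

DISCOUNT = 0.03        # annual discount rate (as in the abstract)
WTP = 50_000           # willingness-to-pay threshold, $/QALY

def present_value(stream, rate=DISCOUNT):
    """Discount an annual stream of costs or QALYs to present value."""
    return sum(x / (1 + rate) ** t for t, x in enumerate(stream))

years = 30
cost_no_screen = [600.0] * years          # expected lung-cancer costs per year
qaly_no_screen = [0.800] * years          # expected utility per year
cost_screen    = [850.0] * years          # + annual LDCT and workup costs
qaly_screen    = [0.806] * years          # hypothetical small utility gain

d_cost = present_value(cost_screen) - present_value(cost_no_screen)
d_qaly = present_value(qaly_screen) - present_value(qaly_no_screen)
icer = d_cost / d_qaly

print(f"Incremental cost ${d_cost:,.0f}, incremental QALYs {d_qaly:.3f}")
verdict = "cost effective" if icer < WTP else "not cost effective"
print(f"ICER: ${icer:,.0f}/QALY -> {verdict} at ${WTP:,}/QALY")
```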

  9. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    Science.gov (United States)

    Gaydos, Leonard

    1978-01-01

    Computer-aided techniques for interpreting multispectral data acquired by Landsat offer economies in the mapping of land cover. Even so, the actual establishment of the statistical classes, or "signatures," is one of the relatively more costly operations involved. Analysts have therefore been seeking cost-saving signature extension techniques that would accept training data acquired for one time or place and apply them to another. Opportunities to extend signatures occur in preprocessing steps and in the classification steps that follow. In the present example, land cover classes were derived by the simplest and most direct form of signature extension: Classes statistically derived from a Landsat scene for the Puget Sound area, Wash., were applied to the Portland area, Oreg., using data for the next Landsat scene acquired less than 25 seconds down orbit. Many features can be recognized on the reduced-scale version of the Portland land cover map shown in this report, although no statistical assessment of its accuracy is available.
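
    Signature extension as described here amounts to reusing per-class statistics from one scene on another. A minimal sketch, with synthetic four-band pixels standing in for Landsat data; all class names and numbers are invented for illustration:

```python
# Sketch of "signature extension": Gaussian maximum-likelihood signatures
# (per-class means and covariances over four spectral bands) derived from
# one scene are applied unchanged to pixels of an adjacent scene. All class
# names and numbers are invented; real work would use actual Landsat data.

import numpy as np

rng = np.random.default_rng(0)

# Signatures statistically derived from scene A (e.g., Puget Sound):
signatures = {
    "water":  (np.array([12.0,  8.0,  5.0,  3.0]), np.eye(4) * 2.0),
    "forest": (np.array([20.0, 18.0, 35.0, 40.0]), np.eye(4) * 4.0),
    "urban":  (np.array([35.0, 33.0, 30.0, 28.0]), np.eye(4) * 6.0),
}

def ml_classify(pixel, sigs):
    """Assign the class maximizing the Gaussian log-likelihood."""
    best, best_ll = None, -np.inf
    for name, (mu, cov) in sigs.items():
        diff = pixel - mu
        ll = -0.5 * (np.log(np.linalg.det(cov)) + diff @ np.linalg.solve(cov, diff))
        if ll > best_ll:
            best, best_ll = name, ll
    return best

# Pixels from "scene B" (the next scene down orbit), classified with the
# scene-A signatures without re-training:
scene_b = rng.normal([20.0, 18.0, 35.0, 40.0], 2.0, size=(5, 4))
print([ml_classify(px, signatures) for px in scene_b])
```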

  10. Computed tomography for preoperative planning in minimal-invasive total hip arthroplasty: Radiation exposure and cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huppertz, Alexander, E-mail: Alexander.Huppertz@charite.de [Imaging Science Institute Charite Berlin, Robert-Koch-Platz 7, D-10115 Berlin (Germany); Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Radmer, Sebastian, E-mail: s.radmer@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany); Asbach, Patrick, E-mail: Patrick.Asbach@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Juran, Ralf, E-mail: ralf.juran@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Schwenke, Carsten, E-mail: carsten.schwenke@scossis.de [Biostatistician, Scossis Statistical Consulting, Zeltinger Str. 58G, D-13465 Berlin (Germany); Diederichs, Gerd, E-mail: gerd.diederichs@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Hamm, Bernd, E-mail: Bernd.Hamm@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Sparmann, Martin, E-mail: m.sparmann@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany)

    2011-06-15

    Computed tomography (CT) was used for preoperative planning of minimal-invasive total hip arthroplasty (THA). 92 patients (50 males, 42 females, mean age 59.5 years) with a mean body-mass-index (BMI) of 26.5 kg/m² underwent 64-slice CT to depict the pelvis, the knee and the ankle in three independent acquisitions using combined x-, y-, and z-axis tube current modulation. Arthroplasty planning was performed using 3D-Hip Plan (Symbios, Switzerland) and patient radiation dose exposure was determined. The effects of BMI, gender, and contralateral THA on the effective dose were evaluated by an analysis-of-variance. A process-cost-analysis from the hospital perspective was done. All CT examinations were of sufficient image quality for 3D-THA planning. A mean effective dose of 4.0 mSv (SD 0.9 mSv) modeled by the BMI (p < 0.0001) was calculated. The presence of a contralateral THA (9/92 patients; p = 0.15) and the difference between males and females were not significant (p = 0.08). Personnel involved were the radiologist (4 min), the surgeon (16 min), the radiographer (12 min), and administrative personnel (4 min). A CT operation time of 11 min and direct per-patient costs of 52.80 Euro were recorded. Preoperative CT for THA was associated with a slight and justifiable increase of radiation exposure in comparison to conventional radiographs and low per-patient costs.

  11. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    International Nuclear Information System (INIS)

    Zachariadou, K; Yiasemides, K; Trougkakos, N

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in the C# and C++ programming languages, respectively, using the IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff. (paper)

  12. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    Energy Technology Data Exchange (ETDEWEB)

    Zachariadou, K; Yiasemides, K; Trougkakos, N [Technological Educational Institute of Piraeus, P Ralli and Thivon 250, 12244 Egaleo (Greece)

    2012-11-15

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in the C# and C++ programming languages, respectively, using the IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments' academic staff. (paper)

  13. Beat-ID: Towards a computationally low-cost single heartbeat biometric identity check system based on electrocardiogram wave morphology

    Science.gov (United States)

    Paiva, Joana S.; Dias, Duarte

    2017-01-01

    In recent years, safer and more reliable biometric methods have been developed. Apart from the need for enhanced security, the media and entertainment sectors have also been applying biometrics in the emerging market of user-adaptable objects/systems to make these systems more user-friendly. However, the complexity of some state-of-the-art biometric systems (e.g., iris recognition) or their high false rejection rate (e.g., fingerprint recognition) is neither compatible with the simple hardware architecture required by reduced-size devices nor the new trend of implementing smart objects within the dynamic market of the Internet of Things (IoT). It was recently shown that an individual can be recognized by extracting features from their electrocardiogram (ECG). However, most current ECG-based biometric algorithms are computationally demanding and/or rely on relatively large (several seconds) ECG samples, which are incompatible with the aforementioned application fields. Here, we present a computationally low-cost method (patent pending), including simple mathematical operations, for identifying a person using only three ECG morphology-based characteristics from a single heartbeat. The algorithm was trained/tested using ECG signals of different duration from the Physionet database on more than 60 different training/test datasets. The proposed method achieved maximal averaged accuracy of 97.450% in distinguishing each subject from a ten-subject set and false acceptance and rejection rates (FAR and FRR) of 5.710±1.900% and 3.440±1.980%, respectively, placing Beat-ID in a very competitive position in terms of the FRR/FAR among state-of-the-art methods. Furthermore, the proposed method can identify a person using an average of 1.020 heartbeats. It therefore has FRR/FAR behavior similar to obtaining a fingerprint, yet it is simpler and requires less expensive hardware. This method targets low-computational/energy-cost scenarios, such as tiny wearable devices (e.g., a
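
    The patented feature set is not public, but the overall shape of such a check, a per-subject template compared against a single beat's small feature vector with FAR/FRR estimated empirically, can be sketched on synthetic data. The three features, the Euclidean distance, and the threshold below are all assumptions, not the Beat-ID algorithm:

```python
# Sketch of a single-heartbeat identity check: a per-subject template is
# compared with a 3-feature beat vector, and FAR/FRR are estimated on
# synthetic trials. The features, distance, and threshold are assumptions
# for illustration; the actual (patent-pending) Beat-ID features differ.

import numpy as np

rng = np.random.default_rng(1)

def enroll(beats):
    """Template = mean of the subject's enrollment beats (3 features each)."""
    return beats.mean(axis=0)

def accept(template, beat, threshold=0.45):
    """Accept the identity claim if the beat lies close to the template."""
    return np.linalg.norm(beat - template) < threshold

genuine_mean  = np.array([1.00, 0.50, 0.20])   # hypothetical subject
impostor_mean = np.array([1.35, 0.75, 0.45])   # hypothetical other subject
template = enroll(rng.normal(genuine_mean, 0.15, size=(20, 3)))

genuine_trials  = rng.normal(genuine_mean,  0.15, size=(1000, 3))
impostor_trials = rng.normal(impostor_mean, 0.15, size=(1000, 3))

frr = np.mean([not accept(template, b) for b in genuine_trials])  # false rejections
far = np.mean([accept(template, b) for b in impostor_trials])     # false acceptances
print(f"FRR = {frr:.1%}, FAR = {far:.1%}")
```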

  14. A feasibility study on direct methanol fuel cells for laptop computers based on a cost comparison with lithium-ion batteries

    International Nuclear Information System (INIS)

    Wee, Jung-Ho

    2007-01-01

    This paper compares the total cost of direct methanol fuel cell (DMFC) and lithium (Li)-ion battery systems when applied as the power supply for laptop computers in the Korean environment. The average power output and operational time of the laptop computers were assumed to be 20 W and 3000 h, respectively. Considering the status of their technologies and with certain conditions assumed, the total costs were calculated to be US$140 for the Li-ion battery and US$362 for the DMFC. The manufacturing costs of the DMFC and Li-ion battery systems were calculated to be $16.65 W⁻¹ and $0.77 Wh⁻¹, and the energy consumption costs to be $0.00051 Wh⁻¹ and $0.00032 Wh⁻¹, respectively. The higher fuel consumption cost of the DMFC system was due to the methanol (MeOH) crossover loss. Therefore, the requirements for DMFCs to be able to compete with Li-ion batteries in terms of energy cost include reducing the crossover level to the order of 10⁻⁹ and the MeOH price to under $0.5 kg⁻¹. Under these conditions, if the DMFC manufacturing cost could be reduced to $6.30 W⁻¹, then the DMFC system would become at least as competitive as the Li-ion battery system for powering laptop computers in Korea. (author)
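
    The comparison reduces to total cost = manufacturing cost + lifetime energy cost. A back-of-envelope sketch using the abstract's power, lifetime, and unit-cost figures; the Li-ion pack sizing and replacement count are hypothetical, so the totals are indicative rather than a reproduction of the paper's $140 vs $362:

```python
POWER_W, HOURS = 20.0, 3000.0
energy_wh = POWER_W * HOURS  # 60,000 Wh delivered over the lifetime

def total_cost(manufacturing_usd, energy_cost_per_wh):
    """Lifetime cost = up-front manufacturing + energy consumed."""
    return manufacturing_usd + energy_cost_per_wh * energy_wh

# DMFC: manufacturing scales with rated power ($16.65/W from the abstract).
dmfc = total_cost(16.65 * POWER_W, 0.00051)
# Li-ion: manufacturing scales with pack capacity ($0.77/Wh from the
# abstract); a 60 Wh pack replaced once over the lifetime is assumed here.
liion = total_cost(0.77 * 60.0 * 2, 0.00032)

print(f"DMFC:   ${dmfc:7.2f}")   # ~ $363, close to the reported US$362
print(f"Li-ion: ${liion:7.2f}")  # pack sizing is hypothetical, cf. US$140
```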

  15. Early assessment of the likely cost-effectiveness of a new technology: A Markov model with probabilistic sensitivity analysis of computer-assisted total knee replacement.

    Science.gov (United States)

    Dong, Hengjin; Buxton, Martin

    2006-01-01

    The objective of this study is to apply a Markov model to compare the cost-effectiveness of total knee replacement (TKR) using computer-assisted surgery (CAS) with that of TKR using a conventional manual method in the absence of formal clinical trial evidence. A structured search was carried out to identify evidence relating to the clinical outcome, cost, and effectiveness of TKR. Nine Markov states were identified based on the progress of the disease after TKR. Effectiveness was expressed by quality-adjusted life years (QALYs). The simulation was carried out initially for 120 cycles of a month each, starting with 1,000 TKRs. A discount rate of 3.5 percent was used for both cost and effectiveness in the incremental cost-effectiveness analysis. Then, a probabilistic sensitivity analysis was carried out using a Monte Carlo approach with 10,000 iterations. Computer-assisted TKR was a long-term cost-effective technology, but the QALYs gained were small. After the first 2 years, computer-assisted TKR was dominant, as it was both cheaper and yielded more QALYs. The incremental cost-effectiveness ratio (ICER) was sensitive to the "effect of CAS," to the CAS extra cost, and to the utility of the state "Normal health after primary TKR," but it was not sensitive to the utilities of other Markov states. Both probabilistic and deterministic analyses produced similar cumulative serious or minor complication rates and complex or simple revision rates. They also produced similar ICERs. Compared with conventional TKR, computer-assisted TKR is a cost-saving technology in the long term and may offer small additional QALYs. The "effect of CAS" is to reduce revision rates and complications through more accurate and precise alignment, and although the conclusions from the model, even when allowing for a full probabilistic analysis of uncertainty, are clear, the "effect of CAS" on the rate of revisions awaits long-term clinical evidence.
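
    The mechanics of such a Markov cohort model, monthly cycles, state-dependent utilities, and discounting, are straightforward to sketch. The three states and all numbers below are illustrative stand-ins for the study's nine-state structure:

```python
# Minimal Markov cohort sketch in the spirit of the abstract: monthly
# cycles, state utilities, and discounting. The three states and all
# numbers are illustrative stand-ins for the study's nine-state model.

import numpy as np

# Monthly transition matrix (rows: from-state; rows sum to 1).
P = np.array([
    [0.996, 0.003, 0.001],   # normal health after primary TKR
    [0.900, 0.099, 0.001],   # revision / complication state
    [0.000, 0.000, 1.000],   # dead (absorbing)
])
utility = np.array([0.80, 0.55, 0.00]) / 12.0    # QALYs accrued per month
monthly_discount = (1 + 0.035) ** (1 / 12) - 1   # 3.5% annual rate

cohort = np.array([1000.0, 0.0, 0.0])            # 1,000 TKRs at cycle 0
qalys = 0.0
for cycle in range(120):                         # 120 monthly cycles
    qalys += (cohort @ utility) / (1 + monthly_discount) ** cycle
    cohort = cohort @ P                          # advance the cohort one month

print(f"Discounted QALYs accumulated by the cohort: {qalys:,.0f}")
```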

  16. Cost and resource utilization associated with use of computed tomography to evaluate chest pain in the emergency department: the Rule Out Myocardial Infarction using Computer Assisted Tomography (ROMICAT) study.

    Science.gov (United States)

    Hulten, Edward; Goehler, Alexander; Bittencourt, Marcio Sommer; Bamberg, Fabian; Schlett, Christopher L; Truong, Quynh A; Nichols, John; Nasir, Khurram; Rogers, Ian S; Gazelle, Scott G; Nagurney, John T; Hoffmann, Udo; Blankstein, Ron

    2013-09-01

    Coronary computed tomographic angiography (cCTA) allows rapid, noninvasive exclusion of obstructive coronary artery disease (CAD). However, concern exists whether implementation of cCTA in the assessment of patients presenting to the emergency department with acute chest pain will lead to increased downstream testing and costs compared with alternative strategies. Our aim was to compare observed actual costs of usual care (UC) with projected costs of a strategy including early cCTA in the evaluation of patients with acute chest pain in the Rule Out Myocardial Infarction Using Computer Assisted Tomography I (ROMICAT I) study. We compared cost and hospital length of stay of UC observed among 368 patients enrolled in the ROMICAT I study with projected costs of management based on cCTA. Costs of UC were determined by an electronic cost accounting system. Notably, UC was not influenced by cCTA results because patients and caregivers were blinded to the cCTA results. Costs after early implementation of cCTA were estimated assuming changes in management based on cCTA findings of the presence and severity of CAD. Sensitivity analysis was used to test the influence of key variables on both outcomes and costs. We determined that in comparison with UC, cCTA-guided triage, whereby patients with no CAD are discharged, could reduce total hospital costs by 23%, a statistically significant reduction. However, as the prevalence of CAD rises, cost increases such that when the prevalence of ≥50% stenosis exceeds 28% to 33%, the use of cCTA becomes more costly than UC. cCTA may be a cost-saving tool in acute chest pain populations with a low prevalence of potentially obstructive CAD; increased cost would be anticipated in populations with higher prevalence of disease.

  17. Costs associated with implementation of computer-assisted clinical decision support system for antenatal and delivery care: case study of Kassena-Nankana district of northern Ghana.

    Directory of Open Access Journals (Sweden)

    Maxwell Ayindenaba Dalaba

    Full Text Available This study analyzed cost of implementing computer-assisted Clinical Decision Support System (CDSS in selected health care centres in Ghana.A descriptive cross sectional study was conducted in the Kassena-Nankana district (KND. CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention were collected for the period between 2009-2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, trainings, overheads (recurrent costs and equipment costs (capital cost. We calculated cost without annualizing capital cost to represent financial cost and cost with annualizing capital costs to represent economic cost.Twenty-two trained CDSS users (at least 2 users per health centre participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64% and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death. The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272 was pre-intervention cost and intervention cost was 52% (US$12,044. Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917. When economic cost was considered, total cost of implementation was US$17,128-lower than the financial cost by 26.5%.The study provides useful information in the implementation of CDSS at health facilities to enhance health workers' adherence to practice guidelines and taking accurate decisions to

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  20. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not only by using opportunistic resources like the San Diego Supercomputer Center, which was accessible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. (Figure 3: number of events per month (data).) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  3. Straightening the Hierarchical Staircase for Basis Set Extrapolations: A Low-Cost Approach to High-Accuracy Computational Chemistry

    Science.gov (United States)

    Varandas, António J. C.

    2018-04-01

    Because the one-electron basis set limit is difficult to reach in correlated post-Hartree-Fock ab initio calculations, the low-cost route of using methods that extrapolate to the estimated basis set limit attracts immediate interest. The situation is somewhat more satisfactory at the Hartree-Fock level because numerical calculation of the energy is often affordable at nearly converged basis set levels. Still, extrapolation schemes for the Hartree-Fock energy are addressed here, although the focus is on the more slowly convergent and computationally demanding correlation energy. Because they are frequently based on the gold-standard coupled-cluster theory with single, double, and perturbative triple excitations [CCSD(T)], correlated calculations are often affordable only with the smallest basis sets, and hence single-level extrapolations from one raw energy could attain maximum usefulness. This possibility is examined. Whenever possible, this review uses raw data from second-order Møller-Plesset perturbation theory, as well as CCSD, CCSD(T), and multireference configuration interaction methods. Inescapably, the emphasis is on work done by the author's research group. Certain issues in need of further research or review are pinpointed.
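
    One widely used member of this family of schemes is the two-point inverse-cube extrapolation of the correlation energy, E(X) = E_CBS + A/X³, where X is the basis-set cardinal number. A sketch with made-up energies (the review discusses several variants and calibrations; this is only the textbook form, not necessarily the author's preferred protocol):

```python
# Two-point inverse-cube extrapolation of the correlation energy,
# E(X) = E_CBS + A / X**3, with X the cardinal number of the basis.
# The energies below are invented for demonstration; the review covers
# refinements of and alternatives to this textbook form.

def cbs_two_point(e_x, e_y, x, y):
    """Extrapolate correlation energies at cardinal numbers x < y to the CBS limit."""
    # Solving e_x = E + A/x**3 and e_y = E + A/y**3 for E:
    return (y**3 * e_y - x**3 * e_x) / (y**3 - x**3)

e_tz, e_qz = -0.27512, -0.28021   # illustrative cc-pVTZ (X=3), cc-pVQZ (X=4) values
print(f"E_corr(CBS) ~ {cbs_two_point(e_tz, e_qz, 3, 4):.5f} hartree")
```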

  4. Cost-Effectiveness Analysis (CEA) of Intravenous Urography (IVU) and Unenhanced Multidetector Computed Tomography (MDCT) for Initial Investigation of Suspected Acute Ureterolithiasis

    International Nuclear Information System (INIS)

    Eikefjord, E.; Askildsen, J.E.; Roervik, J.

    2008-01-01

    Background: It is important to compare the cost and effectiveness of multidetector computed tomography (MDCT) and intravenous urography (IVU) to determine the most cost-effective alternative for the initial investigation of acute ureterolithiasis. Purpose: To analyze the task-specific variable costs combined with the diagnostic effect of MDCT and IVU for patients with acute flank pain, and to determine which is most cost effective. Material and Methods: 119 patients with acute flank pain suggestive of stone disease (ureterolithiasis) were examined by both MDCT and IVU. Variable costs related to medical equipment, consumption material, equipment control, and personnel were calculated. The diagnostic effect was assessed. Results: The variable costs of MDCT versus IVU were EUR 32 and EUR 117, respectively. This significant difference was mainly due to savings in examination time, higher annual examination frequency, lower material costs, and no use of contrast media. As for diagnostic effect, MDCT proved considerably more accurate in the diagnosis of stone disease than IVU and markedly more accurate concerning differential diagnoses. Conclusion: MDCT had lower differential costs and a higher capacity to determine correctly stone disease and differential diagnoses, as compared to IVU, in patients with acute flank pain. Consequently, MDCT is a dominant alternative to IVU when evaluated exclusively from a cost-effective perspective

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  7. The cost-effectiveness of the RSI QuickScan intervention programme for computer workers: Results of an economic evaluation alongside a randomised controlled trial.

    Science.gov (United States)

    Speklé, Erwin M; Heinrich, Judith; Hoozemans, Marco J M; Blatter, Birgitte M; van der Beek, Allard J; van Dieën, Jaap H; van Tulder, Maurits W

    2010-11-11

    The costs of arm, shoulder and neck symptoms are high. In order to decrease these costs employers implement interventions aimed at reducing these symptoms. One frequently used intervention is the RSI QuickScan intervention programme. It establishes a risk profile of the target population and subsequently advises interventions following a decision tree based on that risk profile. The purpose of this study was to perform an economic evaluation, from both the societal and companies' perspectives, of the RSI QuickScan intervention programme for computer workers. In this study, effectiveness was defined at three levels: exposure to risk factors, prevalence of arm, shoulder and neck symptoms, and days of sick leave. The economic evaluation was conducted alongside a randomised controlled trial (RCT). Participating computer workers from 7 companies (N = 638) were assigned to either the intervention group (N = 320) or the usual care group (N = 318) by means of cluster randomisation (N = 50). The intervention consisted of a tailor-made programme, based on a previously established risk profile. At baseline, 6 and 12 month follow-up, the participants completed the RSI QuickScan questionnaire. Analyses to estimate the effect of the intervention were done according to the intention-to-treat principle. To compare costs between groups, confidence intervals for cost differences were computed by bias-corrected and accelerated bootstrapping. The mean intervention costs, paid by the employer, were 59 euro per participant in the intervention group and 28 euro in the usual care group. Mean total health care and non-health care costs per participant were 108 euro in both groups. As to cost-effectiveness, improvements in received information on healthy computer use as well as in work posture and movement were observed, but at higher costs. With regard to the other risk factors, symptoms and sick leave, only small and non-significant effects were found. In this study, the RSI Quick
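
    The cost comparison above rests on bootstrapped confidence intervals. A sketch of the plain percentile bootstrap on synthetic, skewed cost data; the study's bias-corrected and accelerated (BCa) adjustments are omitted for brevity:

```python
# Percentile bootstrap CI for a between-group mean cost difference, the
# core idea behind the BCa bootstrapping named in the abstract (the bias
# and acceleration corrections are omitted for brevity). Data are synthetic
# but mimic the group sizes; cost data are typically right-skewed.

import numpy as np

rng = np.random.default_rng(42)
intervention = rng.lognormal(mean=4.6, sigma=0.8, size=320)
usual_care   = rng.lognormal(mean=4.7, sigma=0.8, size=318)

def boot_ci(a, b, n_boot=5000, alpha=0.05):
    """Percentile bootstrap CI for mean(a) - mean(b)."""
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        diffs[i] = rng.choice(a, a.size).mean() - rng.choice(b, b.size).mean()
    return np.percentile(diffs, [100 * alpha / 2, 100 * (1 - alpha / 2)])

lo, hi = boot_ci(intervention, usual_care)
print(f"Mean cost difference: {intervention.mean() - usual_care.mean():.1f} euro")
print(f"95% bootstrap CI: ({lo:.1f}, {hi:.1f})")
```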

  8. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  15. Computer simulation of energy use, greenhouse gas emissions, and costs for alternative methods of processing fluid milk.

    Science.gov (United States)

    Tomasula, P M; Datta, N; Yee, W C F; McAloon, A J; Nutter, D W; Sampedro, F; Bonnaillie, L M

    2014-07-01

    Computer simulation is a useful tool for benchmarking electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature, short time (HTST) pasteurization was extended to include models for processes for shelf-stable milk and extended shelf-life milk that may help prevent the loss or waste of milk that leads to increases in the greenhouse gas (GHG) emissions for fluid milk. The models were for UHT processing, crossflow microfiltration (MF) without HTST pasteurization, crossflow MF followed by HTST pasteurization (MF/HTST), crossflow MF/HTST with partial homogenization, and pulsed electric field (PEF) processing, and were incorporated into the existing model for the fluid milk process. Simulation trials were conducted assuming a production rate for the plants of 113.6 million liters of milk per year to produce only whole milk (3.25%) and 40% cream. Results showed that GHG emissions in the form of process-related CO₂ emissions, defined as CO₂ equivalents (CO₂e)/kg of raw milk processed (RMP), and specific energy consumptions (SEC) for electricity and natural gas use for the HTST process alone were 37.6 g of CO₂e/kg of RMP, 0.14 MJ/kg of RMP, and 0.13 MJ/kg of RMP, respectively. Emissions of CO₂ and SEC for electricity and natural gas use were highest for the PEF process, with values of 99.1 g of CO₂e/kg of RMP, 0.44 MJ/kg of RMP, and 0.10 MJ/kg of RMP, respectively, and lowest for the UHT process at 31.4 g of CO₂e/kg of RMP, 0.10 MJ/kg of RMP, and 0.17 MJ/kg of RMP. Estimated unit production costs associated with the various processes were lowest for the HTST process and MF/HTST with partial homogenization at $0.507/L and highest for the UHT process at $0.60/L. The increase in shelf life associated with the UHT and MF processes may eliminate some of the supply chain product and consumer losses and waste of milk and compensate for the small increases in GHG
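
    The two benchmarking quantities above, SEC and process GHG per kg of raw milk processed, follow directly from annual utility use and emission factors. A sketch; the utility figures and emission factors are rough assumptions chosen only to illustrate the arithmetic, not the study's inventory data:

```python
# Sketch of the two benchmarking quantities above: specific energy
# consumption (SEC, MJ per kg of raw milk processed) and process GHG in
# g CO2e per kg. Utility figures and emission factors are rough assumptions
# chosen to illustrate the arithmetic, not the study's inventory data.

ANNUAL_MILK_KG = 113.6e6        # plant throughput from the abstract (1 L ~ 1 kg)

electricity_kwh = 4.4e6         # hypothetical annual electricity use
natural_gas_mj  = 15.0e6        # hypothetical annual natural gas use

EF_ELEC = 0.45                  # assumed grid factor, kg CO2e/kWh
EF_GAS  = 0.056                 # assumed combustion factor, kg CO2e/MJ

sec_elec = electricity_kwh * 3.6 / ANNUAL_MILK_KG     # kWh -> MJ, per kg RMP
sec_gas  = natural_gas_mj / ANNUAL_MILK_KG
ghg = (electricity_kwh * EF_ELEC + natural_gas_mj * EF_GAS) * 1000 / ANNUAL_MILK_KG

print(f"SEC electricity: {sec_elec:.2f} MJ/kg RMP")   # ~0.14 for these inputs
print(f"SEC natural gas: {sec_gas:.2f} MJ/kg RMP")    # ~0.13 for these inputs
print(f"Process GHG:     {ghg:.1f} g CO2e/kg RMP")
```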

  16. Is there any way out there? Environmental and cultural influences in computing least-cost paths with GIS

    Directory of Open Access Journals (Sweden)

    Fairén Jiménez, Sara

    2004-12-01

    Full Text Available One of the most interesting subjects in post-structural landscape studies is the analysis of the relationships between its natural and cultural components: the structuring of the landscape, with the identification of the social practices and patterns of movement that took shape around them. These patterns depend both on the natural form of the terrain and on socio-cultural decisions. In relation to the settlement pattern and distribution of rock art sites in the central-Mediterranean coastal area of Spain, a method is proposed to evaluate the role of cultural aspects of landscape in computing least-cost paths.

    Among recent studies within Landscape Archaeology, one of the aspects with the greatest interpretive potential is the analysis of the relationship between its different natural and cultural components: the articulation of the landscape, with the identification of the social practices that would have taken place around these elements and of the patterns of movement between them. This movement would depend both on the natural characteristics of the terrain and on practical decisions of a socio-cultural character. Based on the study of the distribution of Neolithic settlement and rock art shelters in the central-southern Valencian lands, a method is proposed for introducing and assessing the role of the cultural components of the landscape in the computation of least-cost paths using Geographic Information Systems.
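
    The computational core of such an analysis is a least-cost path over a raster whose per-cell cost blends terrain with a weighted cultural layer. A minimal sketch using Dijkstra's algorithm on a toy grid; real analyses would use GIS cost-distance tools, and the grid, weights and costs here are invented:

```python
# Least-cost path over a combined cost raster: per-cell cost = terrain cost
# plus a weighted "cultural" term (avoidance or attraction). All values are
# illustrative; step costs stay positive, so Dijkstra applies.

import heapq

terrain = [[1, 1, 2, 4, 4],
           [1, 2, 3, 4, 4],
           [1, 1, 2, 3, 3],
           [2, 1, 1, 2, 3],
           [3, 2, 1, 1, 1]]
cultural = [[0, 0, 0, 1, 1],        # e.g., avoidance of certain zones
            [0, 0, 0, 1, 1],
            [0, 0, 0, 0, 0],
            [0, -0.5, -0.5, 0, 0],  # attraction toward route markers
            [0, 0, -0.5, 0, 0]]
W = 1.0                             # weight given to the cultural layer

def least_cost(start, goal):
    """Dijkstra over the combined cost raster (4-connected cells)."""
    n = len(terrain)
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist[(r, c)]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < n and 0 <= cc < n:
                nd = d + terrain[rr][cc] + W * cultural[rr][cc]
                if nd < dist.get((rr, cc), float("inf")):
                    dist[(rr, cc)] = nd
                    heapq.heappush(pq, (nd, (rr, cc)))
    return float("inf")

print("Accumulated cost:", least_cost((0, 0), (4, 4)))
```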

  17. [Changing the internal cost allocation (ICA) to DRG shares: example of computed tomography in a university radiology setting].

    Science.gov (United States)

    Wirth, K; Zielinski, P; Trinter, T; Stahl, R; Mück, F; Reiser, M; Wirth, S

    2016-08-01

    In hospitals, the radiological services provided to non-privately insured in-house patients are mostly distributed to the requesting disciplines through internal cost allocation (ICA). In many institutions, computed tomography (CT) is the modality with the largest amount of allocation credits. The aim of this work is to compare the ICA with the respective DRG (Diagnosis Related Groups) shares for diagnostic CT services in a university hospital setting. The data from four CT scanners in a large university hospital were processed for the 2012 fiscal year. For each of the 50 DRG groups with the most case-mix points, all diagnostic CT services were documented, including their respective amount of GOÄ allocation credits and the invoiced ICA value. As the German Institute for Reimbursement of Hospitals (InEK) database groups the radiation disciplines (radiology, nuclear medicine and radiation therapy) together and also lacks any modality differentiation, the determination of the diagnostic CT component was based on the existing institutional distribution of ICA allocations. Within the included 24,854 cases, 63,062,060 GOÄ-based performance credits were counted. The ICA credited these diagnostic CT services with € 819,029 (single credit value of 1.30 Eurocent), whereas accounting by DRG shares would have yielded € 1,127,591 (single credit value of 1.79 Eurocent). The GOÄ single credit value is 5.62 Eurocent. Diagnostic CT services were thus remunerated at a comparatively low rate. In addition to a better financial result, changing the current ICA to DRG shares might also mean a chance for real revenues. However, the attractiveness considerably depends on how the DRG shares are distributed among the different radiation disciplines of an institution.
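
    The financial comparison in the abstract reduces to multiplying the annual credit count by the value of a single credit under each scheme; the snippet below reproduces the reported magnitudes (small rounding differences aside):

```python
# The abstract's comparison reduces to one multiplication per scheme:
# total credits times the value of a single credit. This reproduces the
# reported magnitudes (small rounding differences aside).

credits = 63_062_060    # GOÄ-based performance credits counted in 2012

for scheme, eurocent_per_credit in [("ICA", 1.30), ("DRG shares", 1.79), ("GOÄ", 5.62)]:
    value_eur = credits * eurocent_per_credit / 100
    print(f"{scheme:>10}: {value_eur:>12,.0f} EUR at {eurocent_per_credit} Eurocent/credit")
```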

  18. Planning for greater-confinement disposal

    International Nuclear Information System (INIS)

    Gilbert, T.L.; Luner, C.; Meshkov, N.K.; Trevorrow, L.E.; Yu, C.

    1984-01-01

    This contribution is a progress report for preparation of a document that will summarize procedures and technical information needed to plan for and implement greater-confinement disposal (GCD) of low-level radioactive waste. Selection of a site and a facility design (Phase I), and construction, operation, and extended care (Phase II) will be covered in the document. This progress report is limited to Phase I. Phase I includes determination of the need for GCD, design alternatives, and selection of a site and facility design. Alternative designs considered are augered shafts, deep trenches, engineered structures, high-integrity containers, hydrofracture, and improved waste form. Design considerations and specifications, performance elements, cost elements, and comparative advantages and disadvantages of the different designs are covered. Procedures are discussed for establishing overall performance objectives and waste-acceptance criteria, and for comparative assessment of the performance and cost of the different alternatives. 16 references

  19. Planning for greater-confinement disposal

    International Nuclear Information System (INIS)

    Gilbert, T.L.; Luner, C.; Meshkov, N.K.; Trevorrow, L.E.; Yu, C.

    1984-01-01

    This contribution is a progress report for preparation of a document that will summarize procedures and technical information needed to plan for and implement greater-confinement disposal (GCD) of low-level radioactive waste. Selection of a site and a facility design (Phase I), and construction, operation, and extended care (Phase II) will be covered in the document. This progress report is limited to Phase I. Phase I includes determination of the need for GCD, design alternatives, and selection of a site and facility design. Alternative designs considered are augered shafts, deep trenches, engineered structures, high-integrity containers, hydrofracture, and improved waste form. Design considerations and specifications, performance elements, cost elements, and comparative advantages and disadvantages of the different designs are covered. Procedures are discussed for establishing overall performance objectives and waste-acceptance criteria, and for comparative assessment of the performance and cost of the different alternatives. 16 refs

  20. Greater trochanteric fracture with occult intertrochanteric extension.

    Science.gov (United States)

    Reiter, Michael; O'Brien, Seth D; Bui-Mansfield, Liem T; Alderete, Joseph

    2013-10-01

    Proximal femoral fractures are frequently encountered in the emergency department (ED). Prompt diagnosis is paramount as delay will exacerbate the already poor outcomes associated with these injuries. In cases where radiography is negative but clinical suspicion remains high, magnetic resonance imaging (MRI) is the study of choice as it has the capability to depict fractures which are occult on other imaging modalities. Awareness of a particular subset of proximal femoral fractures, namely greater trochanteric fractures, is vital for both radiologists and clinicians since it has been well documented that they invariably have an intertrochanteric component which may require surgical management. The detection of intertrochanteric or cervical extension of greater trochanteric fractures has been described utilizing MRI but is underestimated with both computed tomography (CT) and bone scan. Therefore, if MRI is unavailable or contraindicated, the diagnosis of an isolated greater trochanteric fracture should be met with caution. The importance of avoiding this potential pitfall is demonstrated in the following case of an elderly woman with hip pain and CT demonstrating an isolated greater trochanteric fracture who subsequently returned to the ED with a displaced intertrochanteric fracture.

  1. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  2. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences. Operations Office (Figure 6: transfers from all sites in the last 90 days.) For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  4. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  5. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites. (Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks at close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months.) The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office Figure 2: Number of events per month for 2012. Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign, with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in the data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  8. Suspected acute pulmonary emboli: cost-effectiveness of chest helical computed tomography versus a standard diagnostic algorithm incorporating ventilation-perfusion scintigraphy

    International Nuclear Information System (INIS)

    Larcos, G.; Chi, K.K.G.; Berry, G.; Westmead Hospital, Sydney, NSW; Shiell, A.

    2000-01-01

    There is controversy regarding the investigation of patients with suspected acute pulmonary embolism (PE). To compare the cost-effectiveness of alternative methods of diagnosing acute PE, chest helical computed tomography (CT), alone and in combination with venous ultrasound (US) of the legs and pulmonary angiography (PA), was compared to a conventional algorithm using ventilation-perfusion (V/Q) scintigraphy supplemented in selected cases by US and PA. A decision-analytical model was constructed to model the costs and effects of the three diagnostic strategies in a hypothetical cohort of 1000 patients each. Transition probabilities were based on published data. Life-years gained by each strategy were estimated from published mortality rates. Schedule fees were used to estimate costs. The V/Q protocol is both more expensive and more effective than CT alone, resulting in 20.1 additional lives saved at a (discounted) cost of $940 per life-year gained. An additional 2.5 lives can be saved if CT replaces V/Q scintigraphy in the diagnostic algorithm, but at a cost of $23,905 per life-year saved. Overall, the more effective diagnostic strategies were also more expensive. In patients with suspected PE, the incremental cost-effectiveness of the V/Q-based strategy over CT alone is reasonable in comparison with other health interventions. The cost-effectiveness of the supplemented CT strategy is more questionable. Copyright (2000) The Australasian College of Physicians
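
    As a sketch of the arithmetic behind such decision-analytic comparisons, the incremental cost-effectiveness ratio (ICER) divides the extra cost of a strategy by the extra effect it buys. The cohort totals below are hypothetical placeholders chosen only to reproduce the order of magnitude of the reported $940 per life-year figure; they are not taken from the paper.

        # ICER of one diagnostic strategy over another (sketch; cohort
        # totals below are hypothetical, not from the study).
        def icer(cost_a, effect_a, cost_b, effect_b):
            """Incremental cost per unit of effect of strategy B over A."""
            return (cost_b - cost_a) / (effect_b - effect_a)

        ct_cost, ct_ly = 450_000.0, 9_000.0    # CT-alone strategy (hypothetical)
        vq_cost, vq_ly = 470_000.0, 9_021.3    # V/Q-based strategy (hypothetical)
        print(f"ICER: ${icer(ct_cost, ct_ly, vq_cost, vq_ly):,.0f} per life-year gained")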

  9. Positron emission tomography/computed tomography surveillance in patients with Hodgkin lymphoma in first remission has a low positive predictive value and high costs.

    Science.gov (United States)

    El-Galaly, Tarec Christoffer; Mylam, Karen Juul; Brown, Peter; Specht, Lena; Christiansen, Ilse; Munksgaard, Lars; Johnsen, Hans Erik; Loft, Annika; Bukh, Anne; Iyer, Victor; Nielsen, Anne Lerberg; Hutchings, Martin

    2012-06-01

    The value of performing post-therapy routine surveillance imaging in patients with Hodgkin lymphoma is controversial. This study evaluates the utility of positron emission tomography/computed tomography using 2-[18F]fluoro-2-deoxyglucose for this purpose and in situations with suspected lymphoma relapse. We conducted a multicenter retrospective study. Patients with newly diagnosed Hodgkin lymphoma achieving at least a partial remission on first-line therapy were eligible if they received positron emission tomography/computed tomography surveillance during follow-up. Two types of imaging surveillance were analyzed: "routine" when patients showed no signs of relapse at referral to positron emission tomography/computed tomography, and "clinically indicated" when recurrence was suspected. A total of 211 routine and 88 clinically indicated positron emission tomography/computed tomography studies were performed in 161 patients. In ten of 22 patients with recurrence of Hodgkin lymphoma, routine imaging surveillance was the primary tool for the diagnosis of the relapse. Extranodal disease, interim positron emission tomography-positive lesions and positron emission tomography activity at response evaluation were all associated with a positron emission tomography/computed tomography-diagnosed preclinical relapse. The true positive rates of routine and clinically indicated imaging were 5% and 13%, respectively (P = 0.02). The overall positive predictive value and negative predictive value of positron emission tomography/computed tomography were 28% and 100%, respectively. The estimated cost per routine imaging diagnosed relapse was US$ 50,778. Negative positron emission tomography/computed tomography reliably rules out a relapse. The high false positive rate is, however, an important limitation and a confirmatory biopsy is mandatory for the diagnosis of a relapse. With no proven survival benefit for patients with a pre-clinically diagnosed relapse, the high costs and low
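
    The headline numbers here follow directly from contingency-table arithmetic: positive predictive value is true positives over all test positives, and the cost per detected relapse is total imaging spend over relapses found. A minimal sketch; the per-scan cost is a back-calculated assumption derived from the reported US$ 50,778 figure, not a value stated in the paper.

        # Predictive values and cost per routine-imaging-diagnosed relapse.
        def ppv(tp, fp):
            return tp / (tp + fp)      # positive predictive value

        def npv(tn, fn):
            return tn / (tn + fn)      # negative predictive value

        routine_scans = 211            # routine studies, from the abstract
        relapses_found = 10            # relapses first seen on routine imaging
        cost_per_scan = 2_406.5        # assumed per-scan cost (USD), back-calculated
        print(routine_scans * cost_per_scan / relapses_found)   # ~50,778 USD per relapse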

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation, with its components, is now deployed at CERN, in addition to the GlideInWMS factory in the US. There is new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  11. Development of a Computer Program (CASK) for the Analysis of Logistics and Transportation Cost of the Spent Fuels

    International Nuclear Information System (INIS)

    Cha, Jeong-Hun; Choi, Heui-Joo; Cho, Dong-Keun; Kim, Seong-Ki; Lee, Jong-Youl; Choi, Jong-Won

    2008-07-01

    The cost of spent fuel management includes the costs of interim storage, transportation, and permanent disposal of the spent fuels. The CASK (Cost and logistics Analysis program for Spent fuel transportation in Korea) program was developed to analyze the logistics and transportation cost of the spent fuels. The program considers the total amount of PWR spent fuel stored at four nuclear plant sites, a centralized interim storage facility near the coast, and a permanent disposal facility near the interim storage facility. The CASK program was developed using the Visual Basic language and is coupled with an Excel sheet. The Excel sheet shows changes in the logistics and transportation cost, and the transportation unit cost is easily changed in the sheet. The scope of this report covers the explanation of the parameters in the CASK program and a preliminary calculation. CASK version 1.0 has been developed so far; its transportation cost parameters and transportation scenarios will be updated, and the program will be incorporated into the program used for the projection of spent fuel arisings from the nuclear power plants. Finally, it is expected that the CASK program could become part of the cost estimation tools under development at KAERI, and a very useful tool for establishing transportation scenarios and transportation costs under Korean conditions.

  12. A Proposed Model for Improving Performance and Reducing Costs of IT Through Cloud Computing of Egyptian Business Enterprises

    OpenAIRE

    Mohamed M.El Hadi; Azza Monir Ismail

    2016-01-01

    Information technologies affect today's big business enterprises, from data processing to transactions, helping them achieve their goals efficiently and effectively; this creates new business opportunities and new competitive advantages, and services must be sufficient to match recent IT trends such as cloud computing. Cloud computing technology can provide all IT services. Therefore, cloud computing offers an adaptable alternative to the current technology model, creating reduci...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  14. Costs and clinical outcomes in individuals without known coronary artery disease undergoing coronary computed tomographic angiography from an analysis of Medicare category III transaction codes.

    Science.gov (United States)

    Min, James K; Shaw, Leslee J; Berman, Daniel S; Gilmore, Amanda; Kang, Ning

    2008-09-15

    Multidetector coronary computed tomographic angiography (CCTA) demonstrates high accuracy for the detection and exclusion of coronary artery disease (CAD) and predicts adverse prognosis. To date, opportunity costs relating the clinical and economic outcomes of CCTA compared with other methods of diagnosing CAD, such as myocardial perfusion single-photon emission computed tomography (SPECT), remain unknown. An observational, multicenter, patient-level analysis of patients without known CAD who underwent CCTA or SPECT was performed. Patients who underwent CCTA (n = 1,938) were matched to those who underwent SPECT (n = 7,752) on 8 demographic and clinical characteristics and 2 summary measures of cardiac medications and co-morbidities and were evaluated for 9-month expenditures and clinical outcomes. Adjusted total health care and CAD expenditures were 27% lower in the CCTA group, suggesting that CCTA is a cost-efficient alternative to SPECT for the initial coronary evaluation of patients without known CAD.

  15. An Analysis of the RCA Price-S Cost Estimation Model as it Relates to Current Air Force Computer Software Acquisition and Management.

    Science.gov (United States)

    1979-12-01

    because of the use of complex computational algorithms (Ref 25). Another important factor affecting the cost of software is the size of the development...involved the alignment and navigational algorithm portions of the software. The second avionics system application was the development of an inertial...

  16. Computer simulation to predict energy use, greenhouse gas emissions and costs for production of fluid milk using alternative processing methods

    Science.gov (United States)

    Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...

  17. [Autoerotic fatalities in Greater Dusseldorf].

    Science.gov (United States)

    Hartung, Benno; Hellen, Florence; Borchard, Nora; Huckenbeck, Wolfgang

    2011-01-01

    Autoerotic fatalities in the Greater Dusseldorf area correspond to the relevant medicolegal literature. Our results included exclusively young to middle-aged, usually single men who were found dead in their city apartments. Clothing and devices used showed a great variety. Women's or fetish clothing and complex shackling or hanging devices were disproportionately frequent. In most cases, death occurred due to hanging or ligature strangulation. There was no increased incidence of underlying psychiatric disorders. In most of the deceased no or at least no remarkable alcohol intoxication was found. Occasionally, it may be difficult to reliably differentiate autoerotic accidents, accidents occurring in connection with practices of bondage & discipline, dominance & submission (BDSM) from natural death, suicide or homicide.

  18. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics.

    Science.gov (United States)

    Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo

    2016-01-01

    The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART, which requires the use of reflective markers placed on the subject's body. Three practical working activities involving object lifting and displacement have been investigated. The operational risk has been evaluated according to the lifting equation proposed by the American National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement with this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising for promoting the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER'S SUMMARY: The study is motivated by the increasing interest in on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.
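
    The NIOSH lifting equation referenced above combines a load constant with a product of task multipliers; in its common metric form, RWL = 23 kg x HM x VM x DM x AM x FM x CM, and the lifting index is the actual load divided by the RWL. A minimal sketch under those standard definitions (the frequency and coupling multipliers FM and CM are table lookups, passed in here as plain numbers; the task geometry below is invented):

        def niosh_rwl(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
            """Recommended weight limit (kg), metric form of the revised NIOSH equation.

            h: horizontal hand distance; v: vertical hand height; d: vertical
            travel; a: asymmetry angle. fm and cm come from published tables.
            """
            lc = 23.0                               # load constant, kg
            hm = min(1.0, 25.0 / h_cm)              # horizontal multiplier
            vm = 1.0 - 0.003 * abs(v_cm - 75.0)     # vertical multiplier
            dm = 0.82 + 4.5 / d_cm                  # distance multiplier
            am = 1.0 - 0.0032 * a_deg               # asymmetry multiplier
            return lc * hm * vm * dm * am * fm * cm

        load_kg = 12.0                              # hypothetical lifted load
        rwl = niosh_rwl(h_cm=40, v_cm=50, d_cm=60, a_deg=30)
        print(f"lifting index = {load_kg / rwl:.2f}")   # > 1 indicates elevated risk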

  19. Product Costing in FMT: Comparing Deterministic and Stochastic Models Using Computer-Based Simulation for an Actual Case Study

    DEFF Research Database (Denmark)

    Nielsen, Steen

    2000-01-01

    This paper expands the traditional product costing technique by including a stochastic form in a complex production process for product costing. The stochastic phenomena in flexible manufacturing technologies are seen as an important issue that companies try to decrease or eliminate. DFM has been used for evaluating the appropriateness of the firm's production capability. In this paper a simulation model is developed to analyze the relevant cost behaviour with respect to DFM and to develop a more streamlined process in the layout of the manufacturing process.
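
    A stochastic product-costing model of the kind described can be sketched as a small Monte Carlo simulation: unit cost is driven by a random processing time, and a nonlinearity (here, a hypothetical rework charge on overruns) makes the expected cost differ from the deterministic point estimate. All rates and distributions below are invented for illustration:

        import random

        MACHINE_RATE = 80.0   # cost per machine hour (hypothetical)
        MATERIAL = 14.0       # material cost per unit (hypothetical)

        def unit_cost(t_hours):
            rework = 25.0 if t_hours > 0.65 else 0.0   # overruns trigger rework cost
            return MATERIAL + MACHINE_RATE * t_hours + rework

        deterministic = unit_cost(0.5)   # single-point estimate at the mean time
        samples = [unit_cost(random.gauss(0.5, 0.08)) for _ in range(10_000)]
        print(deterministic, sum(samples) / len(samples))   # stochastic mean is higher

    The rework branch is what makes the difference: with a purely linear cost model the expected cost equals the deterministic one, so the stochastic treatment only matters when cost responds nonlinearly to variation.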

  20. Doing Very Big Calculations on Modest Size Computers: Reducing the Cost of Exact Diagonalization Using Singular Value Decomposition

    International Nuclear Information System (INIS)

    Weinstein, M.

    2012-01-01

    I will talk about a new way of implementing Lanczos and contraction algorithms to diagonalize lattice Hamiltonians that dramatically reduces the memory required to do the computation, without restricting to variational ansatzes. (author)
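
    For reference, a textbook Lanczos iteration needs only a matrix-vector product and two working vectors: it builds a small tridiagonal matrix whose extremal eigenvalues approximate those of the full Hamiltonian. The sketch below is the generic algorithm, not the paper's SVD-based memory reduction:

        import numpy as np

        def lanczos_ground_state(matvec, dim, steps=50, seed=0):
            """Lowest Ritz value of a Hermitian operator given via matvec."""
            rng = np.random.default_rng(seed)
            v = rng.standard_normal(dim)
            v /= np.linalg.norm(v)
            v_prev = np.zeros(dim)
            alphas, betas = [], []
            beta = 0.0
            for _ in range(steps):
                w = matvec(v) - beta * v_prev      # only two vectors kept in memory
                alpha = v @ w
                w -= alpha * v
                beta = np.linalg.norm(w)
                alphas.append(alpha)
                if beta < 1e-12:                   # invariant subspace found
                    break
                betas.append(beta)
                v_prev, v = v, w / beta
            k = len(alphas)
            T = (np.diag(alphas) + np.diag(betas[:k - 1], 1)
                 + np.diag(betas[:k - 1], -1))     # small tridiagonal matrix
            return np.linalg.eigvalsh(T)[0]

        # usage with any Hermitian H: lanczos_ground_state(lambda x: H @ x, H.shape[0])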

  1. Noise Threshold and Resource Cost of Fault-Tolerant Quantum Computing with Majorana Fermions in Hybrid Systems.

    Science.gov (United States)

    Li, Ying

    2016-09-16

    Fault-tolerant quantum computing in systems composed of both Majorana fermions and topologically unprotected quantum systems, e.g., superconducting circuits or quantum dots, is studied in this Letter. Errors caused by topologically unprotected quantum systems need to be corrected with error-correction schemes, for instance, the surface code. We find that the error-correction performance of such a hybrid topological quantum computer is not superior to a normal quantum computer unless the topological charge of Majorana fermions is insusceptible to noise. If errors changing the topological charge are rare, the fault-tolerance threshold is much higher than the threshold of a normal quantum computer and a surface-code logical qubit could be encoded in only tens of topological qubits instead of about 1,000 normal qubits.

  2. Greater confinement disposal of radioactive wastes

    International Nuclear Information System (INIS)

    Trevorrow, L.E.; Gilbert, T.L.; Luner, C.; Merry-Libby, P.A.; Meshkov, N.K.; Yu, C.

    1985-01-01

    Low-level radioactive waste (LLW) includes a broad spectrum of different radionuclide concentrations, half-lives, and hazards. Standard shallow-land burial practice can provide adequate protection of public health and safety for most LLW. A small volume fraction (approx. 1%) containing most of the activity inventory (approx. 90%) requires specific measures known as greater-confinement disposal (GCD). Different site characteristics and different waste characteristics - such as high radionuclide concentrations, long radionuclide half-lives, high radionuclide mobility, and physical or chemical characteristics that present exceptional hazards - lead to different GCD facility design requirements. Facility design alternatives considered for GCD include the augered shaft, deep trench, engineered structure, hydrofracture, improved waste form, and high-integrity container. Selection of an appropriate design must also consider the interplay between basic risk limits for protection of public health and safety, performance characteristics and objectives, costs, waste-acceptance criteria, waste characteristics, and site characteristics

  3. Waste management in Greater Vancouver

    Energy Technology Data Exchange (ETDEWEB)

    Carrusca, K. [Greater Vancouver Regional District, Burnaby, BC (Canada); Richter, R. [Montenay Inc., Vancouver, BC (Canada)]|[Veolia Environmental Services, Vancouver, BC (Canada)

    2006-07-01

    An outline of the Greater Vancouver Regional District (GVRD) waste-to-energy program was presented. The GVRD has an annual budget for solid waste management of $90 million. Energy recovery revenues from solid waste currently exceed $10 million. Over 1,660,000 tonnes of GVRD waste are recycled, and another 280,000 tonnes are converted from waste to energy. The GVRD waste-to-energy facility combines state-of-the-art combustion and air pollution control, and has processed over 5 million tonnes of municipal solid waste since it opened in 1988. Its central location minimizes haul distance, and it was originally sited to utilize steam through sales to a recycled-paper mill. The facility has won several awards, including the Solid Waste Association of North America award for best facility in 1990. The facility focuses on continual improvement, and has installed a carbon injection system, an ammonia injection system, a flyash stabilization system and heat capacity upgrades, in addition to conducting continuous waste composition studies. Continuous air emissions monitoring is also conducted at the plant, which produces a very small percentage of the total air emissions in metropolitan Vancouver. The GVRD is now seeking options for the management of a further 500,000 tonnes per year of solid waste, and has received 23 submissions covering a range of waste-to-energy technologies, which are now being evaluated. It was concluded that waste-to-energy plants can be located in densely populated metropolitan areas and provide a local disposal solution as well as a source of renewable energy. Other GVRD waste reduction policies were also reviewed. refs., tabs., figs.

  4. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    Directory of Open Access Journals (Sweden)

    R Scott Braithwaite

    2010-02-01

    BACKGROUND: Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. METHODS AND FINDINGS: We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate-value services ($100,000-$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost savings from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase. CONCLUSION: Broader diffusion of VBID may amplify benefits from

  5. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    Science.gov (United States)

    Braithwaite, R Scott; Omokaro, Cynthia; Justice, Amy C; Nucifora, Kimberly; Roberts, Mark S

    2010-02-16

    Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate-value services ($100,000-$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost savings from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase.

  6. The truncated conjugate gradient (TCG), a non-iterative/fixed-cost strategy for computing polarization in molecular dynamics: Fast evaluation of analytical forces

    Science.gov (United States)

    Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip

    2017-10-01

    In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists of truncating the conjugate gradient algorithm at a fixed, predetermined order, leading to a fixed computational cost; it can thus be considered "non-iterative." This makes it possible to derive analytical forces, avoiding the usual energy-conservation issues (i.e., drifts) that occur with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time-step scheme and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making TCG a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.
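
    The idea can be illustrated compactly: solve the polarization equations T mu = E by conjugate gradient, but stop after a fixed, predetermined number of iterations, so every evaluation has identical cost and the result is a closed-form (hence differentiable) function of the field. A toy NumPy sketch of that truncation, with a random symmetric positive definite matrix standing in for the real polarization matrix:

        import numpy as np

        def tcg(T, E, order=2):
            """Conjugate gradient truncated at a fixed order (TCG-n style sketch)."""
            mu = np.zeros_like(E)
            r = E.copy()                  # residual for the zero initial guess
            p = r.copy()
            for _ in range(order):        # fixed iteration count: fixed cost
                Tp = T @ p
                a = (r @ r) / (p @ Tp)
                mu = mu + a * p
                r_new = r - a * Tp
                p = r_new + ((r_new @ r_new) / (r @ r)) * p
                r = r_new
            return mu

        rng = np.random.default_rng(1)
        A = rng.standard_normal((6, 6))
        T = A @ A.T + 6 * np.eye(6)       # SPD stand-in for the dipole interaction matrix
        E = rng.standard_normal(6)
        print(np.linalg.norm(tcg(T, E, order=2) - np.linalg.solve(T, E)))  # truncation error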

  7. Design and implementation of a medium speed communications interface and protocol for a low cost, refreshed display computer

    Science.gov (United States)

    Phyne, J. R.; Nelson, M. D.

    1975-01-01

    The design and implementation of hardware and software systems involved in using a 40,000 bit/second communication line as the connecting link between an IMLAC PDS 1-D display computer and a Univac 1108 computer system were described. The IMLAC consists of two independent processors sharing a common memory. The display processor generates the deflection and beam control currents as it interprets a program contained in the memory; the minicomputer has a general instruction set and is responsible for starting and stopping the display processor and for communicating with the outside world through the keyboard, teletype, light pen, and communication line. The processing time associated with each data byte was minimized by designing the input and output processes as finite state machines which automatically sequence from each state to the next. Several tests of the communication link and the IMLAC software were made using a special low capacity computer grade cable between the IMLAC and the Univac.
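
    The finite-state-machine structure described for the byte-level I/O can be sketched as follows; the sync byte and fixed frame length are hypothetical, since the report's actual protocol details are not given here:

        # Byte-at-a-time receiver as a finite state machine: each received
        # byte drives exactly one transition, keeping per-byte work minimal.
        SYNC, DATA = range(2)

        def make_receiver(frame_len=4, sync_byte=0x16):
            state, buf = SYNC, []
            def feed(byte):
                nonlocal state, buf
                if state == SYNC:
                    if byte == sync_byte:            # wait for start of frame
                        state, buf = DATA, []
                elif state == DATA:
                    buf.append(byte)
                    if len(buf) == frame_len:        # frame complete: hand off, resync
                        state = SYNC
                        return bytes(buf)
                return None
            return feed

        rx = make_receiver()
        for b in [0x00, 0x16, 0x01, 0x02, 0x03, 0x04]:
            frame = rx(b)
            if frame:
                print(frame.hex())                   # -> 01020304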

  8. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Science.gov (United States)

    2010-01-01

    ... I Appendix I to Part 504 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING... parameters specified above are not obtainable, alternate parameters that closely correspond to those above... common equity, an alternate methodology to predict the firm's real after-tax marginal cost of capital may...
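
    One common way to compute a real after-tax marginal cost of capital, shown here only as a generic finance sketch (the regulation's own procedure may differ), is an after-tax nominal weighted average of debt and equity costs deflated by expected inflation via the Fisher relation:

        def real_after_tax_wacc(kd, ke, debt_frac, tax_rate, inflation):
            """Nominal after-tax WACC deflated to real terms (Fisher relation)."""
            nominal = debt_frac * kd * (1.0 - tax_rate) + (1.0 - debt_frac) * ke
            return (1.0 + nominal) / (1.0 + inflation) - 1.0

        # All inputs hypothetical: 8% debt, 12% equity, 40% leverage, 30% tax, 3% inflation.
        print(f"{real_after_tax_wacc(0.08, 0.12, 0.4, 0.3, 0.03):.4f}")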

  9. Behaviors of cost functions in image registration between 201Tl brain tumor single-photon emission computed tomography and magnetic resonance images

    International Nuclear Information System (INIS)

    Soma, Tsutomu; Takaki, Akihiro; Teraoka, Satomi; Ishikawa, Yasushi; Murase, Kenya; Koizumi, Kiyoshi

    2008-01-01

    We studied the behaviors of cost functions in the registration of thallium-201 (201Tl) brain tumor single-photon emission computed tomography (SPECT) and magnetic resonance (MR) images, as similarity indices for image positioning. A marker for image registration [a technetium-99m (99mTc) point source] was attached at three sites on the heads of 13 patients with brain tumor, from whom 42 sets of 99mTc-201Tl SPECT (dual-isotope acquisition) and MR images were obtained. The 201Tl SPECT and MR images were manually registered according to the markers. From the positions where the two images were registered, the 201Tl SPECT image was moved to examine the behaviors of three cost functions, i.e., ratio image uniformity (RIU), mutual information (MI), and normalized MI (NMI). The cost functions MI and NMI reached their maxima at positions adjacent to those where the SPECT and MR images were manually registered. In terms of the registration accuracy of the cost functions MI and NMI, on average, the images were accurately registered within 3 deg of rotation around the X-, Y-, and Z-axes, and within 1.5 mm (within 2 pixels), 3 mm (within 3 pixels), and 4 mm (within 1 slice) of translation along the X-, Y-, and Z-axes, respectively. In terms of rotation around the Z-axis, the cost function RIU reached its minimum at positions where the manual registration of the two images was substantially inadequate. MI and NMI were suitable cost functions for the registration of 201Tl SPECT and MR images. The behavior of the RIU, in contrast, was unstable, making it unsuitable as an index of image registration. (author)
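
    Both mutual information and its normalized variant are computed from the joint intensity histogram of the two images; registration seeks the pose that maximizes them. A minimal NumPy sketch of the two cost functions compared in this study (bin count is an arbitrary choice):

        import numpy as np

        def entropy(p):
            p = p[p > 0]
            return float(-np.sum(p * np.log(p)))

        def mutual_information(a, b, bins=32):
            """MI(a, b) = H(a) + H(b) - H(a, b), from the joint histogram."""
            joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
            pxy = joint / joint.sum()
            return entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy.ravel())

        def normalized_mi(a, b, bins=32):
            """NMI(a, b) = (H(a) + H(b)) / H(a, b)."""
            joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
            pxy = joint / joint.sum()
            return (entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0))) / entropy(pxy.ravel())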

  10. 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Accuracy in the Staging of Non-Small Cell Lung Cancer: Review and Cost-Effectiveness

    International Nuclear Information System (INIS)

    Gómez León, Nieves; Escalona, Sofía; Bandrés, Beatriz; Belda, Cristobal; Callejo, Daniel; Blasco, Juan Antonio

    2014-01-01

    Aim of the performed clinical study was to compare the accuracy and cost-effectiveness of PET/CT in the staging of non-small cell lung cancer (NSCLC). Material and Methods. Cross-sectional and prospective study including 103 patients with histologically confirmed NSCLC. All patients were examined using PET/CT with intravenous contrast medium. Those with disease stage ≤IIB underwent surgery (n = 40). Disease stage was confirmed based on histology results, which were compared with those of PET/CT and positron emission tomography (PET) and computed tomography (CT) separately. 63 patients classified with ≥IIIA disease stage by PET/CT did not undergo surgery. The cost-effectiveness of PET/CT for disease classification was examined using a decision tree analysis. Results. Compared with histology, the accuracy of PET/CT for disease staging has a positive predictive value of 80%, a negative predictive value of 95%, a sensitivity of 94%, and a specificity of 82%. For PET alone, these values are 53%, 66%, 60%, and 50%, whereas for CT alone they are 68%, 86%, 76%, and 72%, respectively. Incremental cost-effectiveness of PET/CT over CT alone was €17,412 per quality-adjusted life-year (QALY). Conclusion. In our clinical study, PET/CT using intravenous contrast medium was an accurate and cost-effective method for staging of patients with NSCLC

  11. 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Accuracy in the Staging of Non-Small Cell Lung Cancer: Review and Cost-Effectiveness

    International Nuclear Information System (INIS)

    Leon, N.G.; Bandrés, B.; Escalona, S.; Callejo, D.; Blasco, J.A.; Belda, C.

    2014-01-01

    Aim of the performed clinical study was to compare the accuracy and cost-effectiveness of PET/CT in the staging of non-small cell lung cancer (NSCLC). Material and Methods. Cross-sectional and prospective study including 103 patients with histologically confirmed NSCLC. All patients were examined using PET/CT with intravenous contrast medium. Those with disease stage ≤IIB underwent surgery (n = 40). Disease stage was confirmed based on histology results, which were compared with those of PET/CT and positron emission tomography (PET) and computed tomography (CT) separately. 63 patients classified with ≥IIIA disease stage by PET/CT did not undergo surgery. The cost-effectiveness of PET/CT for disease classification was examined using a decision tree analysis. Results. Compared with histology, the accuracy of PET/CT for disease staging has a positive predictive value of 80%, a negative predictive value of 95%, a sensitivity of 94%, and a specificity of 82%. For PET alone, these values are 53%, 66%, 60%, and 50%, whereas for CT alone they are 68%, 86%, 76%, and 72%, respectively. Incremental cost-effectiveness of PET/CT over CT alone was €17,412 per quality-adjusted life-year (QALY). Conclusion. In our clinical study, PET/CT using intravenous contrast medium was an accurate and cost-effective method for staging of patients with NSCLC

  12. Performance and Cost-Effectiveness of Computed Tomography Lung Cancer Screening Scenarios in a Population-Based Setting: A Microsimulation Modeling Analysis in Ontario, Canada.

    Directory of Open Access Journals (Sweden)

    Kevin Ten Haaf

    2017-02-01

    The National Lung Screening Trial (NLST) results indicate that computed tomography (CT) lung cancer screening for current and former smokers with three annual screens can be cost-effective in a trial setting. However, the cost-effectiveness in a population-based setting with more than three screening rounds is uncertain. Therefore, the objective of this study was to estimate the cost-effectiveness of lung cancer screening in a population-based setting in Ontario, Canada, and to evaluate the effects of screening eligibility criteria. This study used microsimulation modeling informed by various data sources, including the Ontario Health Insurance Plan (OHIP), the Ontario Cancer Registry, smoking behavior surveys, and the NLST. Persons born between 1940 and 1969 were examined from a third-party health care payer perspective across a lifetime horizon. Starting in 2015, 576 CT screening scenarios were examined, varying by age to start and end screening, smoking eligibility criteria, and screening interval. Among the examined outcome measures were lung cancer deaths averted, life-years gained, percentage ever screened, costs (in 2015 Canadian dollars), and overdiagnosis. The results of the base-case analysis indicated that annual screening was more cost-effective than biennial screening. Scenarios with eligibility criteria that required as few as 20 pack-years were dominated by scenarios that required higher numbers of accumulated pack-years. In general, scenarios that applied stringent smoking eligibility criteria (i.e., requiring higher levels of accumulated smoking exposure) were more cost-effective than scenarios with less stringent smoking eligibility criteria, with modest differences in life-years gained. Annual screening between ages 55-75 for persons who smoked ≥40 pack-years and who currently smoke or quit ≤10 years ago yielded an incremental cost-effectiveness ratio of $41,136 Canadian dollars ($33,825 in May 1, 2015, United States dollars) per life-year gained

  13. Accuracy of a Low-Cost Novel Computer-Vision Dynamic Movement Assessment: Potential Limitations and Future Directions

    Science.gov (United States)

    McGroarty, M.; Giblin, S.; Meldrum, D.; Wetterling, F.

    2016-04-01

    The aim of the study was to perform a preliminary validation of a low-cost marker-less motion capture system (CAPTURE) against an industry gold standard (Vicon). Measurements of knee valgus and flexion during the performance of a countermovement jump (CMJ) between CAPTURE and Vicon were compared. After correction algorithms were applied to the raw CAPTURE data, acceptable levels of accuracy and precision were achieved. The knee flexion angle measured for three trials using CAPTURE deviated by -3.8° ± 3° (left) and 1.7° ± 2.8° (right) compared to Vicon. The findings suggest that low-cost marker-less motion capture has the potential to provide an objective method for assessing lower-limb jump and landing mechanics in an applied sports setting. Furthermore, the outcome of the study warrants future research to examine more fully the potential implications of the use of low-cost marker-less motion capture in the evaluation of dynamic movement for injury prevention.

  14. Money for Research, Not for Energy Bills: Finding Energy and Cost Savings in High Performance Computer Facility Designs

    Energy Technology Data Exchange (ETDEWEB)

    Drewmark Communications; Sartor, Dale; Wilson, Mark

    2010-07-01

    High-performance computing facilities in the United States consume an enormous amount of electricity, cutting into research budgets and challenging public- and private-sector efforts to reduce energy consumption and meet environmental goals. However, these facilities can greatly reduce their energy demand through energy-efficient design of the facility itself. Using a case study of a facility under design, this article discusses strategies and technologies that can be used to help achieve energy reductions.

  15. Computing, Information, and Communications Technology (CICT) Program Overview

    Science.gov (United States)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies.

  16. Computed tomography versus intravenous urography in diagnosis of acute flank pain from urolithiasis: a randomized study comparing imaging costs and radiation dose

    International Nuclear Information System (INIS)

    Thomson, J.M.Z.; Maling, T.M.J.; Glocer, J.; Mark, S.; Abbott, C.

    2001-01-01

    The equivalent sensitivity of non-contrast computed tomography (NCCT) and intravenous urography (IVU) in the diagnosis of suspected ureteric colic has been established. Approximately 50% of patients with suspected ureteric colic do not have a nephro-urological cause for their pain. Because many such patients require further imaging studies, NCCT may obviate the need for these studies and, in so doing, be more cost-effective and involve less overall radiation exposure. The present study compares the total imaging cost and radiation dose of NCCT versus IVU in the diagnosis of acute flank pain. Two hundred and twenty-four patients (157 men; mean age 45 years; age range 19-79 years) with suspected renal colic were randomized either to NCCT or IVU. The number of additional diagnostic imaging studies, costs (IVU A$136; CTU A$173), radiation exposure and imaging times were compared. Of the 224 patients, 119 (53%) had renal obstruction and 105 had no nephro-urological cause of pain. For 21 (20%) of these patients an alternative diagnosis was made at the initial imaging, 10 of which were significant. Of 118 IVU patients, 28 (24%) required 32 additional imaging tests to reach a diagnosis, whereas seven of 106 (6%) NCCT patients required seven additional imaging studies. The average total diagnostic imaging cost was A$181.94 for the NCCT group and A$175.46 for the IVU group (P < 0.43). The mean radiation dose to diagnosis was 5.00 mSv (NCCT) versus 3.50 mSv (IVU) (P < 0.001). The mean imaging time was 30 min (NCCT) versus 75 min (IVU) (P < 0.001). Diagnostic imaging costs were remarkably similar. Although NCCT involves a higher radiation dose than IVU, its advantages of faster diagnosis, the avoidance of additional diagnostic imaging tests and its ability to diagnose other causes make it the study of choice for acute flank pain at Christchurch Hospital. Copyright (2001) Blackwell Science Pty Ltd

  17. Unit Cost Compendium Calculations

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Unit Cost Compendium (UCC) Calculations raw data set was designed to provide for greater accuracy and consistency in the use of unit costs across the USEPA...

  18. Cost-effective pediatric head and body phantoms for computed tomography dosimetry and its evaluation using pencil ion chamber and CT dose profiler

    Directory of Open Access Journals (Sweden)

    A Saravanakumar

    2015-01-01

    In the present work, a pediatric head and body phantom was fabricated using polymethyl methacrylate (PMMA) at a low cost compared to commercially available phantoms, for the purpose of computed tomography (CT) dosimetry. The dimensions of the head and body phantoms were 10 cm diameter, 15 cm length and 16 cm diameter, 15 cm length, respectively. The dose received by the head and body phantoms at the center and periphery from a 128-slice CT machine was measured using a 100 mm pencil ion chamber and a 150 mm CT dose profiler (CTDP). Using these values, the weighted computed tomography dose index (CTDIw) and in turn the volumetric CTDI (CTDIv) were calculated for various combinations of tube voltage and current-time product. A similar study was carried out using a standard calibrated phantom, and the results were compared with those of the fabricated phantoms to ascertain that the performance of the latter is equivalent to that of the former. Finally, the CTDIv measured using the fabricated and standard phantoms was compared with the respective values displayed on the console. The difference between the values was well within the limits specified by the Atomic Energy Regulatory Board (AERB), India. These results indicate that the cost-effective pediatric phantom can be employed for CT dosimetry.
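
    The dose indices named above follow the standard definitions CTDIw = (1/3) CTDIc + (2/3) CTDIp, combining the center and periphery measurements, and CTDIvol = CTDIw / pitch. A minimal sketch; only the measured inputs below are hypothetical:

        def ctdi_w(center_mgy, periphery_mgy):
            """Weighted CTDI from center and periphery phantom measurements (mGy)."""
            return center_mgy / 3.0 + 2.0 * periphery_mgy / 3.0

        def ctdi_vol(ctdiw_mgy, pitch):
            """Volumetric CTDI for a helical scan."""
            return ctdiw_mgy / pitch

        # Hypothetical phantom readings, not the study's data:
        print(ctdi_vol(ctdi_w(center_mgy=18.0, periphery_mgy=22.0), pitch=1.0))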

  19. Development and implementation of a low-cost phantom for quality control in cone beam computed tomography

    International Nuclear Information System (INIS)

    Batista, W. O.; Navarro, M. V. T.; Maia, A. F.

    2013-01-01

    A phantom for quality control in cone beam computed tomography (CBCT) scanners was designed and constructed, and a methodology for testing was developed. The phantom had a polymethyl methacrylate structure filled with water and plastic objects that allowed the assessment of parameters related to quality control. The phantom allowed the evaluation of essential parameters in CBCT as well as the evaluation of linear and angular dimensions. The plastics used in the phantom were chosen so that their density and linear attenuation coefficient were similar to those of human facial structures. Three types of CBCT equipment, with two different technological concepts, were evaluated. The results of the assessment of the accuracy of linear and angular dimensions agreed with the existing standards. However, other parameters such as computed tomography number accuracy, uniformity and high-contrast detail did not meet the tolerances established in current regulations or the manufacturer's specifications. The results demonstrate the importance of establishing specific protocols and phantoms, which meet the specificities of CBCT. The practicality of implementation, the quality control test results for the proposed phantom and the consistency of the results using different equipment demonstrate its adequacy. (authors)

  20. Development of a Computer Program for an Analysis of the Logistics and Transportation Costs of the PWR Spent Fuels in Korea

    International Nuclear Information System (INIS)

    Cha, Jeong Hun; Choi, Heui Joo; Lee, Jong Youl; Choi, Jong Won

    2009-01-01

    It is expected that a substantial amount of spent fuel will be transported from the four nuclear power plant (NPP) sites in Korea to a hypothetical centralized interim storage facility or a final repository in the near future. The cost of transportation is proportional to the amount of spent fuel. In this paper, a cost estimation program is developed based on the conceptual design of a transportation system and a logistics analysis. Using the developed computer program, named CASK, the minimum capacity of a centralized interim storage facility (CISF) and the transportation cost for PWR spent fuels are calculated. The PWR spent fuels are transported from the four NPP sites to a final repository (FR) via the CISF. Since the NPP sites and the CISF are located along the coast, sea transportation is considered between them, and road transportation is considered between the CISF and the FR. The result shows that the minimum capacity of the interim storage facility is 15,000 MTU
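
    The kind of logistics arithmetic such a program performs can be sketched simply: the number of shipments on each leg follows from the cask capacity, and the leg cost is shipments times a per-shipment unit cost. Capacities and unit costs below are hypothetical placeholders, not CASK inputs:

        import math

        def leg(fuel_mtu, cask_capacity_mtu, cost_per_shipment):
            """Shipments and cost for one transportation leg."""
            shipments = math.ceil(fuel_mtu / cask_capacity_mtu)
            return shipments, shipments * cost_per_shipment

        inventory = 20_000                                 # MTU to move (hypothetical)
        sea_n, sea_cost = leg(inventory, 10, 120_000)      # NPP sites -> CISF by sea
        road_n, road_cost = leg(inventory, 5, 45_000)      # CISF -> repository by road
        print(sea_n, road_n, sea_cost + road_cost)         # total transportation cost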

  1. Costs and role of ultrasound follow-up of polytrauma patients after initial computed tomography; Kosten und Stellenwert von Ultraschallverlaufskontrollen bei polytraumatisierten Patienten nach initialer Computertomografie

    Energy Technology Data Exchange (ETDEWEB)

    Maurer, M.H.; Winkler, A.; Powerski, M.J.; Elgeti, F.; Huppertz, A.; Roettgen, R.; Marnitz, T. [Charite - Universitaetsmedizin Berlin (Germany). Klinik fuer Diagnostische und Interventionelle Radiologie; Wichlas, F. [Charite - Universitaetsmedizin Berlin (Germany). Centrum fuer Muskuloskeletale Chirurgie

    2012-01-15

    Purpose: To assess the costs and diagnostic gain of abdominal ultrasound follow-up of polytrauma patients initially examined by whole-body computed tomography (CT). Materials and Methods: A total of 176 patients with suspected multiple trauma (126 men, 50 women; age 43.5 ± 17.4 years) were retrospectively analyzed with regard to supplementary and new findings obtained by ultrasound follow-up compared with the results of exploratory FAST (focused assessment with sonography for trauma) at admission and the findings of whole-body CT. A process model was used to document the staff, materials, and total costs of the ultrasound follow-up examinations. Results: FAST yielded 26 abdominal findings (organ injury and/or free intra-abdominal fluid) in 19 patients, while the abdominal scan of whole-body CT revealed 32 findings in 25 patients. FAST had 81 % sensitivity and 100 % specificity. Follow-up ultrasound examinations revealed new findings in 2 of the 25 patients with abdominal injuries detected with initial CT. In the 151 patients without abdominal injuries in the initial CT scan, ultrasound follow-up did not yield any supplementary or new findings. The total costs of an ultrasound follow-up examination were EUR 28.93. The total costs of all follow-up ultrasound examinations performed in the study population were EUR 5658.23. Conclusion: Follow-up abdominal ultrasound yields only a low overall diagnostic gain in polytrauma patients in whom initial CT fails to detect any abdominal injuries but incurs high personnel expenses for radiological departments. (orig.)

  2. Effectiveness and cost-effectiveness of computer and other electronic aids for smoking cessation: a systematic review and network meta-analysis.

    Science.gov (United States)

    Chen, Y-F; Madan, J; Welton, N; Yahaya, I; Aveyard, P; Bauld, L; Wang, D; Fry-Smith, A; Munafò, M R

    2012-01-01

    Smoking is harmful to health. On average, lifelong smokers lose 10 years of life, and about half of all lifelong smokers have their lives shortened by smoking. Stopping smoking reverses or prevents many of these harms. However, cessation services in the NHS achieve variable success rates with smokers who want to quit. Approaches to behaviour change can be supplemented with electronic aids, and this may significantly increase quit rates and prevent a proportion of cases that relapse. The primary research question we sought to answer was: What is the effectiveness and cost-effectiveness of internet, PC and other electronic aids to help people stop smoking? We addressed the following three questions: (1) What is the effectiveness of internet sites, computer programs, mobile telephone text messages and other electronic aids for smoking cessation and/or reducing relapse? (2) What is the cost-effectiveness of incorporating internet sites, computer programs, mobile telephone text messages and other electronic aids into current NHS smoking cessation programmes? and (3) What are the current gaps in research into the effectiveness of internet sites, computer programs, mobile telephone text messages and other electronic aids to help people stop smoking? For the effectiveness review, relevant primary studies were sought from The Cochrane Library [Cochrane Central Register of Controlled Trials (CENTRAL)] 2009, Issue 4, and MEDLINE (Ovid), EMBASE (Ovid), PsycINFO (Ovid), Health Management Information Consortium (HMIC) (Ovid) and Cumulative Index to Nursing and Allied Health Literature (CINAHL) (EBSCOhost) from 1980 to December 2009. In addition, NHS Economic Evaluation Database (NHS EED) and Database of Abstracts of Reviews of Effects (DARE) were searched for information on cost-effectiveness and modelling for the same period. Reference lists of included studies and of relevant systematic reviews were examined to identify further potentially relevant studies. Research registries

  3. Cost Analyses after a single intervention using a computer application (DIAGETHER in the treatment of diabetic patients admitted to a third level hospital

    Directory of Open Access Journals (Sweden)

    César Carballo Cardona

    2018-01-01

    Goals: To quantify the savings that could be achieved by the hospital implementation of a computer application (DIAGETHER®) that advises on the treatment of hyperglycemia in diabetic patients admitted to a third-level hospital through the emergency department. Methods: A multicenter interventional study was designed, with patients included in two arms: one received the conventional treatment prescribed by the physician, and the other received the treatment indicated by the computer application DIAGETHER®. The days of hospitalization were recorded in both arms. Results: A total of 183 patients were included; 86 received treatment guided by the computer application and 97 received conventional treatment. The mean blood glucose level on the first day of admission was 178.56 (59.53) in the GLIKAL group versus 212.93 (62.23) in the conventional group (p < 0.001), and on the second day 173.86 (58.86) versus 196.37 (66.60) (p = 0.017). There was no difference in the frequency of hypoglycemia between the groups (p = 0.555). A reduction in mean stay was observed in patients treated with DIAGETHER: days of admission were 7 (2-39) for the GLIKAL group and 10 (2-53) for the PCH group (p < 0.001). Conclusions: With the volume of diabetic patients admitted to the hospital, use of the computer tool (DIAGETHER®) could decrease hospitalization by 26,147 days (14,134 patients with a 1.85-day reduction in stay), generating a saving of 8,811,842 euros per year (based on the cost per day of stay of a diabetic patient).
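
    The savings arithmetic in the conclusion can be checked in two lines: bed-days avoided are patients times the reduction in stay, and the euro figure then implies a cost per bed-day of roughly 337 euros (back-calculated here; the paper's per-day cost input is not quoted above):

        patients = 14_134
        stay_reduction_days = 1.85
        reported_saving_eur = 8_811_842

        bed_days_avoided = patients * stay_reduction_days      # ~26,148 days
        print(round(bed_days_avoided))
        print(reported_saving_eur / bed_days_avoided)          # implied cost per bed-day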

  4. Socio-economic considerations of cleaning Greater Vancouver's air

    International Nuclear Information System (INIS)

    2005-08-01

    Socio-economic considerations of better air quality on the Greater Vancouver population and economy were discussed. The purpose of the study was to provide socio-economic information to staff and stakeholders of the Greater Vancouver Regional District (GVRD) who are participating in an Air Quality Management Plan (AQMP) development process and the Sustainable Region Initiative (SRI) process. The study incorporated the following methodologies: identification and review of Canadian, American, and European quantitative socio-economic, cost-benefit, cost effectiveness, competitiveness and health analyses of changes in air quality and measures to improve air quality; interviews with industry representatives in Greater Vancouver on competitiveness impacts of air quality changes and ways to improve air quality; and a qualitative analysis and discussion of secondary quantitative information that identifies and evaluates socio-economic impacts arising from changes in Greater Vancouver air quality. The study concluded that for the Greater Vancouver area, the qualitative analysis of an improvement in Greater Vancouver air quality shows positive socio-economic outcomes, as high positive economic efficiency impacts are expected along with good social quality of life impacts. 149 refs., 30 tabs., 6 appendices

  5. The Role of Telematic Practices in Computer Engineering: A Low-cost Remote Power Control in a Network Lab

    Directory of Open Access Journals (Sweden)

    Tomas Mateo Sanguino

    2012-05-01

    The present paper describes a practical e-learning laboratory solution devoted to the study of computer networks. This laboratory has been tested with two groups of students from the University of Huelva (Spain) during two academic years. In order to achieve this objective, it was necessary to create an entire network infrastructure that includes both telematic access to the laboratory equipment and remote power control. The interest of this work lies in an economical and simple system of remote control and telematic access with a twofold objective. On the one hand, to carry out distance practicals with the look and feel of in-person sessions by means of real, not simulated, hardware systems. On the other hand, to reduce power consumption with respect to other remote-lab proposals with permanent power connections, providing here an on-demand connection only when required. As a result, a versatile and flexible laboratory has been put into practice whose basic network topology allows transferring traditional practicals to telematic practicals in a natural way and without harsh changes

  6. Travel costs associated with flood closures of state highways near Centralia/Chehalis, Washington.

    Science.gov (United States)

    2014-09-01

    This report discusses the travel costs associated with the closure of roads in the greater Centralia/Chehalis, Washington region due to 100-year flood conditions starting on the Chehalis River. The costs were computed for roadway closures on I-5,...

  7. A computer simulation model of the cost-effectiveness of routine Staphylococcus aureus screening and decolonization among lung and heart-lung transplant recipients.

    Science.gov (United States)

    Clancy, C J; Bartsch, S M; Nguyen, M H; Stuckey, D R; Shields, R K; Lee, B Y

    2014-06-01

    Our objective was to model the cost-effectiveness and economic value of routine peri-operative Staphylococcus aureus screening and decolonization of lung and heart-lung transplant recipients from hospital and third-party payer perspectives. We used clinical data from 596 lung and heart-lung transplant recipients to develop a model in TreeAge Pro 2009 (Williamsport, MA, USA). Sensitivity analyses varied S. aureus colonization rate (5-15 %), probability of infection if colonized (10-30 %), and decolonization efficacy (25-90 %). Data were collected from the Cardiothoracic Transplant Program at the University of Pittsburgh Medical Center. Consecutive lung and heart-lung transplant recipients from January 2006 to December 2010 were enrolled retrospectively. Baseline rates of S. aureus colonization, infection and decolonization efficacy were 9.6 %, 36.7 %, and 31.9 %, respectively. Screening and decolonization was economically dominant for all scenarios tested, providing more cost savings and health benefits than no screening. Savings per case averted (2012 $US) ranged from $73,567 to $133,157 (hospital perspective) and $10,748 to $16,723 (third party payer perspective), varying with the probability of colonization, infection, and decolonization efficacy. Using our clinical data, screening and decolonization led to cost savings per case averted of $240,602 (hospital perspective) and averted 6.7 S. aureus infections (4.3 MRSA and 2.4 MSSA); 89 patients needed to be screened to prevent one S. aureus infection. Our data support routine S. aureus screening and decolonization of lung and heart-lung transplant patients. The economic value of screening and decolonization was greater than in previous models of other surgical populations.
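
    The economic logic of the model reduces to an expected-value comparison per screened patient: infections averted are the product of the colonization rate, the infection probability if colonized, and the decolonization efficacy, and screening pays off when the averted infection costs exceed the screening cost. The rates below are the study's baseline values, while the two cost inputs are hypothetical placeholders:

        def net_saving_per_patient(p_col, p_inf_if_col, efficacy,
                                   cost_infection, cost_screening):
            averted = p_col * p_inf_if_col * efficacy   # infections averted per patient
            return averted * cost_infection - cost_screening, 1.0 / averted

        saving, nns = net_saving_per_patient(0.096, 0.367, 0.319,
                                             cost_infection=100_000.0,  # hypothetical
                                             cost_screening=150.0)      # hypothetical
        print(saving, round(nns))   # number needed to screen ~ 89, as reported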

  8. Application of off-line image processing for optimization in chest computed radiography using a low cost system.

    Science.gov (United States)

    Muhogora, Wilbroad E; Msaki, Peter; Padovani, Renato

    2015-03-08

    The objective of this study was to improve the visibility of anatomical details by applying off-line post-processing of images in chest computed radiography (CR). Four spatial-domain external image processing techniques were developed using MATLAB software version 7.0.0.19920 (R14) and its image processing tools. The developed techniques were applied to sample images, and their visual appearance was confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 clinical chest images and randomized with 100 other images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The mean and range of the average scores of the three radiologists were characterized for each developed technique and imaging system. The Mann-Whitney U-test was used to test the difference in detail visibility between the images processed using each of the developed techniques and the corresponding images processed using the default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity-value adjustment and/or spatial linear filtering for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but it should be implemented in consultation with the radiologists.
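
    As an illustration of the kind of spatial-domain post-processing described above, here is a minimal sketch combining intensity-value adjustment with a spatial linear filter; the percentile window and unsharp-mask parameters are assumptions, not the study's settings.

```python
# Intensity adjustment plus spatial linear filtering on a grayscale image.
import numpy as np
from scipy import ndimage

def rescale_intensity(img: np.ndarray, low_pct=2, high_pct=98) -> np.ndarray:
    """Stretch intensities between two percentiles to the full [0, 1] range."""
    lo, hi = np.percentile(img, (low_pct, high_pct))
    return np.clip((img - lo) / (hi - lo + 1e-12), 0.0, 1.0)

def unsharp_mask(img: np.ndarray, sigma=2.0, amount=1.0) -> np.ndarray:
    """Linear filtering: sharpen by adding back high-frequency detail."""
    blurred = ndimage.gaussian_filter(img, sigma)
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

def enhance(img: np.ndarray) -> np.ndarray:
    return unsharp_mask(rescale_intensity(img))

if __name__ == "__main__":
    raw = np.random.rand(512, 512)   # stand-in for a CR image
    out = enhance(raw)
    print(out.min(), out.max())
```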

  9. Implementing and developing cloud computing applications

    CERN Document Server

    Sarna, David E Y

    2010-01-01

    From small start-ups to major corporations, companies of all sizes have embraced cloud computing for the scalability, reliability, and cost benefits it can provide. It has even been said that cloud computing may have a greater effect on our lives than the PC and dot-com revolutions combined. Filled with comparative charts and decision trees, Implementing and Developing Cloud Computing Applications explains exactly what it takes to build robust and highly scalable cloud computing applications in any organization. Covering the major commercial offerings available, it provides authoritative guidan

  10. Role of information systems in controlling costs: the electronic medical record (EMR) and the high-performance computing and communications (HPCC) efforts

    Science.gov (United States)

    Kun, Luis G.

    1994-12-01

    On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called 'Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.

  11. Greater trochanteric pain syndrome diagnosis and treatment.

    Science.gov (United States)

    Mallow, Michael; Nazarian, Levon N

    2014-05-01

    Lateral hip pain, or greater trochanteric pain syndrome, is a commonly seen condition; in this article, the relevant anatomy, epidemiology, and evaluation strategies of greater trochanteric pain syndrome are reviewed. Specific attention is focused on imaging of this syndrome and treatment techniques, including ultrasound-guided interventions. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Automated Data Handling And Instrument Control Using Low-Cost Desktop Computers And An IEEE 488 Compatible Version Of The ODETA V.

    Science.gov (United States)

    van Leunen, J. A. J.; Dreessen, J.

    1984-05-01

    The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automating the data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were nevertheless powerful enough to allow us to automate the measurement as well. After automation of the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further, it controls and reads the MTF measuring instrument. Focusing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically but they can also be typed in by hand. Due to the automation we are able to implement proper archiving of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two. This was due to

  13. Torsion of the greater omentum: A rare preoperative diagnosis

    International Nuclear Information System (INIS)

    Tandon, Ankit Anil; Lim, Kian Soon

    2010-01-01

    Torsion of the greater omentum is a rare acute abdominal condition that is seldom diagnosed preoperatively. We report the characteristic computed tomography (CT) scan findings and the clinical implications of this unusual diagnosis in a 41-year-old man, who also had a longstanding right inguinal hernia. Awareness of omental torsion as a differential diagnosis in the acute abdomen setting is necessary for correct patient management

  14. Reducing computational costs in large scale 3D EIT by using a sparse Jacobian matrix with block-wise CGLS reconstruction

    International Nuclear Information System (INIS)

    Yang, C L; Wei, H Y; Soleimani, M; Adler, A

    2013-01-01

    Electrical impedance tomography (EIT) is a fast and cost-effective technique to provide a tomographic conductivity image of a subject from boundary current–voltage data. This paper proposes a time and memory efficient method for solving a large scale 3D EIT inverse problem using a parallel conjugate gradient (CG) algorithm. A 3D EIT system with a large number of measurement data can produce a large Jacobian matrix, which causes difficulties in computer storage and in the inversion process. One of the challenges in 3D EIT is to decrease the reconstruction time and memory usage while retaining the image quality. Firstly, a sparse matrix reduction technique is proposed that uses thresholding to set very small values of the Jacobian matrix to zero. By converting the Jacobian matrix into a sparse format, the zero elements are eliminated, which saves memory. Secondly, a block-wise CG method for parallel reconstruction has been developed. The proposed method has been tested using simulated data as well as experimental test samples. A sparse Jacobian with block-wise CG enables the large scale EIT problem to be solved efficiently. Image quality measures are presented to quantify the effect of sparse matrix reduction on reconstruction results. (paper)
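
    A compact sketch of the two ideas, under illustrative sizes, threshold and regularization: threshold the Jacobian into sparse storage, then accumulate the regularized normal equations over row blocks (the blocks a parallel implementation would distribute) before a conjugate-gradient solve.

```python
# Sparse-Jacobian thresholding and block-wise CG for a least-squares inverse problem.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

def sparsify(J: np.ndarray, rel_tol: float = 1e-3) -> sp.csr_matrix:
    """Zero out entries below rel_tol * max|J| and store the result in CSR format."""
    Jt = J.copy()
    Jt[np.abs(Jt) < rel_tol * np.abs(J).max()] = 0.0
    return sp.csr_matrix(Jt)

def blockwise_cgls(J: sp.csr_matrix, b: np.ndarray, n_blocks: int = 4,
                   alpha: float = 1e-4) -> np.ndarray:
    """Solve (J^T J + alpha I) x = J^T b, accumulating over row blocks of measurements."""
    n = J.shape[1]
    JtJ = sp.csr_matrix((n, n))
    Jtb = np.zeros(n)
    for rows in np.array_split(np.arange(J.shape[0]), n_blocks):
        Jb = J[rows]                      # one block of measurement rows
        JtJ = JtJ + Jb.T @ Jb             # each block's normal-equation contribution
        Jtb += Jb.T @ b[rows]
    x, _ = cg(JtJ + alpha * sp.eye(n), Jtb)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    J = rng.standard_normal((2000, 500)) * (rng.random((2000, 500)) > 0.9)
    x_true = rng.standard_normal(500)
    b = J @ x_true + 0.01 * rng.standard_normal(2000)
    x = blockwise_cgls(sparsify(J), b)
    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```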

  15. Reducing computational costs in large scale 3D EIT by using a sparse Jacobian matrix with block-wise CGLS reconstruction.

    Science.gov (United States)

    Yang, C L; Wei, H Y; Adler, A; Soleimani, M

    2013-06-01

    Electrical impedance tomography (EIT) is a fast and cost-effective technique to provide a tomographic conductivity image of a subject from boundary current-voltage data. This paper proposes a time and memory efficient method for solving a large scale 3D EIT inverse problem using a parallel conjugate gradient (CG) algorithm. A 3D EIT system with a large number of measurement data can produce a large Jacobian matrix, which causes difficulties in computer storage and in the inversion process. One of the challenges in 3D EIT is to decrease the reconstruction time and memory usage while retaining the image quality. Firstly, a sparse matrix reduction technique is proposed that uses thresholding to set very small values of the Jacobian matrix to zero. By converting the Jacobian matrix into a sparse format, the zero elements are eliminated, which saves memory. Secondly, a block-wise CG method for parallel reconstruction has been developed. The proposed method has been tested using simulated data as well as experimental test samples. A sparse Jacobian with block-wise CG enables the large scale EIT problem to be solved efficiently. Image quality measures are presented to quantify the effect of sparse matrix reduction on reconstruction results.

  16. Synchronized Pair Configuration in Virtualization-Based Lab for Learning Computer Networks

    Science.gov (United States)

    Kongcharoen, Chaknarin; Hwang, Wu-Yuin; Ghinea, Gheorghita

    2017-01-01

    More studies are concentrating on using virtualization-based labs to facilitate computer or network learning concepts. Some benefits are lower hardware costs and greater flexibility in reconfiguring computer and network environments. However, few studies have investigated effective mechanisms for using virtualization fully for collaboration.…

  17. Costs of traffic injuries

    DEFF Research Database (Denmark)

    Kruse, Marie

    2015-01-01

    … assessed using Danish national healthcare registers. Productivity costs were computed using duration analysis (Cox regression models). In a subanalysis, cost per severe traffic injury was computed for the 12 995 individuals that experienced a severe injury. RESULTS: The socioeconomic cost of a traffic injury was €1406 (2009 price level) in the first year, and €8950 over a 10-year period. Per 100 000 population, the 10-year cost was €6 565 668. A severe traffic injury costs €4969 per person in the first year, and €4 006 685 per 100 000 population over a 10-year period. Victims of traffic injuries…

  18. New Federal Cost Accounting Regulations

    Science.gov (United States)

    Wolff, George J.; Handzo, Joseph J.

    1973-01-01

    Discusses a new set of indirect cost accounting procedures which must be followed by school districts wishing to recover any indirect costs of administering federal grants and contracts. Also discusses the amount of indirect costs that may be recovered, computing indirect costs, classifying project costs, and restricted grants. (Author/DN)

  19. From Deposit to Point Cloud – a Study of Low-Cost Computer Vision Approaches for the Straightforward Documentation of Archaeological Excavations

    Directory of Open Access Journals (Sweden)

    M. Doneus

    2011-12-01

    Full Text Available Stratigraphic archaeological excavations demand high-resolution documentation techniques for 3D recording. Today, this is typically accomplished using total stations or terrestrial laser scanners. This paper demonstrates the potential of another technique that is low-cost and easy to execute. It takes advantage of software using Structure from Motion (SfM) algorithms, which are known for their ability to reconstruct camera pose and three-dimensional scene geometry (rendered as a sparse point cloud) from a series of overlapping photographs captured by a camera moving around the scene. When complemented by stereo matching algorithms, detailed 3D surface models can be built from such relatively oriented photo collections in a fully automated way. The absolute orientation of the model can be derived by the manual measurement of control points. The approach is extremely flexible and appropriate for dealing with a wide variety of imagery, because this computer vision approach can also work with imagery resulting from a randomly moving camera (i.e. uncontrolled conditions) and calibrated optics are not a prerequisite. For a few years now, these algorithms have been embedded in several free and low-cost software packages. This paper will outline how such a program can be applied to map archaeological excavations in a very fast and uncomplicated way, using imagery shot with a standard compact digital camera (even if the images were not taken for this purpose). Archived data from previous excavations of VIAS-University of Vienna has been chosen, and the derived digital surface models and orthophotos have been examined for their usefulness for archaeological applications. The absolute georeferencing of the resulting surface models was performed with the manual identification of fourteen control points. In order to express the positional accuracy of the generated 3D surface models, the NSSDA guidelines were applied. Simultaneously acquired terrestrial laser scanning data

  20. Simultaneous bilateral isolated greater trochanter fracture

    Directory of Open Access Journals (Sweden)

    Maruti Kambali

    2013-01-01

    Full Text Available A 48-year-old woman sustained simultaneous isolated bilateral greater trochanteric fractures following a road traffic accident. The patient presented to us 1 month after the injury, complaining of pain in the left hip and inability to walk. Roentgenograms revealed displaced comminuted bilateral greater trochanter fractures. The fracture of the left greater trochanter was reduced and fixed internally using the tension band wiring technique. The greater trochanter fracture on the right side was asymptomatic and was managed conservatively. The patient regained full range of motion and use of her hips after a postoperative follow-up of 6 months. Isolated fractures of the greater trochanter are unusual injuries. Because of their relative rarity and the unsettled controversy regarding their etiology and pathogenesis, several methods of treatment have been advocated. Furthermore, the reports of this particular type of injury are not plentiful and the average textbook coverage afforded to this entity is limited. In our study we discuss the mechanism of injury and the various treatment options available.

  1. Butterfly valves: greater use in power plants

    International Nuclear Information System (INIS)

    McCoy, M.

    1975-01-01

    Improvements in butterfly valves, particularly in the areas of automatic control and leak tightness are described. The use of butterfly valves in nuclear power plants is discussed. These uses include service in component cooling, containment cooling, and containment isolation. The outlook for further improvements and greater uses is examined. (U.S.)

  2. Greater Somalia, the never-ending dream?

    DEFF Research Database (Denmark)

    Zoppi, Marco

    2015-01-01

    This paper provides an historical analysis of the concept of Greater Somalia, the nationalist project that advocates the political union of all Somali-speaking people, including those inhabiting areas in current Djibouti, Ethiopia and Kenya. The Somali territorial unification project of “lost...

  3. Strategic Analysis of Autodesk and the Move to Cloud Computing

    OpenAIRE

    Kewley, Kathleen

    2012-01-01

    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  4. Utilization of wind energy in greater Hanover

    International Nuclear Information System (INIS)

    Sahling, U.

    1993-01-01

    Since the beginning of the eighties, the association of communities of Greater Hanover has dealt intensively with energy and eco-political questions within the scope of regional planning. Renewable energy sources play a dominant role in this context. This brochure is the third contribution on the subject "Energy policy and environmental protection". It addresses experts as well as interested members of the public. For all 8 contributions contained, separate entries have been recorded in this database. (BWI) [de]

  5. Small cities face greater impact from automation

    OpenAIRE

    Frank, Morgan R.; Sun, Lijun; Cebrian, Manuel; Youn, Hyejin; Rahwan, Iyad

    2017-01-01

    The city has proven to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: How will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across U.S. urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content su...

  6. The Greater Sekhukhune-CAPABILITY outreach project.

    Science.gov (United States)

    Gregersen, Nerine; Lampret, Julie; Lane, Tony; Christianson, Arnold

    2013-07-01

    The Greater Sekhukhune-CAPABILITY Outreach Project was undertaken in a rural district in Limpopo, South Africa, as part of the European Union-funded CAPABILITY programme to investigate approaches for capacity building for the translation of genetic knowledge into care and prevention of congenital disorders. Based on previous experience of a clinical genetic outreach programme in Limpopo, it aimed to initiate a district clinical genetic service in Greater Sekhukhune to gain knowledge and experience to assist in the implementation and development of medical genetic services in South Africa. Implementing the service in Greater Sekhukhune was impeded by a developing staff shortage in the province and pressure on the health service from the existing HIV/AIDS and TB epidemics. This situation underscores the need for health needs assessment for developing services for the care and prevention of congenital disorders in middle- and low-income countries. However, these impediments stimulated the pioneering of innovative ways to offer medical genetic services in these circumstances, including tele-teaching of nurses and doctors, using cellular phones to enhance clinical care and adapting and assessing the clinical utility of a laboratory test, QF-PCR, for use in the local circumstances.

  7. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  8. Greater happiness for a greater number: Is that possible in Austria?

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2011-01-01

    textabstractWhat is the final goal of public policy? Jeremy Bentham (1789) would say: greater happiness for a greater number. He thought of happiness as subjective enjoyment of life; in his words as “the sum of pleasures and pains”. In his time the happiness of the great number could not be measured

  9. Greater happiness for a greater number: Is that possible? If so how? (Arabic)

    NARCIS (Netherlands)

    R. Veenhoven (Ruut); E. Samuel (Emad)

    2012-01-01

    textabstractWhat is the final goal of public policy? Jeremy Bentham (1789) would say: greater happiness for a greater number. He thought of happiness as subjective enjoyment of life; in his words as “the sum of pleasures and pains”. In his time, the happiness of the great number could not be

  10. Greater happiness for a greater number: Is that possible in Germany?

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2009-01-01

    textabstractWhat is the final goal of public policy? Jeremy Bentham (1789) would say: greater happiness for a greater number. He thought of happiness as subjective enjoyment of life; in his words as “the sum of pleasures and pains”. In his time the Happiness of the great number could not be measured

  11. Deregulation and Nuclear Training: Cost Effective Alternatives

    International Nuclear Information System (INIS)

    Richard P. Coe; Patricia A. Lake

    2000-01-01

    Training is crucial to the success of any organization. It is also expensive, with some estimates of annual spending on training by U.S. corporations exceeding $50 billion. Nuclear training, like that of many other highly technical organizations, is both crucial and costly. It is unlikely that the amount of training can be significantly reduced. If anything, current trends indicate that training needs will probably increase as the industry and workforce age and change. With the advent of energy deregulation in the United States, greater pressure will surface to make energy prices more competitive. This in turn will drive businesses to examine existing costs more closely and find ways to do things more cost-effectively. The commercial nuclear industry will be no exception, and nuclear training will be equally affected. It is time for nuclear training, and indeed the entire nuclear industry, to begin using more aggressive techniques to reduce costs. This includes the need for nuclear training to find alternatives to traditional methods for the delivery of cost-effective, high-quality training that meets regulatory requirements and produces well-qualified personnel capable of working in an efficient and safe manner. Computer-based and/or Web-based training are leading emerging technologies.

  12. Search for greater stability in nuclear regulation

    International Nuclear Information System (INIS)

    Asselstine, J.K.

    1985-01-01

    The need for greater stability in nuclear regulation is discussed. Two possible approaches for dealing with the problems of new and rapidly changing regulatory requirements are considered. The first approach relies on the more traditional licensing reform initiatives that have been considered off and on for the past decade. The second approach considers a new regulatory philosophy aimed at the root causes of the proliferation of new safety requirements that have been imposed in recent years. For the past few years, the concepts of deregulation and regulatory reform have been in fashion in Washington, and the commercial nuclear power program has not remained unaffected. Many look to these concepts to provide greater stability in the regulatory program. The NRC, the nuclear industry and the administration have all been avidly pursuing regulatory reform initiatives, which take the form of both legislative and administrative proposals. Many of these proposals look to the future, and, if adopted, would have little impact on currently operating nuclear power plants or plants now under construction

  13. Greater Sudbury fuel efficient driving handbook

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-12-15

    Reducing the amount of fuel that people use for personal driving saves money, improves local air quality, and reduces personal contributions to climate change. This handbook was developed to be used as a tool for a fuel efficient driving pilot program in Greater Sudbury in 2009-2010. Specifically, the purpose of the handbook was to provide greater Sudbury drivers with information on how to drive and maintain their personal vehicles in order to maximize fuel efficiency. The handbook also provides tips for purchasing fuel efficient vehicles. It outlines the benefits of fuel maximization, with particular reference to reducing contributions to climate change; reducing emissions of air pollutants; safe driving; and money savings. Some tips for efficient driving are to avoid aggressive driving; use cruise control; plan trips; and remove excess weight. Tips for efficient winter driving are to avoid idling to warm up the engine; use a block heater; remove snow and ice; use snow tires; and check tire pressure. The importance of car maintenance and tire pressure was emphasized. The handbook also explains how fuel consumption ratings are developed by vehicle manufacturers. refs., figs.

  14. Women at greater risk of HIV infection.

    Science.gov (United States)

    Mahathir, M

    1997-04-01

    Although many people believe that mainly men get infected with HIV/AIDS, women are actually getting infected at a faster rate than men, especially in developing countries, and suffer more from the adverse impact of AIDS. As of mid-1996, the Joint UN Program on AIDS estimated that more than 10 million of the 25 million adults infected with HIV since the beginning of the epidemic are women. The proportion of HIV-positive women is growing, with almost half of the 7500 new infections daily occurring among women. 90% of HIV-positive women live in a developing country. In Asia-Pacific, 1.4 million women have been infected with HIV out of an estimated total 3.08 million adults from the late 1970s until late 1994. Biologically, women are more vulnerable than men to infection because of the greater mucus area exposed to HIV during penile penetration. Women under age 17 years are at even greater risk because they have an underdeveloped cervix and low vaginal mucus production. Concurrent sexually transmitted diseases increase the risk of HIV transmission. Women's risk is also related to their exposure to gender inequalities in society. The social and economic pressures of poverty exacerbate women's risk. Prevention programs are discussed.

  15. Aufwandsanalyse für computerunterstützte Multiple-Choice Papierklausuren [Cost analysis for computer supported multiple-choice paper examinations]

    Directory of Open Access Journals (Sweden)

    Mandel, Alexander

    2011-11-01

    Full Text Available [english] Introduction: Multiple-choice examinations are still fundamental to assessment in medical degree programs. In addition to content-related research, optimization of the technical procedure is an important question. Medical examiners face three options: paper-based examinations with or without computer support, or completely electronic examinations. Critical aspects are the effort for formatting, the logistic effort during the actual examination, the quality, promptness and effort of the correction, the time for making the documents available for inspection by the students, and the statistical analysis of the examination results. Methods: For the past three semesters, a computer program for the input and formatting of MC questions in medical and other paper-based examinations has been used and continuously improved at Wuerzburg University. In the winter semester (WS) 2009/10 eleven, in the summer semester (SS) 2010 twelve, and in WS 2010/11 thirteen medical examinations were conducted with the program and automatically evaluated. For the last two semesters the remaining manual workload was recorded. Results: The effort for formatting and subsequent analysis, including adjustments of the analysis, of an average examination with about 140 participants and about 35 questions was 5-7 hours for exams without complications in WS 2009/10, about 2 hours in SS 2010, and about 1.5 hours in WS 2010/11. Including exams with complications, the average time was about 3 hours per exam in SS 2010 and 2.67 hours in WS 2010/11. Discussion: For conventional multiple-choice exams, computer-based formatting and evaluation of paper-based exams offers a significant time reduction for lecturers compared with the manual correction of paper-based exams, and compared with purely electronically conducted exams it needs a much simpler technological infrastructure and fewer staff during the exam.

  16. Rural New Zealand health professionals' perceived barriers to greater use of the internet for learning.

    Science.gov (United States)

    Janes, Ron; Arroll, Bruce; Buetow, Stephen; Coster, Gregor; McCormick, Ross; Hague, Iain

    2005-01-01

    The purpose of this research was to investigate rural North Island (New Zealand) health professionals' attitudes and perceived barriers to using the internet for ongoing professional learning. A cross-sectional postal survey of all rural North Island GPs, practice nurses and pharmacists was conducted in mid-2003. The questionnaire contained both quantitative and qualitative questions. The transcripts from two open questions requiring written answers were analysed for emergent themes, which are reported here. The first open question asked: 'Do you have any comments on the questionnaire, learning, computers or the Internet?' The second open question asked those who had taken a distance-learning course using the internet to list positive and negative aspects of their course, and suggest improvements. Out of 735 rural North Island health professionals surveyed, 430 returned useable questionnaires (a response rate of 59%). Of these, 137 answered the question asking for comments on learning, computers and the internet. Twenty-eight individuals who had completed a distance-learning course using the internet provided written responses to the second question. Multiple barriers to greater use of the internet were identified. They included lack of access to computers, poor availability of broadband (fast) internet access, lack of IT skills/knowledge, lack of time, concerns about IT costs and database security, difficulty finding quality information, lack of time, energy or motivation to learn new skills, competing priorities (e.g. family), and a preference for learning modalities which include more social interaction. Individuals also stated that rural health professionals needed to engage with the technology, because it provided rapid, flexible access from home or work to a significant health information resource, and would save money and travelling time to urban-based education. In mid-2003, there were multiple barriers to rural North Island health professionals making greater

  17. Computer automation in veterinary hospitals.

    Science.gov (United States)

    Rogers, H

    1996-05-01

    Computers have been used to automate complex and repetitive tasks in veterinary hospitals since the 1960s. Early systems were expensive, but their use was justified because they performed jobs which would have been impossible or which would have required greater resources in terms of time and personnel had they been performed by other methods. Systems found in most veterinary hospitals today are less costly, magnitudes more capable, and often underused. Modern multitasking operating systems and graphical interfaces bring many opportunities for automation. Commercial and custom programs developed and used in a typical multidoctor mixed species veterinary practice are described.

  18. Small cities face greater impact from automation.

    Science.gov (United States)

    Frank, Morgan R; Sun, Lijun; Cebrian, Manuel; Youn, Hyejin; Rahwan, Iyad

    2018-02-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. © 2018 The Authors.

  19. Small cities face greater impact from automation

    Science.gov (United States)

    Sun, Lijun; Cebrian, Manuel; Rahwan, Iyad

    2018-01-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. PMID:29436514

  20. Computation Directorate 2008 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  1. Applied cost allocation

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs

    2016-01-01

    This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programming approach with its non-parametric estimation of the cost function there may be observations in the data set for which we have multiple Aumann–Shapley p...

  2. (CICT) Computing, Information, and Communications Technology Overview

    Science.gov (United States)

    VanDalsem, William R.

    2003-01-01

    The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.

  3. The cost of electrocoagulation

    Energy Technology Data Exchange (ETDEWEB)

    Donini, J.C.; Kan, J.; Szynkarczuk, J.; Hassan, T.A.; Kar, K.L.

    1993-01-01

    Electrocoagulation could be an attractive and suitable method for separating solids from waste water. The electrocoagulation of kaolinite and bentonite suspensions was studied in a pilot electrocoagulation unit to assess the cost and efficiency of the process. Factors affecting cost, such as the formation of passivation layers on electrode plates and the recirculation and concentration of sodium chloride, were examined. Colorimetry was used to analyze aluminium content in the suspension. The results were used to calculate the cost due to consumption of electrode material (aluminium) during the process. Total cost was assumed to comprise the energy cost and the cost of electrode material. Comparison was based on the settling properties of the treated product: turbidity, settling rate, and cake height. In most cases, aluminium efficiency averaged around 200% and material cost accounted for 80% of total cost. Although higher concentrations of sodium chloride could only slightly increase aluminium efficiency and electrode efficiency, the higher concentrations resulted in much greater total cost, due to the greater current generated by the increased suspension conductivity, which in turn dissolved a larger amount of aluminium. The recirculation loop increased the flow rate by 3-10 times, enhancing the mass transport between the electrodes and resulting in lower cost and better settling properties. Over the course of two months the electrode coatings became thicker while efficiency decreased. The electrode efficiency was found to be as high as 94% for virgin electrodes and as low as 10% after two months. 8 refs., 25 figs., 9 tabs.
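
    A back-of-the-envelope sketch of this cost split: Faraday's law gives the dissolved aluminium mass, scaled by the roughly 200% current efficiency reported above; the electricity and metal prices are assumed values, not figures from the study.

```python
# Energy cost vs. electrode (aluminium) material cost for electrocoagulation.
F = 96485.0          # Faraday constant, C/mol
M_AL = 26.98e-3      # molar mass of aluminium, kg/mol
Z_AL = 3             # electrons transferred per Al3+ ion

def electrocoagulation_cost(current_a, voltage_v, time_s,
                            efficiency=2.0,      # ~200% current efficiency, as reported
                            price_kwh=0.10,      # $/kWh, assumed
                            price_al_kg=2.5):    # $/kg aluminium, assumed
    charge = current_a * time_s                          # coulombs passed
    al_mass = efficiency * charge * M_AL / (Z_AL * F)    # kg of aluminium dissolved
    energy_kwh = current_a * voltage_v * time_s / 3.6e6  # J -> kWh
    return energy_kwh * price_kwh, al_mass * price_al_kg

energy_cost, material_cost = electrocoagulation_cost(50.0, 12.0, 3600.0)
print(f"energy: ${energy_cost:.2f}, electrodes: ${material_cost:.2f}, "
      f"material share: {material_cost / (energy_cost + material_cost):.0%}")
```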

  4. Applied cost allocation

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs

    2016-01-01

    This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programming approach with its non-parametric estimation of the cost function there may be observations in the data set for which we have multiple Aumann–Shapley p... ... of assumptions concerning firm behavior. These assumptions enable us to connect inefficient with efficient production and thereby provide consistent ways of allocating the costs arising from inefficiency.
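
    For reference, a minimal numerical sketch of the Aumann–Shapley rule itself: the unit price of each product is its marginal cost averaged along the ray from zero to the observed output bundle, so the resulting shares exactly exhaust total cost. The quadratic cost function below is a stand-in for the non-parametrically estimated one discussed in the paper.

```python
# Aumann-Shapley cost shares by numerical integration of marginal costs.
import numpy as np

def aumann_shapley_shares(cost, q, n_steps=1000, eps=1e-6):
    q = np.asarray(q, dtype=float)
    ts = (np.arange(n_steps) + 0.5) / n_steps   # midpoint rule on [0, 1]
    prices = np.zeros_like(q)
    for i in range(len(q)):
        e = np.zeros_like(q)
        e[i] = eps
        # central finite difference of the marginal cost along the ray t*q
        grads = [(cost(t * q + e) - cost(t * q - e)) / (2 * eps) for t in ts]
        prices[i] = np.mean(grads)
    return prices * q                           # cost share of each product

A = np.array([[2.0, 0.5], [0.5, 1.0]])
cost = lambda x: x @ A @ x                      # toy symmetric quadratic joint cost
q = np.array([3.0, 4.0])
shares = aumann_shapley_shares(cost, q)
# The shares sum to the total incremental cost C(q) - C(0).
print(shares, shares.sum(), cost(q) - cost(np.zeros(2)))
```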

  5. Design, Construction, and Use of a Single Board Computer Beowulf Cluster: Application of the Small-Footprint, Low-Cost, InSignal 5420 Octa Board

    OpenAIRE

    Cusick, James J.; Miller, William; Laurita, Nicholas; Pitt, Tasha

    2014-01-01

    In recent years development in the area of Single Board Computing has been advancing rapidly. At Wolters Kluwer's Corporate Legal Services Division a prototyping effort was undertaken to establish the utility of such devices for practical and general computing needs. This paper presents the background of this work, the design and construction of a 64 core 96 GHz cluster, and their possibility of yielding approximately 400 GFLOPs from a set of small footprint InSignal boards created for just o...

  6. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general purpose computers, a cost that is high in terms of both dollar expenditure and elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  7. Sacrificing information for the greater good

    DEFF Research Database (Denmark)

    Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian; Igel, Christian

    2017-01-01

    Large-scale surveys make huge amounts of photometric data available. Because of the sheer number of objects, spectral data cannot be obtained for all of them. Therefore it is important to devise techniques for reliably estimating physical properties of objects from photometric information alone. These estimates are needed to automatically identify interesting objects worth a follow-up investigation as well as to produce the required data for a statistical analysis of the space covered by a survey. We argue that machine learning techniques are suitable to compute these estimates accurately and efficiently… Sloan Digital Sky Survey (SDSS). For estimating sSFRs, we demonstrate that our method produces better estimates than traditional spectral energy distribution (SED) fitting. For estimating photo-z's, we show that our method produces more accurate photo-z's than the method employed by SDSS. The study highlights…
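
    A hedged illustration of the premise, on synthetic data rather than the paper's models or SDSS features: estimating a physical property (here a photometric redshift) from photometric colours with a machine-learning regressor.

```python
# Regressing a physical property from photometric colours (synthetic example).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
z = rng.uniform(0.0, 1.5, 5000)                  # "true" redshifts
# Four noisy colours that each carry information about z.
colours = np.column_stack([z + 0.1 * rng.normal(size=z.size) for _ in range(4)])

X_train, X_test, z_train, z_test = train_test_split(colours, z, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, z_train)
z_pred = model.predict(X_test)
print("RMS error:", np.sqrt(np.mean((z_pred - z_test) ** 2)))
```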

  8. Management and cost accounting

    CERN Document Server

    Drury, Colin

    1992-01-01

    This third edition of a textbook on management and cost accounting features coverage of activity-based costing (ABC), advance manufacturing technologies (AMTs), JIT, MRP, target costing, life-cycle costing, strategic management accounting, total quality management and customer profitability analysis. Also included are revised and new end-of-chapter problems taken from past examination papers of CIMA, ACCA and ICAEW. There is increased reference to management accounting in practice, including many of the results of the author's CIMA sponsored survey, and greater emphasis on operational control and performance measurement.

  9. [Cost-benefit analysis of cranial computed tomography in mild traumatic brain injury--appropriate depiction within the G-DRG system?].

    Science.gov (United States)

    Garving, C; Weber, C D; Poßelt, S; Pishnamaz, M; Pape, H C; Dienstknecht, T

    2014-06-01

    The treatment of patients with mild head injury is associated with a persistent shortfall in reimbursement. The current investigation summarises the radiological costs of patients at a level I trauma centre and discusses the indication for CT scanning within the G-DRG system. The study includes all patients who underwent a cranial CT (CCT) scan in 2011. Diagnosis, length of stay and cost data were recorded for every patient. Finally, frequent diagnosis groups were summarised into clusters (Basis-DRG/MDC 21A). A total of 380 patients were treated. Within the largest group (G-DRG B80Z), the cost of a CCT alone accounted for one quarter of the total proceeds. In combination with the high cost of monitoring patients with mild head injuries, this causes an ongoing shortfall. Despite the often necessary CCT investigation in mild head injuries, the proceeds do not cover the costs of these patients. To improve the situation, clear guidelines for CCT scanning should be provided and the reimbursement, in particular in the diagnosis group G-DRG B80Z, has to be improved. Georg Thieme Verlag KG Stuttgart · New York.

  10. Urban acid deposition in Greater Manchester

    Energy Technology Data Exchange (ETDEWEB)

    Lee, D.S.; Longhurst, J.W.S.; Gee, D.R.; Hare, S.E. (Manchester Polytechnic, Manchester (UK). Acid Rain Information Centre)

    1989-08-01

    Data are presented from a monitoring network of 18 bulk precipitation collectors and one wet-only collector in the urban area of Greater Manchester, in the north west of England. Weekly samples were analysed for all the major ions in precipitation along with gaseous nitrogen dioxide concentrations from diffusion tubes. Statistical analysis of the data shows significant spatial variation of non marine sulphate, nitrate, ammonium, acidity and calcium concentrations, and nitrogen dioxide concentrations. Calcium is thought to be responsible for the buffering of acidity and is of local origin. Wet deposition is the likely removal process for calcium in the atmosphere and probably by below cloud scavenging. Nitrate and ammonium concentrations and depositions show close spatial, temporal and statistical association. Examination of high simultaneous episodes of nitrate and ammonium deposition shows that these depositions cannot be explained in terms of trajectories and it is suggested that UK emissions of ammonia may be important. Statistical analysis of the relationships between nitrate and ammonium depositions, concentrations and precipitation amount suggest that ammonia from mesoscale sources reacts reversibly with nitric acid aerosol and is removed by below cloud scavenging. High episodes of the deposition of non marine sulphate are difficult to explain by trajectory analysis alone, perhaps suggesting local sources. In a comparison between wet deposition and bulk deposition, it was shown that only 15.2% of the non marine sulphur was dry deposited to the bulk precipitation collector. 63 refs., 86 figs., 31 tabs.

  11. Middleware enabling computational self-reflection: exploring the need for and some costs of selfreflecting networks with application to homeland defense

    Science.gov (United States)

    Kramer, Michael J.; Bellman, Kirstie L.; Landauer, Christopher

    2002-07-01

    This paper will review and examine the definitions of Self-Reflection and Active Middleware. Then it will illustrate a conceptual framework for understanding and enumerating the costs of Self-Reflection and Active Middleware at increasing levels of application. Then it will review some applications of Self-Reflection and Active Middleware to simulations. Finally, it will consider the application of Self-Reflection and Active Middleware to sharing information among the organizations expected to participate in Homeland Defense, and the additional kinds of costs this entails.

  12. Cyber security for greater service reliability

    Energy Technology Data Exchange (ETDEWEB)

    Vickery, P. [N-Dimension Solutions Inc., Richmond Hill, ON (Canada)

    2008-05-15

    Service reliability in the electricity transmission and distribution (T and D) industry is being challenged by increased equipment failures, harsher climatic conditions, and computer hackers who aim to disrupt services by gaining access to transmission and distribution resources. This article discussed methods of ensuring the cyber-security of T and D operators. Weak points in the T and D industry include remote terminal units; intelligent electronic devices; distributed control systems; programmable logic controllers; and various intelligent field devices. An increasing number of interconnection points exist between an operator's service control system and external systems. The North American Electric Reliability Council (NERC) standards specify that cyber security strategies should ensure that all cyber assets are protected, and that access points must be monitored to detect intrusion attempts. The introduction of new advanced metering initiatives must also be considered. Comprehensive monitoring systems should be available to support compliance with cyber security standards. It was concluded that senior management should commit to a periodic cyber security re-assessment program in order to keep up-to-date.

  13. Matching Cost Filtering for Dense Stereo Correspondence

    Directory of Open Access Journals (Sweden)

    Yimin Lin

    2013-01-01

    Full Text Available Dense stereo correspondence, enabling reconstruction of depth information in a scene, is of great importance in the field of computer vision. Recently, some local solutions based on matching cost filtering with an edge-preserving filter have proved capable of achieving higher accuracy than global approaches. Unfortunately, the computational complexity of these algorithms is quadratically related to the window size used to aggregate the matching costs. The recent trend has been to pursue higher accuracy with greater efficiency in execution. Therefore, this paper proposes a new cost-aggregation module that computes the matching responses for all the image pixels at a set of sampling points generated by a hierarchical clustering algorithm. The complexity of this implementation is linear both in the number of image pixels and in the number of clusters. Experimental results demonstrate that the proposed algorithm outperforms state-of-the-art local methods in terms of both accuracy and speed. Moreover, performance tests indicate that parameters such as the height of the hierarchical binary tree and the spatial and range standard deviations have a significant influence on time consumption and the accuracy of disparity maps.
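
    For context, a skeleton of the baseline local pipeline such methods build on: an absolute-difference cost volume, per-disparity aggregation with a simple box filter (the step whose cost grows with window size), and winner-takes-all selection. The paper's cluster-based sampling, which replaces this aggregation, is not reproduced here.

```python
# Baseline local stereo: cost volume, box-filter aggregation, winner-takes-all.
import numpy as np
from scipy import ndimage

def disparity_map(left, right, max_disp=16, win=9):
    h, w = left.shape
    costs = np.full((max_disp, h, w), np.inf)
    for d in range(max_disp):
        # Matching cost: absolute difference between left[x] and right[x - d].
        diff = np.abs(left[:, d:] - right[:, : w - d])
        costs[d, :, d:] = ndimage.uniform_filter(diff, win)   # cost aggregation
    return costs.argmin(axis=0)                               # winner-takes-all

if __name__ == "__main__":
    left = np.random.rand(64, 64)
    right = np.roll(left, -5, axis=1)            # simulate a 5-pixel disparity
    print(np.median(disparity_map(left, right))) # ~5 expected
```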

  14. Computer-assisted propofol administration.

    LENUS (Irish Health Repository)

    O'Connor, J P A

    2012-02-01

    The use of propofol for sedation in endoscopy may allow for better-quality sedation and quicker recovery, and facilitate greater throughput in endoscopy units. The cost-effectiveness and utility of propofol sedation for endoscopic procedures is contingent on the personnel and resources required to carry out the procedure. Computer-based platforms are based on the patient's response to stimulation and physiologic parameters. They offer an appealing means of delivering safe and effective doses of propofol. One such means is the bispectral index, where continuous EEG recordings are used to assess the degree of sedation. Another is the closed-loop target-controlled system, where a set of physical parameters, such as muscle relaxation and auditory-evoked potential, determine a level of medication appropriate to achieve sedation. Patient-controlled platforms may also be used. These electronic adjuncts may help endoscopists who wish to adopt propofol sedation to change current practices with greater confidence.

  15. Computer-assisted propofol administration.

    Science.gov (United States)

    O'Connor, J P A; O'Moráin, C A; Vargo, J J

    2010-01-01

    The use of propofol for sedation in endoscopy may allow for better-quality sedation and quicker recovery, and facilitate greater throughput in endoscopy units. The cost-effectiveness and utility of propofol sedation for endoscopic procedures is contingent on the personnel and resources required to carry out the procedure. Computer-based platforms are based on the patient's response to stimulation and physiologic parameters. They offer an appealing means of delivering safe and effective doses of propofol. One such means is the bispectral index, where continuous EEG recordings are used to assess the degree of sedation. Another is the closed-loop target-controlled system, where a set of physical parameters, such as muscle relaxation and auditory-evoked potential, determine a level of medication appropriate to achieve sedation. Patient-controlled platforms may also be used. These electronic adjuncts may help endoscopists who wish to adopt propofol sedation to change current practices with greater confidence. Copyright 2010 S. Karger AG, Basel.
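
    A toy proportional controller in the spirit of such closed-loop systems; all numbers are illustrative, real systems add pharmacokinetic models and safety interlocks, and nothing here is clinical guidance.

```python
# Toy BIS-guided infusion-rate adjustment (illustration only, not clinical code).
def update_rate(current_rate, bis, target=60.0, gain=0.05,
                min_rate=0.0, max_rate=10.0):
    """Increase the rate when the patient is too light (BIS above target),
    decrease it when too deep (BIS below target), within hard bounds."""
    new_rate = current_rate + gain * (bis - target)
    return max(min_rate, min(max_rate, new_rate))

rate = 3.0  # mg/kg/h, hypothetical starting infusion
for bis in [85, 74, 66, 58, 55, 61]:   # simulated BIS readings
    rate = update_rate(rate, bis)
    print(f"BIS {bis:3d} -> rate {rate:.2f} mg/kg/h")
```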

  16. Greater-confinement disposal of low-level radioactive wastes

    International Nuclear Information System (INIS)

    Trevorrow, L.E.; Gilbert, T.L.; Luner, C.; Merry-Libby, P.A.; Meshkov, N.K.; Yu, C.

    1985-01-01

    Low-level radioactive wastes include a broad spectrum of wastes that have different radionuclide concentrations, half-lives, and physical and chemical properties. Standard shallow-land burial practice can provide adequate protection of public health and safety for most low-level wastes, but a small volume fraction (about 1%) containing most of the activity inventory (approx. 90%) requires specific measures known as "greater-confinement disposal" (GCD). Different site characteristics and different waste characteristics - such as high radionuclide concentrations, long radionuclide half-lives, high radionuclide mobility, and physical or chemical characteristics that present exceptional hazards - lead to different GCD facility design requirements. Facility design alternatives considered for GCD include the augered shaft, deep trench, engineered structure, hydrofracture, improved waste form, and high-integrity container. Selection of an appropriate design must also consider the interplay between basic risk limits for protection of public health and safety, performance characteristics and objectives, costs, waste-acceptance criteria, waste characteristics, and site characteristics. This paper presents an overview of the factors that must be considered in planning the application of methods proposed for providing greater confinement of low-level wastes. 27 refs

  17. Land cover mapping of Greater Mesoamerica using MODIS data

    Science.gov (United States)

    Giri, Chandra; Jenkins, Clinton N.

    2005-01-01

    A new land cover database of Greater Mesoamerica has been prepared using moderate resolution imaging spectroradiometer (MODIS, 500 m resolution) satellite data. Daily surface reflectance MODIS data and a suite of ancillary data were used in preparing the database by employing a decision tree classification approach. The new land cover data are an improvement over traditional advanced very high resolution radiometer (AVHRR) based land cover data in terms of both spatial and thematic details. The dominant land cover type in Greater Mesoamerica is forest (39%), followed by shrubland (30%) and cropland (22%). Country analysis shows forest as the dominant land cover type in Belize (62%), Costa Rica (52%), Guatemala (53%), Honduras (56%), Nicaragua (53%), and Panama (48%), cropland as the dominant land cover type in El Salvador (60.5%), and shrubland as the dominant land cover type in Mexico (37%). A three-step approach was used to assess the quality of the classified land cover data: (i) qualitative assessment provided good insight in identifying and correcting gross errors; (ii) correlation analysis of MODIS- and Landsat-derived land cover data revealed strong positive association for forest (r2 = 0.88), shrubland (r2 = 0.75), and cropland (r2 = 0.97) but weak positive association for grassland (r2 = 0.26); and (iii) an error matrix generated using unseen training data provided an overall accuracy of 77.3% with a Kappa coefficient of 0.73608. Overall, MODIS 500 m data and the methodology used were found to be quite useful for broad-scale land cover mapping of Greater Mesoamerica.
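
    A small sketch of the third quality check, building an error (confusion) matrix and deriving overall accuracy and the Kappa coefficient; the labels here are synthetic rather than the MODIS/Landsat data.

```python
# Error matrix, overall accuracy, and Cohen's kappa from reference labels.
import numpy as np

def error_matrix_stats(reference, predicted, n_classes):
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for r, p in zip(reference, predicted):
        cm[r, p] += 1
    n = cm.sum()
    overall = np.trace(cm) / n
    # Chance agreement from the matrix margins (row and column totals).
    expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
    kappa = (overall - expected) / (1 - expected)
    return cm, overall, kappa

rng = np.random.default_rng(1)
ref = rng.integers(0, 4, 1000)                      # synthetic reference classes
pred = np.where(rng.random(1000) < 0.8, ref,        # ~80% agreement
                rng.integers(0, 4, 1000))
cm, acc, kappa = error_matrix_stats(ref, pred, 4)
print(f"overall accuracy {acc:.3f}, kappa {kappa:.3f}")
```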

  18. Development of software to compute operational costs of farm machinery - MAQCONTROL

    Directory of Open Access Journals (Sweden)

    Liane Piacentini

    2012-06-01

    Full Text Available The main objectives of rational mechanization are the selection and optimization of mechanized systems. An adequate purchase of agricultural machinery is not sufficient if its use is not controlled in operational and financial aspects. This work describes the development of software to estimate operational costs of agricultural machinery (MAQCONTROL), using the Borland Delphi development environment and the Firebird database. The operational costs were divided into fixed and variable. In fixed costs, the expenses with depreciation, interest, storage and insurance were estimated. In variable costs, the emphasis was given to the expenses on maintenance: lubricating oils, filters, tires, grease, fuel, small repairs, and parts replacement. Results have shown the software's efficiency for the proposed objectives. Therefore, the MAQCONTROL software proved to be an important tool in the rural administration process, as it reduces information costs and speeds up the accurate determination of the operational costs of agricultural machinery.

  19. A time and imaging cost analysis of low-risk ED observation patients: a conservative 64-section computed tomography coronary angiography "triple rule-out" compared to nuclear stress test strategy.

    Science.gov (United States)

    Takakuwa, Kevin M; Halpern, Ethan J; Shofer, Frances S

    2011-02-01

    The study aimed to examine time and imaging costs of 2 different imaging strategies for low-risk emergency department (ED) observation patients with acute chest pain or symptoms suggestive of acute coronary syndrome. We compared a "triple rule-out" (TRO) 64-section multidetector computed tomography protocol with nuclear stress testing. This was a prospective observational cohort study of consecutive ED patients who were enrolled in our chest pain observation protocol during a 16-month period. Our standard observation protocol included a minimum of 2 sets of cardiac enzymes at least 6 hours apart followed by a nuclear stress test. Once a week, observation patients were offered a TRO (to evaluate for coronary artery disease, thoracic dissection, and pulmonary embolus) multidetector computed tomography with the option of further stress testing for those patients found to have evidence of coronary artery disease. We analyzed 832 consecutive observation patients including 214 patients who underwent the TRO protocol. Mean total length of stay was 16.1 hours for TRO patients, 16.3 hours for TRO plus other imaging test, 22.6 hours for nuclear stress testing, 23.3 hours for nuclear stress testing plus other imaging tests, and 23.7 hours for nuclear stress testing plus TRO (P < .0001 for TRO and TRO + other test compared to stress test ± other test). Mean imaging times were 3.6, 4.4, 5.9, 7.5, and 6.6 hours, respectively (P < .05 for TRO and TRO + other test compared to stress test ± other test). Mean imaging costs were $1307 for TRO patients vs $945 for nuclear stress testing. Triple rule-out reduced total length of stay and imaging time but incurred higher imaging costs. A per-hospital analysis would be needed to determine if patient time savings justify the higher imaging costs. Copyright © 2011 Elsevier Inc. All rights reserved.
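
    The trade-off reported above (shorter stays but costlier imaging) reduces to a simple incremental ratio. A minimal sketch using only the means quoted in the abstract, with no variance or per-hospital adjustment, so it is illustrative rather than a substitute for the per-hospital analysis the authors call for:

        # Means quoted in the abstract.
        tro_cost, stress_cost = 1307.0, 945.0  # mean imaging cost, USD
        tro_los, stress_los = 16.1, 22.6       # mean total length of stay, hours

        extra_cost = tro_cost - stress_cost    # $362 more per TRO patient
        hours_saved = stress_los - tro_los     # 6.5 hours shorter stay

        print(f"TRO spends ${extra_cost:.0f} more to save {hours_saved:.1f} h, "
              f"about ${extra_cost / hours_saved:.0f} per patient-hour saved")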

  20. Greater physician involvement improves coding outcomes in endobronchial ultrasound-guided transbronchial needle aspiration procedures.

    Science.gov (United States)

    Pillai, Anilkumar; Medford, Andrew R L

    2013-01-01

    Correct coding is essential for accurate reimbursement for clinical activity. Published data confirm that significant aberrations in coding occur, leading to considerable financial inaccuracies especially in interventional procedures such as endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA). Previous data reported a 15% coding error for EBUS-TBNA in a U.K. service. We hypothesised that greater physician involvement with coders would reduce EBUS-TBNA coding errors and financial disparity. The study was done as a prospective cohort study in the tertiary EBUS-TBNA service in Bristol. 165 consecutive patients between October 2009 and March 2012 underwent EBUS-TBNA for evaluation of unexplained mediastinal adenopathy on computed tomography. The chief coder was prospectively electronically informed of all procedures and cross-checked on a prospective database and by Trust Informatics. Cost and coding analysis was performed using the 2010-2011 tariffs. All 165 procedures (100%) were coded correctly as verified by Trust Informatics. This compares favourably with the 14.4% coding inaccuracy rate for EBUS-TBNA in a previous U.K. prospective cohort study [odds ratio 201.1 (1.1-357.5), p = 0.006]. Projected income loss was GBP 40,000 per year in the previous study, compared to a GBP 492,195 income here with no coding-attributable loss in revenue. Greater physician engagement with coders prevents coding errors and financial losses which can be significant especially in interventional specialties. The intervention can be as cheap, quick and simple as a prospective email to the coding team with cross-checks by Trust Informatics and against a procedural database. We suggest that all specialties should engage more with their coders using such a simple intervention to prevent revenue losses. Copyright © 2013 S. Karger AG, Basel.
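
    The odds ratio quoted above compares 165/165 correct codes against the earlier cohort's 14.4% error rate; because the current cohort has a zero error cell, a continuity correction is required. A minimal sketch in Python; the earlier cohort's absolute counts are assumed here for illustration, so the output will not reproduce the published estimate:

        import math

        # 2x2 table: correct vs erroneous codes in the two cohorts.
        a, b = 165, 0  # current cohort: 165 correct, 0 errors
        c, d = 89, 15  # assumed earlier cohort (~14.4% error rate)

        # Haldane-Anscombe correction: add 0.5 to each cell because b == 0.
        a, b, c, d = (x + 0.5 for x in (a, b, c, d))

        odds_ratio = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
        lo, hi = (math.exp(math.log(odds_ratio) + z * se) for z in (-1.96, 1.96))
        print(f"OR = {odds_ratio:.1f} (95% CI {lo:.1f}-{hi:.1f})")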

  1. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  2. Cost accounting in ECN

    International Nuclear Information System (INIS)

    Wout, E.L.; Bever Donker, J.M. van.

    1979-01-01

    A five-year plan is made in which the available money is distributed to the expected programmes. This five-year plan is used as the basis for the working plan and budget for the next year. In the working plan all financial means are divided into kinds of costs, cost centres and cost units. Based on this working plan and the relevant budgets, the tariffs are calculated per working centre (cost centre). The tariffs are fixed for a whole year. Until now these tariffs have also been the basis for the cost unit accounting at the end of the year, together with the results of the time registration. The estimated workshop services for the working centres are included in the tariffs. For the allocation of overhead costs ECN uses dynamic keys. Depreciation costs with respect to instruments, investments etc. are determined per working centre according to a computer programme. Costs related to a cost unit are charged directly to that cost unit. This implies that project-related investments in instruments are treated as running costs. In the future we will try to refine the present cost accounting system further, in such a way that a cost centre is treated as a profit centre. Furthermore, we will try to analyse the tariff and calculation deviations and under- or over-occupation deviations afterwards (post-calculation). The information provided to management has a hierarchical structure: project information to the project leader, programme (compound projects) information to the programme coordinator, cost centre summaries to department heads, attention area (compound programme) information to the programme coordinator and managing director, ECN research (compound attention areas) information to general management, and information on kinds of costs to the relevant persons, e.g. surveys of expenditure for part-time personnel to the personnel bureau. The information is provided by the department of Finance and Administrative Organisation. The entire scope of cost accounting is the responsibility of the head of the department

  3. Cost Analysis of the STONE Randomized Trial: Can Health Care Costs be Reduced One Test at a Time?

    Science.gov (United States)

    Melnikow, Joy; Xing, Guibo; Cox, Ginger; Leigh, Paul; Mills, Lisa; Miglioretti, Diana L; Moghadassi, Michelle; Smith-Bindman, Rebecca

    2016-04-01

    Decreasing the use of high-cost tests may reduce health care costs. To compare costs of care for patients presenting to the emergency department (ED) with suspected kidney stones randomized to 1 of 3 initial imaging tests. Patients were randomized to point-of-care ultrasound (POC US, least costly), radiology ultrasound (RAD US), or computed tomography (CT, most costly). Subsequent testing and treatment were the choice of the treating physician. A total of 2759 patients at 15 EDs were randomized to POC US (n=908), RAD US (n=893), or CT (n=958). Mean age was 40.4 years; 51.8% were male. All medical care documented in the trial database in the 7 days following enrollment was abstracted and coded to estimate costs using national average 2012 Medicare reimbursements. Costs for initial ED care and total 7-day costs were compared using nonparametric bootstrap to account for clustering of patients within medical centers. Initial ED visit costs were modestly but significantly lower for patients assigned to RAD US: $423 ($411, $434) compared with patients assigned to CT: $448 ($438, $459). Total 7-day costs were not significantly different between groups: $1014 ($912, $1129) for POC US, $970 ($878, $1078) for RAD US, and $959 ($870, $1044) for CT. Hospital admissions contributed over 50% of total costs, though only 11% of patients were admitted. Mean total costs (and admission rates) varied substantially by site from $749 to $1239. Assignment to a less costly test had no impact on overall health care costs for ED patients. System-level interventions addressing variation in admission rates from the ED might have greater impact on costs.
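
    The bootstrap mentioned above resamples whole medical centers rather than individual patients, so that within-site clustering is respected. A minimal sketch on synthetic data (the structure and figures are assumptions, not the trial's analysis code):

        import numpy as np

        rng = np.random.default_rng(0)

        def cluster_bootstrap_ci(sites_a, sites_b, n_boot=2000):
            """95% CI for mean(a) - mean(b), resampling whole sites (clusters)."""
            diffs = []
            for _ in range(n_boot):
                a = np.concatenate([sites_a[i] for i in
                                    rng.integers(0, len(sites_a), len(sites_a))])
                b = np.concatenate([sites_b[i] for i in
                                    rng.integers(0, len(sites_b), len(sites_b))])
                diffs.append(a.mean() - b.mean())
            return np.percentile(diffs, [2.5, 97.5])

        # Synthetic per-patient 7-day costs grouped by site for two imaging arms.
        pocus = [rng.normal(1014, 300, size=60) for _ in range(15)]
        ct = [rng.normal(959, 300, size=60) for _ in range(15)]
        print(cluster_bootstrap_ci(pocus, ct))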

  4. Can Broader Diffusion of Value-Based Insurance Design Increase Benefits from US Health Care without Increasing Costs? Evidence from a Computer Simulation Model

    OpenAIRE

    Scott Braithwaite, R.; Omokaro, Cynthia; Justice, Amy C.; Nucifora, Kimberly; Roberts, Mark S.

    2010-01-01

    Editors' Summary Background More money is spent per person on health care in the US than in any other country. US health care expenditure accounts for 16.2% of the gross domestic product and this figure is rising. Indeed, the increase in health care costs is outstripping the economy's growth rate. Consequently, US policy makers and providers of health insurance - health care in the US is largely provided by the private sector and is paid for through private health insurance or through governmen...

  5. Combining computational modelling with radioisotope technology for a more cost- effective and time-efficient method of solving industrial and medical diagnostic problems

    International Nuclear Information System (INIS)

    Tu, J.Y.; Easey, J.F.; Burch, W.M.

    1997-01-01

    In this paper, some work on computational modelling for industrial operations and processes will be presented, for example, the modelling of fly-ash flow and the associated prediction of erosion in power utility boilers. The introduction and use of new formulations of encapsulated radioisotopes, currently being researched at ANSTO, will open up further possibilities for the utilisation of radiotracer applications for a wider range of validation work, not only in industrial but also in medical investigations. Applications of the developed models to solving industrial problems will also be discussed in the paper

  6. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  7. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future

  8. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  9. What does an MRI scan cost?

    Science.gov (United States)

    Young, David W

    2015-11-01

    Historically, hospital departments have computed the costs of individual tests or procedures using the ratio of cost to charges (RCC) method, which can produce inaccurate results. To determine a more accurate cost of a test or procedure, the activity-based costing (ABC) method must be used. Accurate cost calculations will ensure reliable information about the profitability of a hospital's DRGs.
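
    The gap between the two costing methods is easy to show numerically. A minimal sketch with invented figures (not actual MRI economics): RCC scales the charge by a department-wide cost-to-charge ratio, while ABC sums the resources the scan actually consumes.

        # Ratio of cost to charges (RCC): scale the charge by a blanket ratio.
        dept_cost, dept_charges = 2_000_000.0, 5_000_000.0
        mri_charge = 1_500.0
        rcc_cost = mri_charge * (dept_cost / dept_charges)  # $600

        # Activity-based costing (ABC): sum what the scan actually consumes.
        abc_cost = (
            45 / 60 * 120.0   # scanner time at $120/h (depreciation, service)
            + 30 / 60 * 80.0  # technologist time at $80/h
            + 25.0            # contrast agent and consumables
            + 40.0            # allocated reading/reporting time
        )
        print(f"RCC: ${rcc_cost:.0f}  vs  ABC: ${abc_cost:.0f}")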

  10. 24 CFR 908.108 - Cost.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Cost. 908.108 Section 908.108..., RENTAL VOUCHER, AND MODERATE REHABILITATION PROGRAMS § 908.108 Cost. (a) General. The costs of the... computer hardware or software, or both, the cost of contracting for those services, or the cost of...

  11. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  12. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  13. Volunteer Computing for Science Gateways

    OpenAIRE

    Anderson, David

    2017-01-01

    This poster presents information about volunteer computing for science gateways that offer high-throughput computing services. Volunteer computing can be used to obtain computing power; it increases the visibility of the gateway to the general public as well as its computing capacity, at little cost.

  14. Metabolic cost of running is greater on a treadmill with a stiffer running platform.

    Science.gov (United States)

    Smith, James A H; McKerrow, Alexander D; Kohn, Tertius A

    2017-08-01

    Exercise testing on motorised treadmills provides valuable information about running performance and metabolism; however, the impact of treadmill type on these tests has not been investigated. This study compared the energy demand of running on two laboratory treadmills: an HP Cosmos (C) and a Quinton (Q) model, with the latter having a 4.5 times stiffer running platform. Twelve experienced runners ran identical bouts on these treadmills at a range of four submaximal velocities (reported data are for the velocity that approximated 75-81% VO₂max). The stiffer treadmill elicited higher oxygen consumption (C: 46.7 ± 3.8; Q: 50.1 ± 4.3 ml·kg⁻¹·min⁻¹), energy expenditure (C: 16.0 ± 2.5; Q: 17.7 ± 2.9 kcal·min⁻¹), carbohydrate oxidation (C: 9.6 ± 3.1; Q: 13.0 ± 3.9 kcal·min⁻¹), heart rate (C: 155 ± 16; Q: 163 ± 16 beats·min⁻¹) and rating of perceived exertion (C: 13.8 ± 1.2; Q: 14.7 ± 1.2), but lower fat oxidation (C: 6.4 ± 2.3; Q: 4.6 ± 2.5 kcal·min⁻¹) (all analysis-of-variance treadmill comparisons P < .05), indicating a greater metabolic cost of running depending on the running platform stiffness.

  15. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950s, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically is probably the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  16. Upgrade and benchmarking of the NIFS physics-engineering-cost code

    International Nuclear Information System (INIS)

    Dolan, T.J.; Yamazaki, K.

    2004-07-01

    The NIFS Physics-Engineering-Cost (PEC) code for helical and tokamak fusion reactors is upgraded by adding data from three blanket-shield designs, a new cost section based on the ARIES cost schedule, more recent unit costs, and improved algorithms for various computations. The PEC code is also benchmarked by modeling the ARIES-AT (advanced technology) tokamak and the ARIES-SPPS (stellarator power plant system). The PEC code succeeds in predicting many of the pertinent plasma parameters and reactor component masses within about 10%. There are cost differences greater than 10% for some fusion power core components, which may be attributed to differences of unit costs used by the codes. The COEs estimated by the PEC code differ from the COEs of the ARIES-AT and ARIES-SPPS studies by 5%. (author)

  17. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  18. Greater commitment to the domestic violence training is required.

    Science.gov (United States)

    Leppäkoski, Tuija Helena; Flinck, Aune; Paavilainen, Eija

    2015-05-01

    Domestic violence (DV) is a major public health problem with high health and social costs. A solution to this multi-faceted problem requires that various help providers work together in an effective and optimal manner when dealing with different parties of DV. The objective of our research and development project (2008-2013) was to improve the preparedness of social and healthcare professionals to manage DV. This article focuses on the evaluation of interprofessional education (IPE) designed to provide knowledge and skills for identifying and intervening in DV and to improve collaboration among social and health care professionals and other help providers at the local and regional level. The evaluation was carried out internally; evaluation data were collected from the participants orally and in written form. The participants were satisfied with the content of the IPE programme and the teaching methods used. However, participation in the training sessions could have been more active, and some of the people who had enrolled for the trainings could not attend all of them. IPE is a valuable way to improve intervention in DV, but greater commitment to the training is required not only from the participants and their superiors but also from trustees.

  19. Computed tomography of the heart using thallium-201 in children

    International Nuclear Information System (INIS)

    Treves, S.; Hill, T.C.; VanPraagh, R.; Holman, B.L.

    1979-01-01

    Thallium-201 emission computed tomography (ECT) was performed in 3 pediatric patients in whom conventional scintigraphy was normal but there was a strong clinical suspicion of myocardial disease. Abnormalities in the distribution of myocardial perfusion appeared sharply delineated with ECT even though conventional gamma camera scintigraphy was normal. Single-photon ECT provides a three-dimensional reconstruction, which results in greater enhancement since activity in overlying structures does not interfere. Its widespread use is limited only by the cost of the imaging device. (U.K.)

  20. Tapping Transaction Costs to Forecast Acquisition Cost Breaches

    Science.gov (United States)

    2016-01-01

    experience a cost breach. In our medical example, we could use survival analysis to identify risk factors, such as obesity, that might indicate a greater... exogenous variables on the probability of a dichotomous outcome, such as whether or not a cost breach occurs in any given program year. Logit is
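
    As the fragment above indicates, a logit model estimates how exogenous program variables shift the probability of a binary cost-breach outcome. A minimal sketch on synthetic data; the predictor names are invented, since the report's actual variables are not listed here:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 500

        # Invented program-year predictors.
        schedule_slip = rng.normal(0, 1, n)    # standardized schedule slip
        contract_changes = rng.poisson(2, n)   # number of contract changes

        # Made-up ground truth: breaches likelier with slip and changes.
        logit = -1.5 + 0.9 * schedule_slip + 0.4 * contract_changes
        breach = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([schedule_slip, contract_changes])
        model = LogisticRegression().fit(X, breach)
        print("coefficients:", model.coef_, "intercept:", model.intercept_)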

  1. A new model predictive control algorithm by reducing the computing time of cost function minimization for NPC inverter in three-phase power grids.

    Science.gov (United States)

    Taheri, Asghar; Zhalebaghi, Mohammad Hadi

    2017-11-01

    This paper presents a new control strategy based on finite-control-set model-predictive control (FCS-MPC) for neutral-point-clamped (NPC) three-level converters. Advantages such as fast dynamic response, easy inclusion of constraints and a simple control loop make the FCS-MPC method attractive as a switching strategy for converters. However, the large amount of required calculation hinders the widespread use of this method. To resolve this problem, this paper presents a modified method that effectively reduces the computational load compared with the conventional FCS-MPC method without affecting control performance. The proposed method can be used for exchanging power between the electrical grid and DC resources by providing active and reactive power compensation. Experiments on a three-level converter in three modes - power factor correction (PFC), inductive compensation and capacitive compensation - verify good and comparable performance. The results have been simulated using MATLAB/SIMULINK software. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
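
    FCS-MPC evaluates every admissible switching state at each sampling instant and applies the one that minimizes a cost function, which is where the computational load arises (27 states for a three-phase, three-level NPC converter). A minimal single-phase sketch with an R-L grid model and a pure current-tracking cost; the parameters are illustrative and this is the conventional enumeration, not the paper's reduced algorithm:

        import numpy as np

        # Output voltage levels of one NPC phase leg (in units of Vdc/2).
        levels = np.array([-1.0, 0.0, 1.0])

        def predict_current(i_now, v_out, v_grid, L=10e-3, R=0.1, Ts=100e-6):
            """One-step Euler prediction of an R-L grid-tied phase current."""
            return i_now + Ts / L * (v_out - v_grid - R * i_now)

        def fcs_mpc_step(i_now, i_ref, v_grid, vdc=700.0):
            """Enumerate the finite control set; return the least-cost level."""
            costs = [(i_ref - predict_current(i_now, u * vdc / 2, v_grid)) ** 2
                     for u in levels]
            return levels[int(np.argmin(costs))]

        print(fcs_mpc_step(i_now=2.0, i_ref=5.0, v_grid=150.0))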

  2. Optimizing Data Centre Energy and Environmental Costs

    Science.gov (United States)

    Aikema, David Hendrik

    Data centres use an estimated 2% of US electrical power, which accounts for much of their total cost of ownership. This consumption continues to grow, further straining power grids attempting to integrate more renewable energy. This dissertation focuses on assessing and reducing data centre environmental and financial costs. Emissions of projects undertaken to lower the data centre environmental footprints can be assessed and the emission reduction projects compared using an ISO-14064-2-compliant greenhouse gas reduction protocol outlined herein. I was closely involved with the development of the protocol. Full lifecycle analysis and verifying that projects exceed business-as-usual expectations are addressed, and a test project is described. Consuming power when it is low cost or when renewable energy is available can be used to reduce the financial and environmental costs of computing. Adaptation based on the power price showed 10-50% potential savings in typical cases, and local renewable energy use could be increased by 10-80%. Allowing a fraction of high-priority tasks to proceed unimpeded still allows significant savings. Power grid operators use mechanisms called ancillary services to address variation and system failures, paying organizations to alter power consumption on request. By bidding to offer these services, data centres may be able to lower their energy costs while reducing their environmental impact. If providing contingency reserves which require only infrequent action, savings of up to 12% were seen in simulations. Greater power cost savings are possible for those ceding more control to the power grid operator. Coordinating multiple data centres adds overhead, and altering at which data centre requests are processed based on changes in the financial or environmental costs of power is likely to increase this overhead. Tests of virtual machine migrations showed that in some cases there was no visible increase in power use while in others power use

  3. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images can be viewed on a computer monitor, printed on film or transferred to a CD or DVD. CT images of internal organs, bones, soft tissue and blood vessels provide greater detail ...

  4. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images can be viewed on a computer monitor, printed on film or transferred to a CD or DVD. CT images of internal organs, bones, soft tissue and blood vessels provide greater detail ...

  5. Comparing statistical tests for detecting soil contamination greater than background

    International Nuclear Information System (INIS)

    Hardin, J.W.; Gilbert, R.O.

    1993-12-01

    The Washington State Department of Ecology (WSDE) recently issued a report that provides guidance on statistical issues regarding investigation and cleanup of soil and groundwater contamination under the Model Toxics Control Act Cleanup Regulation. Included in the report are procedures for determining a background-based cleanup standard and for conducting a 3-step statistical test procedure to decide if a site is contaminated greater than the background standard. The guidance specifies that the State test should only be used if the background and site data are lognormally distributed. The WSDE guidance allows alternative tests to be used on a site-specific basis if prior approval is obtained from WSDE. This report presents the results of a Monte Carlo computer simulation study conducted to evaluate the performance of the State test and several alternative tests for various contamination scenarios (background and site data distributions). The primary test performance criteria are (1) the probability the test will indicate that a contaminated site is indeed contaminated, and (2) the probability that the test will indicate an uncontaminated site is contaminated. The simulation study was conducted assuming the background concentrations were from lognormal or Weibull distributions. The site data were drawn from distributions selected to represent various contamination scenarios. The statistical tests studied are the State test, t test, Satterthwaite's t test, five distribution-free tests, and several tandem tests (wherein two or more tests are conducted using the same data set)
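
    The two performance criteria above (power on a contaminated site, false-positive rate on a clean one) are estimated by exactly this kind of Monte Carlo loop. A minimal sketch for a one-sided Welch (Satterthwaite) t test under lognormal backgrounds; the parameters are invented, not the WSDE scenarios:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        def detection_rate(site_shift, n_sims=5000, n=20, alpha=0.05):
            """Fraction of simulations in which the test flags the site."""
            hits = 0
            for _ in range(n_sims):
                background = rng.lognormal(1.0, 0.5, size=n)
                site = rng.lognormal(1.0 + site_shift, 0.5, size=n)
                _, p = stats.ttest_ind(site, background, equal_var=False,
                                       alternative='greater')
                hits += p < alpha
            return hits / n_sims

        print("false positives (clean site):", detection_rate(0.0))
        print("power (contaminated site):  ", detection_rate(0.7))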

  6. Designer's unified cost model

    Science.gov (United States)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state of the art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a data base and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress is presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  7. Designers' unified cost model

    Science.gov (United States)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  8. Cost Behavior

    DEFF Research Database (Denmark)

    Hoffmann, Kira

    The objective of this dissertation is to investigate determinants and consequences of asymmetric cost behavior. Asymmetric cost behavior arises if the change in costs is different for increases in activity compared to equivalent decreases in activity. In this case, costs are termed “sticky” if the change is less when activity falls than when activity rises, whereas costs are termed “anti-sticky” if the change is more when activity falls than when activity rises. Understanding such cost behavior is especially relevant for decision-makers and financial analysts that rely on accurate cost information to facilitate resource planning and earnings forecasting. As such, this dissertation relates to the topic of firm profitability and the interpretation of cost variability. The dissertation consists of three parts that are written in the form of separate academic papers. The following section briefly summarizes

  9. Computational methods in drug discovery

    Directory of Open Access Journals (Sweden)

    Sumudu P. Leelananda

    2016-12-01

    Full Text Available The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power, has made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. In silico tools are more important than ever before and have advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed.

  10. Automatic Computer Mapping of Terrain

    Science.gov (United States)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations, using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.

  11. Fog Computing and Edge Computing Architectures for Processing Data From Diabetes Devices Connected to the Medical Internet of Things.

    Science.gov (United States)

    Klonoff, David C

    2017-07-01

    The Internet of Things (IoT) is generating an immense volume of data. With cloud computing, medical sensor and actuator data can be stored and analyzed remotely by distributed servers. The results can then be delivered via the Internet. The number of devices in IoT includes such wireless diabetes devices as blood glucose monitors, continuous glucose monitors, insulin pens, insulin pumps, and closed-loop systems. The cloud model for data storage and analysis is increasingly unable to process the data avalanche, and processing is being pushed out to the edge of the network closer to where the data-generating devices are. Fog computing and edge computing are two architectures for data handling that can offload data from the cloud, process it nearby the patient, and transmit information machine-to-machine or machine-to-human in milliseconds or seconds. Sensor data can be processed near the sensing and actuating devices with fog computing (with local nodes) and with edge computing (within the sensing devices). Compared to cloud computing, fog computing and edge computing offer five advantages: (1) greater data transmission speed, (2) less dependence on limited bandwidths, (3) greater privacy and security, (4) greater control over data generated in foreign countries where laws may limit use or permit unwanted governmental access, and (5) lower costs because more sensor-derived data are used locally and less data are transmitted remotely. Connected diabetes devices almost all use fog computing or edge computing because diabetes patients require a very rapid response to sensor input and cannot tolerate delays for cloud computing.

  12. How to Bill Your Computer Services.

    Science.gov (United States)

    Dooskin, Herbert P.

    1981-01-01

    A computer facility billing procedure should be designed so that the full costs of a computer center operation are equitably charged to the users. Design criteria, costing methods, and management's role are discussed. (Author/MLF)

  13. Secure equality and greater-than tests with sublinear online complexity

    DEFF Research Database (Denmark)

    Lipmaa, Helger; Toft, Tomas

    2013-01-01

    Secure multiparty computation (MPC) allows multiple parties to evaluate functions without disclosing the private inputs. Secure comparisons (testing equality and greater-than) are important primitives required by many MPC applications. We propose two equality tests for ℓ-bit values with O(1) online...

  14. A new probabilistic seismic hazard assessment for greater Tokyo

    Science.gov (United States)

    Stein, R.S.; Toda, S.; Parsons, T.; Grunewald, E.; Blong, R.; Sparks, S.; Shah, H.; Kennedy, J.

    2006-01-01

    Tokyo and its outlying cities are home to one-quarter of Japan's 127 million people. Highly destructive earthquakes struck the capital in 1703, 1855 and 1923, the last of which took 105 000 lives. Fuelled by greater Tokyo's rich seismological record, but challenged by its magnificent complexity, our joint Japanese-US group carried out a new study of the capital's earthquake hazards. We used the prehistoric record of great earthquakes preserved by uplifted marine terraces and tsunami deposits (17 M≥8 shocks in the past 7000 years), a newly digitized dataset of historical shaking (10 000 observations in the past 400 years), the dense modern seismic network (300 000 earthquakes in the past 30 years), and Japan's GeoNet array (150 GPS vectors in the past 10 years) to reinterpret the tectonic structure, identify active faults and their slip rates and estimate their earthquake frequency. We propose that a dislodged fragment of the Pacific plate is jammed between the Pacific, Philippine Sea and Eurasian plates beneath the Kanto plain on which Tokyo sits. We suggest that the Kanto fragment controls much of Tokyo's seismic behaviour for large earthquakes, including the damaging 1855 M≈7.3 Ansei-Edo shock. On the basis of the frequency of earthquakes beneath greater Tokyo, events with magnitude and location similar to the M≈7.3 Ansei-Edo event have a ca 20% likelihood in an average 30 year period. In contrast, our renewal (time-dependent) probability for the great M≥7.9 plate boundary shocks such as struck in 1923 and 1703 is 0.5% for the next 30 years, with a time-averaged 30 year probability of ca 10%. The resulting net likelihood for severe shaking (ca 0.9g peak ground acceleration (PGA)) in Tokyo, Kawasaki and Yokohama for the next 30 years is ca 30%. The long historical record in Kanto also affords a rare opportunity to calculate the probability of shaking in an alternative manner exclusively from intensity observations. This approach permits robust estimates

  15. Drilling cost analysis

    International Nuclear Information System (INIS)

    Anand, A.B.

    1992-01-01

    Drilling assumes greater importance in present day uranium exploration which emphasizes to explore more areas on the basis of conceptual model than merely on surface anomalies. But drilling is as costly as it is important and consumes a major share (50% to 60%) of the exploration budget. As such the cost of drilling has great bearing on the exploration strategy as well as on the overall cost of the project. Therefore, understanding the cost analysis is very much important when planning or intensifying an exploration programme. This not only helps in controlling the current operations but also in planning the budgetary provisions for future operations. Also, if the work is entrusted to a private party, knowledge of in-house cost analysis helps in fixing the rates of drilling in different formations and areas to be drilled. Under this topic, various factors that contribute to the cost of drilling per meter as well as ways to minimize the drilling cost for better economic evaluation of mineral deposits are discussed. (author)

  16. Distribution costs -- the cost of local delivery

    International Nuclear Information System (INIS)

    Winger, N.; Zarnett, P.; Carr, J.

    2000-01-01

    Most of the power transmission system in the province of Ontario is owned and operated as a regulated monopoly by Ontario Hydro Services Company (OHSC). Local distribution systems deliver to end-users from bulk supply points within a service territory. OHSC distributes to approximately one million, mostly rural customers, while the approximately 250 municipal utilities together serve about two million, mostly urban customers. Under the Energy Competition Act of 1998 local distribution companies will face some new challenges, including unbundled billing systems, a broader range of distribution costs, and increased costs made up of corporate taxes or payments in lieu of taxes and added costs for regulatory affairs. The consultants provide a detailed discussion of the components of distribution costs, the three components of the typical budget process (capital expenditures (CAPEX), operating and maintenance (O and M), and administration and corporate (GA and C)), a summary of some typical distribution costs in Ontario, and the estimated impacts of Energy Competition Act (ECA) compliance on charges and rates. Various mitigation strategies are also reviewed. Among these are joint ventures by local distribution companies to reduce ECA compliance costs, re-examination of controllable costs, temporary reduction of the allowable return on equity (ROE) by 50 per cent, and/or reducing the competitive transition charge (CTC). It is estimated that either one of these two reductions could eliminate the full amount of the five to seven per cent uplift in delivered energy service costs. The conclusion of the consultants is that local distribution delivery charges will make up a greater proportion of end-user cost in the future than they have in the past. An increase to customers of about five per cent is expected when the competitive electricity market opens and unbundled billing begins. The cost increase could be mitigated by a combination of actions that would be needed for about

  17. The cost-effectiveness of methanol for reducing motor vehicle emissions and urban ozone

    International Nuclear Information System (INIS)

    Krupnick, A.J.; Walls, M.A.

    1992-01-01

    This article analyzes the costs and emissions characteristics of methanol vehicles. The cost-effectiveness of methanol - the cost per ton of reactive hydrocarbon emissions reduced - is calculated and compared to the cost-effectiveness of other hydrocarbon reduction strategies. Methanol is found to cost from $33,000 to nearly $60,000 per ton, while several other options are available for under $10,000 per ton. The cost per part-per-million reduction in peak ambient ozone levels is also computed for two cities, Houston and Philadelphia. Despite the greater improvement in ozone in Philadelphia than Houston, methanol is found to be more cost-effective in Houston. This result occurs because Houston's distribution and marketing costs are lower than Philadelphia's. The costs in both cities, however, are far higher than estimates of the benefits from acute health improvements. Finally, the reduction in ozone exposure in Los Angeles is estimated and the costs of the reduction compared with an estimate of acute health benefits. Again, the benefits fall far short of the costs. 51 refs., 5 tabs
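
    The dollars-per-ton figures above follow from dividing incremental control cost by the tons of reactive hydrocarbons avoided. A minimal sketch with invented inputs, chosen only so the result lands in the range the article reports:

        # Cost-effectiveness in USD per ton of reactive hydrocarbons (HC)
        # reduced. Both inputs are invented for illustration.
        extra_cost_per_vehicle = 100.0  # annualized incremental cost, USD/yr
        hc_cut_per_vehicle = 0.002      # tons of reactive HC avoided per yr

        cost_per_ton = extra_cost_per_vehicle / hc_cut_per_vehicle
        print(f"${cost_per_ton:,.0f} per ton of HC reduced")  # $50,000 here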

  18. Cost analysis of living donor kidney transplantation in China: a single-center experience.

    Science.gov (United States)

    Zhao, Wenyu; Zhang, Lei; Han, Shu; Zhu, Youhua; Wang, Liming; Zhou, Meisheng; Zeng, Li

    2012-01-01

    Kidney transplantation is the most cost-effective option for the treatment of end-stage renal disease, but the financial aspects of kidney transplantation have not yet been fully investigated. The purpose of this study was to determine the hospital cost of living donor kidney transplantation in China and to identify factors associated with the high cost. Demographic and clinical data of 103 consecutive patients who underwent living donor kidney transplantation from January 2007 to January 2011 at our center were reviewed, and the detailed hospital cost of the initial admission for kidney transplantation was analyzed. A stepwise multiple regression analysis was performed to determine predictors affecting the total hospital cost. The median total hospital cost was US $10,531, of which 69.2% was for medications, 13.2% for surgical procedures, 11.4% for paraclinical services, 3.7% for accommodations, 0.5% for nursing care, and 2.0% for other miscellaneous medical services. A multivariate stepwise logistic regression model for overall cost of transplantation revealed that the length of hospital stay, induction therapy, steroid-resistant rejection, maintenance therapy, infection status and body weight were independent predictors affecting the total hospitalization cost. Although the cost of living donor kidney transplantation in China is much lower than that in developed countries, it is a heavy burden for both the government and the patients. As medications formed the greater proportion of the total hospitalization cost, efforts to reduce the cost of drugs should be addressed.

  19. Teacher Costs

    OpenAIRE

    DINIS MOTA DA COSTA PATRICIA; DE SOUSA LOBO BORGES DE ARAUJO LUISA

    2015-01-01

    The purpose of this technical brief is to assess current methodologies for the collection and calculation of teacher costs in European Union (EU) Member States in view of improving data series and indicators related to teacher salaries and teacher costs. To this end, CRELL compares the Eurydice collection on teacher salaries with the similar Organisation for Economic Co-operation and Development (OECD) data collection and calculates teacher costs based on the methodology established by Statis...

  20. Modelling User-Costs in Life Cycle Cost-Benefit (LCCB) analysis

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    2008-01-01

    The importance of including user costs in Life-Cycle Cost-Benefit analysis of structures is discussed in this paper. This is especially important for bridges. Repair and/or failure of a bridge will usually result in user costs greater than the repair or replacement costs of the bridge...
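
    A life-cycle comparison of the kind argued for here discounts both agency costs (repair or replacement) and user costs (delay and detours during the work) to present value. A minimal sketch with invented bridge numbers, illustrating how user costs can dominate:

        def present_value(cashflows, rate=0.04):
            """Discount {year: cost} cashflows to present value."""
            return sum(c / (1 + rate) ** t for t, c in cashflows.items())

        # Invented example: a repair in year 15 closes one lane for 60 days.
        agency = {15: 2_000_000}          # repair cost, USD
        users = {15: 10_000 * 60 * 8.0}   # 10k vehicles/day * 60 days * $8 delay

        print(f"agency PV: ${present_value(agency):,.0f}")
        print(f"user PV:   ${present_value(users):,.0f}")  # users cost more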

  1. Length and anatomic routes of the greater palatine canal as observed by cone beam computed tomography

    Directory of Open Access Journals (Sweden)

    Mahnaz Sheikhi

    2013-01-01

    Conclusions: The mean CL was significantly different according to sex and side. The mean distance from the IOF to CMP was significantly different according to sex. On comparing the mean distance from the IOF to the CMP with the CL, no significant difference was observed. Therefore, the mean distance from the IOF to CMP may be a reliable clinical index.

  2. Higher-order techniques in computational electromagnetics

    CERN Document Server

    Graglia, Roberto D

    2016-01-01

    Higher-Order Techniques in Computational Electromagnetics explains 'high-order' techniques that can significantly improve the accuracy, reduce the computational cost, and increase the reliability of computational techniques for high-frequency electromagnetics, such as antennas, microwave devices and radar scattering applications.

  3. Rehabilitation costs

    Energy Technology Data Exchange (ETDEWEB)

    Kubo, Arthur S [BDM Corp., VA (United States); [Bikini Atoll Rehabilitation Committee, Berkeley, CA (United States)

    1986-07-01

    The costs of radioactivity contamination control and other matters relating to the resettlement of Bikini Atoll were reviewed for the Bikini Atoll Rehabilitation Committee by a panel of engineers which met in Berkeley, California on January 22-24, 1986. This Appendix presents the cost estimates.

  4. Rehabilitation costs

    International Nuclear Information System (INIS)

    Kubo, Arthur S.

    1986-01-01

    The costs of radioactivity contamination control and other matters relating to the resettlement of Bikini Atoll were reviewed for the Bikini Atoll Rehabilitation Committee by a panel of engineers which met in Berkeley, California on January 22-24, 1986. This Appendix presents the cost estimates

  5. Cost considerations

    NARCIS (Netherlands)

    Michiel Ras; Debbie Verbeek-Oudijk; Evelien Eggink

    2013-01-01

    Original title: Lasten onder de loep The Dutch government spends almost 7 billion euros each year on care for people with intellectual disabilities, and these costs are rising steadily. This report analyses what underlies the increase in costs that occurred between 2007 and 2011. Was

  6. Technical concept for a greater-confinement-disposal test facility

    International Nuclear Information System (INIS)

    Hunter, P.H.

    1982-01-01

    Greater confinement disposal (GCD) has been defined by the National Low-Level Waste Program as the disposal of low-level waste in such a manner as to provide greater containment of radiation, reduce potential for migration or dispersion of radionuclides, and provide greater protection from inadvertent human and biological intrusions in order to protect the public health and safety. This paper discusses: the need for GCD; the definition of GCD; advantages and disadvantages of GCD; relative dose impacts of GCD versus shallow land disposal; types of waste compatible with GCD; objectives of the GCD borehole demonstration test; engineering and technical issues; and factors affecting the performance of the greater confinement disposal facility

  7. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  8. Troubleshooting Costs

    Science.gov (United States)

    Kornacki, Jeffrey L.

    Seventy-six million cases of foodborne disease occur each year in the United States alone. Medical and lost-productivity costs of the most common pathogens are estimated to be $5.6-9.4 billion. Product recalls, whether from foodborne illness or spoilage, result in added costs to manufacturers in a variety of ways. These may include expenses associated with lawsuits from real or allegedly stricken individuals and lawsuits from shorted customers. Other costs are associated with finding and eliminating the source of the contamination: non-productive time while lines are shut down, additional non-routine testing, consultant fees, the time and personnel required to overhaul the entire food safety system, lost market share to competitors, and the cost of redesigning the factory and redesigning or acquiring more hygienic equipment. The cost associated with an effective quality assurance plan is well worth the effort to prevent the situations described.

  9. Cost comparisons

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    How much does the LHC cost? And how much does this represent in other currencies? Below we present a table showing some comparisons with the cost of other projects. Looking at the figures, you will see that the cost of the LHC can be likened to that of three skyscrapers, or two seasons of Formula 1 racing! One year's budget of a single large F1 team is comparable to the entire materials cost of the ATLAS or CMS experiments. Please note that all the figures are rounded for ease of reading.

                                         CHF          €            $
    LHC                                  4.6 billion  3 billion    4 billion
    Space Shuttle Endeavour (NASA)       1.9 billion  1.3 billion  1.7 billion
    Hubble Space Telescope (cost at launch – NASA/...

  10. [Operating cost analysis of anaesthesia: activity based costing (ABC analysis)].

    Science.gov (United States)

    Majstorović, Branislava M; Kastratović, Dragana A; Vučović, Dragan S; Milaković, Branko D; Miličić, Biljana R

    2011-01-01

    Costs of anaesthesiology represent defined measures for establishing a precise profile of the expenditure of surgical treatment, which is important for planning healthcare activities, prices and budgets. In order to determine the actual value of anaesthesiological services, we started with an activity-based costing (ABC) analysis. Retrospectively, in 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supplies and other costs: analyses and equipment) of the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anaesthetized patients of both sexes and all ages. We compared direct costs with the direct expenditure per "cost object (service or unit)" of the Republican Healthcare Insurance, using the summary data of the Departments of Anaesthesia documented in the database of the Clinical Centre of Serbia. Numerical data were estimated and analyzed with Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared the direct costs with the unit costs of anaesthesiological services from the cost list of the Republican Healthcare Insurance. Of the direct costs, 40% were spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Healthcare Insurance. Costs of anaesthesia increase the cost of a patient's surgical treatment by about 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. The fixed elements of direct costs provide the possibility of rationalizing resource use in anaesthesia.

  11. Tracking and computing

    International Nuclear Information System (INIS)

    Niederer, J.

    1983-01-01

    This note outlines several ways in which large scale simulation computing and programming support may be provided to the SSC design community. One aspect of the problem is getting supercomputer power without the high cost and long lead times of large scale institutional computing. Another aspect is the blending of modern programming practices with more conventional accelerator design programs in ways that do not also swamp designers with the details of complicated computer technology

  12. Computational Fluid Dynamics of Whole-Body Aircraft

    Science.gov (United States)

    Agarwal, Ramesh

    1999-01-01

    The current state of the art in computational aerodynamics for whole-body aircraft flowfield simulations is described. Recent advances in geometry modeling, surface and volume grid generation, and flow simulation algorithms have led to accurate flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics has emerged as a crucial enabling technology for the design and development of flight vehicles. Examples illustrating the current capability for the prediction of transport and fighter aircraft flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  13. Advanced computer graphics techniques as applied to the nuclear industry

    International Nuclear Information System (INIS)

    Thomas, J.J.; Koontz, A.S.

    1985-08-01

    Computer graphics is a rapidly advancing technological area in computer science. This is being motivated by increased hardware capability coupled with reduced hardware costs. This paper will cover six topics in computer graphics, with examples forecasting how each of these capabilities could be used in the nuclear industry. These topics are: (1) Image Realism with Surfaces and Transparency; (2) Computer Graphics Motion; (3) Graphics Resolution Issues and Examples; (4) Iconic Interaction; (5) Graphic Workstations; and (6) Data Fusion - illustrating data coming from numerous sources, for display through high dimensional, greater than 3-D, graphics. All topics will be discussed using extensive examples with slides, video tapes, and movies. Illustrations have been omitted from the paper due to the complexity of color reproduction. 11 refs., 2 figs., 3 tabs

  14. Low-cost positron computed tomography

    International Nuclear Information System (INIS)

    Ott, R.J.; Batty, V.; Bateman, T.E.; Clack, R.; Flower, M.A.; Leach, M.O.; Marsden, P.; Webb, S.; McCready, V.R.

    1986-01-01

    After briefly describing the technique of positron emission tomography (PET) and the types of detectors used, the operational experience of a recently developed multi-wire proportional chamber positron camera, which can be used to provide images using radionuclides such as 68Ga, 124I, 82Rb, 55Co, 18F and 11C, is discussed. Clinical applications included PET imaging of the thyroid and the brain, and possible future applications include PET imaging of the liver and tumour localization using antigen-specific monoclonal antibodies. Future developments to improve the sensitivity and spatial resolution of the detectors used in PET are discussed. (U.K.)

  15. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has changed is the huge increase in computer speed and ease of use, along with the corresponding orders-of-magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence databases. While these are important applications, they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  16. Operating dedicated data centers – is it cost-effective?

    International Nuclear Information System (INIS)

    Ernst, M; Hogue, R; Hollowell, C; Strecker-Kellog, W; Wong, A; Zaytsev, A

    2014-01-01

    The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.

  17. Operating Dedicated Data Centers - Is It Cost-Effective?

    Science.gov (United States)

    Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.

  18. Assessing Human Impacts on the Greater Akaki River, Ethiopia ...

    African Journals Online (AJOL)

    We assessed the impacts of human activities on the Greater Akaki River using physicochemical parameters and macroinvertebrate metrics. Physicochemical samples and macroinvertebrates were collected bimonthly from eight sites established on the Greater Akaki River from February 2006 to April 2006. Eleven metrics ...

  19. Comparative Education in Greater China: Contexts, Characteristics, Contrasts and Contributions.

    Science.gov (United States)

    Bray, Mark; Qin, Gui

    2001-01-01

    The evolution of comparative education in Greater China (mainland China, Taiwan, Hong Kong, and Macau) has been influenced by size, culture, political ideologies, standard of living, and colonialism. Similarities and differences in conceptions of comparative education are identified among the four components and between Greater China and other…

  20. Greater temperature sensitivity of plant phenology at colder sites

    DEFF Research Database (Denmark)

    Prevey, Janet; Vellend, Mark; Ruger, Nadja

    2017-01-01

    Warmer temperatures are accelerating the phenology of organisms around the world. Temperature sensitivity of phenology might be greater in colder, higher latitude sites than in warmer regions, in part because small changes in temperature constitute greater relative changes in thermal balance...

  1. Breeding of Greater and Lesser Flamingos at Sua Pan, Botswana ...

    African Journals Online (AJOL)

    to fledging was unknown owing to the rapid drying of the pan in late March 1999. No Greater Flamingo breeding was seen that season. Exceptional flooding during 1999–2000 produced highly favourable breeding conditions, with numbers of Greater and Lesser Flamingos breeding estimated to be 23 869 and 64 287 pairs, ...

  2. Surgical anatomy of greater occipital nerve and its relation to ...

    African Journals Online (AJOL)

    Introduction: The knowledge of the anatomy of greater occipital nerve and its relation to occipital artery is important for the surgeon. Blockage or surgical release of greater occipital nerve is clinically effective in reducing or eliminating chronic migraine symptoms. Aim: The aim of this research was to study the anatomy of ...

  3. Surgical anatomy of greater occipital nerve and its relation to ...

    African Journals Online (AJOL)

    Nancy Mohamed El Sekily

    2014-08-19

    Abstract Introduction: The knowledge of the anatomy of greater occipital nerve and its relation to occipital artery is important for the surgeon. Blockage or surgical release of greater occipital nerve is clinically effective in reducing or eliminating chronic migraine symptoms. Aim: The aim of this research was to ...

  4. INDUSTRIAL LAND DEVELOPMENT AND MANUFACTURING DECONCENTRATION IN GREATER JAKARTA

    NARCIS (Netherlands)

    Hudalah, Delik; Viantari, Dimitra; Firman, Tommy; Woltjer, Johan

    2013-01-01

    Industrial land development has become a key feature of urbanization in Greater Jakarta, one of the largest metropolitan areas in Southeast Asia. Following Suharto's market-oriented policy measures in the late 1980s, private developers have dominated the land development projects in Greater Jakarta.

  5. Strategies for Talent Management: Greater Philadelphia Companies in Action

    Science.gov (United States)

    Council for Adult and Experiential Learning (NJ1), 2008

    2008-01-01

    Human capital is one of the critical issues that impacts the Greater Philadelphia region's ability to grow and prosper. The CEO Council for Growth (CEO Council) is committed to ensuring a steady and talented supply of quality workers for this region. "Strategies for Talent Management: Greater Philadelphia Companies in Action" provides…

  6. Productivity associated with visual status of computer users.

    Science.gov (United States)

    Daum, Kent M; Clore, Katherine A; Simms, Suzanne S; Vesely, Jon W; Wilczek, Dawn D; Spittle, Brian M; Good, Greg W

    2004-01-01

    The aim of this project is to examine the potential connection between the astigmatic refractive corrections of subjects using computers and their productivity and comfort. We hypothesize that improving the visual status of subjects using computers results in greater productivity, as well as improved visual comfort. Inclusion criteria required subjects 19 to 30 years of age with complete vision examinations before being enrolled. Using a double-masked, placebo-controlled, randomized design, subjects completed three experimental tasks calculated to assess the effects of refractive error on productivity (time to completion and the number of errors) at a computer. The tasks resembled those commonly undertaken by computer users and involved visual search tasks of: (1) counties and populations; (2) nonsense word search; and (3) a modified text-editing task. Estimates of productivity for time to completion varied from a minimum of 2.5% upwards to 28.7% with 2 D of cylinder miscorrection. Assuming a conservative estimate of an overall 2.5% increase in productivity with appropriate astigmatic refractive correction, our data suggest a favorable cost-benefit ratio of at least 2.3 for the visual correction of an employee (total cost $268) with a salary of $25,000 per year. We conclude that astigmatic refractive error affected both productivity and visual comfort under the conditions of this experiment. These data also suggest a favorable cost-benefit ratio for employers who provide computer-specific eyewear to their employees.
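
    The quoted ratio follows from simple arithmetic, assuming a one-year horizon (a worked check, not the authors' computation):

        salary = 25_000.0          # annual salary ($)
        productivity_gain = 0.025  # conservative 2.5% estimate
        eyewear_cost = 268.0       # total cost of the correction ($)
        annual_benefit = salary * productivity_gain   # $625 per year
        print(annual_benefit / eyewear_cost)          # ~2.3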

  7. 20 CFR 404.270 - Cost-of-living increases.

    Science.gov (United States)

    2010-04-01

    ... INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404.270 Cost-of-living... rises in the cost of living. These automatic increases also apply to other benefit amounts, as described...

  8. CECP, Decommissioning Costs for PWR and BWR

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1997-01-01

    1 - Description of program or function: The Cost Estimating Computer Program CECP, designed for use on an IBM personal computer or equivalent, was developed for estimating the cost of decommissioning boiling water reactor (BWR) and pressurized water reactor (PWR) power stations to the point of license termination. 2 - Method of solution: Cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial volume and costs; and manpower staffing costs. Using equipment and consumables costs and inventory data supplied by the user, CECP calculates unit cost factors and then combines these factors with transportation and burial cost algorithms to produce a complete report of decommissioning costs. In addition to costs, CECP also calculates person-hours, crew-hours, and exposure person-hours associated with decommissioning. 3 - Restrictions on the complexity of the problem: The program is designed for a specific waste charge structure; the waste cost data structure cannot handle intermediate waste handlers or changes in the charge rate structures. The decommissioning of a reactor can be divided into 5 periods. Up to 200 different items for special equipment costs are possible, with a maximum amount of $99,999,999 for each special equipment item. Data can be supplied for 10 buildings, with 100 components each; ESTS1071/01: 65 components for 28 systems are available to specify the contaminated systems costs (BWR). ESTS1071/02: 75 components for 25 systems are available to specify the contaminated systems costs (PWR)
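
    The unit-cost-factor approach described under "Method of solution" can be sketched as follows (Python; the activities, rates, and quantities are illustrative assumptions, not CECP's actual data structures):

        # Each activity: (description, quantity, $ per unit, crew-hours per unit)
        activities = [
            ("contaminated piping removal (kg)", 12_000, 1.50, 0.02),
            ("component decontamination (m2)", 900, 35.00, 0.10),
        ]
        transport_rate = 0.35   # $ per kg shipped (assumed)
        burial_rate = 4_000.0   # $ per m3 buried (assumed)
        waste_kg, burial_m3 = 15_000, 40

        removal_cost = sum(qty * unit for _, qty, unit, _ in activities)
        crew_hours = sum(qty * hrs for _, qty, _, hrs in activities)
        total = removal_cost + transport_rate * waste_kg + burial_rate * burial_m3
        print(f"removal ${removal_cost:,.0f}; total ${total:,.0f}; {crew_hours:,.0f} crew-hours")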

  9. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    classical information theory and, arguably, quantum from classical physics. Basic quantum information ideas are next outlined, including qubits and data compression, quantum gates, the 'no cloning' property and teleportation. Quantum cryptography is briefly sketched. The universal quantum computer (QC) is described, based on the Church-Turing principle and a network model of computation. Algorithms for such a computer are discussed, especially those for finding the period of a function, and searching a random list. Such algorithms prove that a QC of sufficiently precise construction is not only fundamentally different from any computer which can only manipulate classical information, but can compute a small class of functions with greater efficiency. This implies that some important computational tasks are impossible for any device apart from a QC. To build a universal QC is well beyond the abilities of current technology. However, the principles of quantum information physics can be tested on smaller devices. The current experimental situation is reviewed, with emphasis on the linear ion trap, high-Q optical cavities, and nuclear magnetic resonance methods. These allow coherent control in a Hilbert space of eight dimensions (three qubits) and should be extendable up to a thousand or more dimensions (10 qubits). Among other things, these systems will allow the feasibility of quantum computing to be assessed. In fact such experiments are so difficult that it seemed likely until recently that a practically useful QC (requiring, say, 1000 qubits) was actually ruled out by considerations of experimental imprecision and the unavoidable coupling between any system and its environment. However, a further fundamental part of quantum information physics provides a solution to this impasse. This is quantum error correction (QEC). An introduction to QEC is provided. The evolution of the QC is restricted to a carefully chosen subspace of its Hilbert space. Errors are almost certain to

  10. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    classical information theory and, arguably, quantum from classical physics. Basic quantum information ideas are next outlined, including qubits and data compression, quantum gates, the 'no cloning' property and teleportation. Quantum cryptography is briefly sketched. The universal quantum computer (QC) is described, based on the Church-Turing principle and a network model of computation. Algorithms for such a computer are discussed, especially those for finding the period of a function, and searching a random list. Such algorithms prove that a QC of sufficiently precise construction is not only fundamentally different from any computer which can only manipulate classical information, but can compute a small class of functions with greater efficiency. This implies that some important computational tasks are impossible for any device apart from a QC. To build a universal QC is well beyond the abilities of current technology. However, the principles of quantum information physics can be tested on smaller devices. The current experimental situation is reviewed, with emphasis on the linear ion trap, high-Q optical cavities, and nuclear magnetic resonance methods. These allow coherent control in a Hilbert space of eight dimensions (three qubits) and should be extendable up to a thousand or more dimensions (10 qubits). Among other things, these systems will allow the feasibility of quantum computing to be assessed. In fact such experiments are so difficult that it seemed likely until recently that a practically useful QC (requiring, say, 1000 qubits) was actually ruled out by considerations of experimental imprecision and the unavoidable coupling between any system and its environment. However, a further fundamental part of quantum information physics provides a solution to this impasse. This is quantum error correction (QEC). An introduction to QEC is provided. The evolution of the QC is restricted to a carefully chosen subspace of its Hilbert space. Errors are almost certain to

  11. Computers appreciated by marketers

    International Nuclear Information System (INIS)

    Mantho, M.

    1993-01-01

    The computer has been worth its weight in gold to the fuel-oil man. In fact, with falling prices on both software and machines, the worth is greater than gold. Every so often, about every three years, we ask some questions about the utilization of computers. This time, we looked into the future, to find out the acceptance of other marvels such as the cellular phone and the hand-held computer. At the moment, there isn't much penetration. Contact by two-way radio, as well as computing meters on trucks, still reigns supreme

  12. Fractures of the greater trochanter following total hip replacement.

    Science.gov (United States)

    Brun, Ole-Christian L; Maansson, Lukas

    2013-01-01

    We studied the incidence of greater trochanteric fractures at our department following THR. In all we examined 911 patients retrospectively and found the occurrence of a greater trochanteric fracture to be 3%. Patients with fractures had significantly poorer outcomes on the Oxford Hip score, Pain VAS, Satisfaction VAS and EQ-5D compared to THR without fractures. Greater trochanteric fracture is one of the most common complications following THR. It has previously been thought to have little impact on the overall outcome, but our study suggests otherwise.

  13. Social costs of energy

    International Nuclear Information System (INIS)

    Jones, P.M.S.

    1990-01-01

    There have been many studies over the past 20 years which have looked at the environmental and other impacts of energy production, conversion and use. A number of these have attempted to put a monetary value on the external costs which are not reflected in the prices charged for energy. The topic has received increased attention recently as a direct result of the recognition of the potentially large social costs that might arise from the depletion of the ozone layer, the consequences of global warming and the continued releases of acid gases from fossil fuel combustion. The determination of external costs was attempted in the report for the European Economic Community, EUR11519, "Social Costs of Energy Consumption", by O. Hohmeyer. Due to its official sponsorship, this report has been afforded greater respect than it deserves and is being used in some quarters to claim that the external costs of nuclear power are high relative to those of fossil fuels. The remainder of this note looks at some of the serious deficiencies of the document and why its conclusions offer no meaningful guidance to policy makers. So far as the present author is aware, no serious criticism of the Hohmeyer study has previously appeared. (author)

  14. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  15. Prey selection by a reintroduced lion population in the Greater ...

    African Journals Online (AJOL)

    Prey selection by a reintroduced lion population in the Greater Makalali Conservancy, South Africa. Dave Druce, Heleen Genis, Jonathan Braak, Sophie Greatwood, Audrey Delsink, Ross Kettles, Luke Hunter, Rob Slotow ...

  16. LiveDiverse: Case study area, Greater Kruger South Africa

    CSIR Research Space (South Africa)

    Nortje, Karen

    2011-01-01

    Full Text Available Livelihoods and Biodiversity in Developing Countries. Case study area: Greater Kruger, South Africa, January 2011 (companion site: Kolhapur, India). Overview headings: Where are we?; Hardship; Livelihoods; Nature & Biodiversity; Beliefs & Cultural Practice ...

  17. Exploration of the Energy Efficiency of the Greater London Authority ...

    African Journals Online (AJOL)

    (GLA Building/City Hall) The Greater London Authority building was acclaimed as being energy efficient, with claims of 75 % reduction in its annual energy consumption compared to a high specification ...

  18. Molecular insights into the biology of Greater Sage-Grouse

    Science.gov (United States)

    Oyler-McCance, Sara J.; Quinn, Thomas W.

    2011-01-01

    Recent research on Greater Sage-Grouse (Centrocercus urophasianus) genetics has revealed some important findings. First, multiple paternity in broods is more prevalent than previously thought, and leks do not comprise kin groups. Second, the Greater Sage-Grouse is genetically distinct from the congeneric Gunnison sage-grouse (C. minimus). Third, the Lyon-Mono population in the Mono Basin, spanning the border between Nevada and California, has unique genetic characteristics. Fourth, the previous delineation of western (C. u. phaios) and eastern Greater Sage-Grouse (C. u. urophasianus) is not supported genetically. Fifth, two isolated populations in Washington show indications that genetic diversity has been lost due to population declines and isolation. This chapter examines the use of molecular genetics to understand the biology of Greater Sage-Grouse for the conservation and management of this species and put it into the context of avian ecology based on selected molecular studies.

  19. Ubiquitous Computing: The Universal Use of Computers on College Campuses.

    Science.gov (United States)

    Brown, David G., Ed.

    This book is a collection of vignettes from 13 universities where everyone on campus has his or her own computer. These 13 institutions have instituted "ubiquitous computing" in very different ways at very different costs. The chapters are: (1) "Introduction: The Ubiquitous Computing Movement" (David G. Brown); (2) "Dartmouth College" (Malcolm…

  20. Greater saphenous vein anomaly and aneurysm with subsequent pulmonary embolism

    OpenAIRE

    Ma, Truong; Kornbau, Craig

    2017-01-01

    Abstract Venous aneurysms often present as painful masses. They can present either in the deep or superficial venous system. Deep venous system aneurysms have a greater risk of thromboembolism. Though rare, there have been case reports of superficial aneurysms and thrombus causing significant morbidity such as pulmonary embolism. We present a case of an anomalous greater saphenous vein connection with an aneurysm and thrombus resulting in a pulmonary embolism. This is the only reported case o...

  1. GREATER OMENTUM: MORPHOFUNCTIONAL CHARACTERISTICS AND CLINICAL SIGNIFICANCE IN PEDIATRICS

    Directory of Open Access Journals (Sweden)

    A.V. Nekrutov

    2007-01-01

    Full Text Available The review analyzes the structural organization and pathophysiological age specificities of the greater omentum, which determine its uniqueness and functional diversity in a child's organism. The article discusses the protective functions of the organ, its role in the development of post-operative complications in children, and its usage in children's reconstructive plastic surgery. Key words: greater omentum, omentitis, post-operative complications, children.

  2. Computer architecture fundamentals and principles of computer design

    CERN Document Server

    Dumas II, Joseph D

    2005-01-01

    Introduction to Computer Architecture: What is Computer Architecture?; Architecture vs. Implementation; Brief History of Computer Systems; The First Generation; The Second Generation; The Third Generation; The Fourth Generation; Modern Computers - The Fifth Generation. Types of Computer Systems: Single Processor Systems; Parallel Processing Systems; Special Architectures. Quality of Computer Systems: Generality and Applicability; Ease of Use; Expandability; Compatibility; Reliability. Success and Failure of Computer Architectures and Implementations: Quality and the Perception of Quality; Cost Issues; Architectural Openness, Market Timi ...

  3. Fitness cost

    DEFF Research Database (Denmark)

    Nielsen, Karen L.; Pedersen, Thomas M.; Udekwu, Klas I.

    2012-01-01

    phage types, predominantly only penicillin resistant. We investigated whether isolates of this epidemic were associated with a fitness cost, and we employed a mathematical model to ask whether these fitness costs could have led to the observed reduction in frequency. Bacteraemia isolates of S. aureus ... from Denmark have been stored since 1957. We chose 40 S. aureus isolates belonging to phage complex 83A, clonal complex 8 based on spa type, ranging in time of isolation from 1957 to 1980 and with various antibiograms, including both methicillin-resistant and -susceptible isolates. The relative fitness ... of each isolate was determined in a growth competition assay with a reference isolate. Significant fitness costs of 215 were determined for the MRSA isolates studied. There was a significant negative correlation between number of antibiotic resistances and relative fitness. Multiple regression analysis ...

  4. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with the minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google starting in 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing as well as the value chain and standardization efforts.

  5. Genomics With Cloud Computing

    Directory of Open Access Journals (Sweden)

    Sukhamrit Kaur

    2015-04-01

    Full Text Available Abstract Genomics is the study of the genome, which produces large amounts of data requiring substantial storage and computational power. These needs are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms offer many services to users, such as easy access to data, easy sharing and transfer, storage in the hundreds of terabytes, and greater computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Features that cloud computing brings to genomics include easy access to and sharing of data, security of data, and lower cost of resources; there are still some demerits, however, such as the long time needed to transfer data and limited network bandwidth.

  6. (Super Variable Costing-Throughput Costing)

    OpenAIRE

    Çakıcı, Cemal

    2006-01-01

    (Super Variable Costing-Throughput Costing) The aim of this study is to explain the super-variable costing method, a new subject in cost and management accounting, and to show how it works in practice. Briefly, super-variable costing can be defined as a costing method which uses only direct material costs in calculating product costs and treats all other costs (direct labor and overhead) as period costs or operating costs. By using the super-variable costing method, product costs ar...

  7. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justify the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media is used in imaging to demonstrate ..., one console provides for operation of the scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools

  8. Low Cost Night Vision System for Intruder Detection

    Science.gov (United States)

    Ng, Liang S.; Yusoff, Wan Azhar Wan; R, Dhinesh; Sak, J. S.

    2016-02-01

    The growth in production of Android devices has resulted in greater functionalities as well as lower costs. This has made previously more expensive systems such as night vision affordable for more businesses and end users. We designed and implemented robust and low cost night vision systems based on red-green-blue (RGB) colour histogram for a static camera as well as a camera on an unmanned aerial vehicle (UAV), using OpenCV library on Intel compatible notebook computers, running Ubuntu Linux operating system, with less than 8GB of RAM. They were tested against human intruders under low light conditions (indoor, outdoor, night time) and were shown to have successfully detected the intruders.
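
    A minimal sketch of histogram-based detection in the spirit of the system described (OpenCV from Python; the 32-bin histograms, correlation metric, and 0.7 threshold are assumptions, not the authors' parameters):

        import cv2
        import numpy as np

        def rgb_hist(frame):
            # Concatenate normalised 32-bin histograms of the B, G and R channels
            hists = [cv2.calcHist([frame], [c], None, [32], [0, 256]) for c in range(3)]
            h = np.concatenate(hists)
            return cv2.normalize(h, h).flatten()

        cap = cv2.VideoCapture(0)          # static camera
        ok, background = cap.read()        # empty scene as the reference
        ref = rgb_hist(background)
        while ok:
            ok, frame = cap.read()
            if not ok:
                break
            sim = cv2.compareHist(ref, rgb_hist(frame), cv2.HISTCMP_CORREL)
            if sim < 0.7:                  # scene differs from reference: possible intruder
                print("possible intruder")
        cap.release()

    Comparing whole-frame colour histograms keeps the per-frame cost low enough for the modest hardware described, at the price of sensitivity to global lighting changes.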

  9. Improving greater trochanteric reattachment with a novel cable plate system.

    Science.gov (United States)

    Baril, Yannick; Bourgeois, Yan; Brailovski, Vladimir; Duke, Kajsa; Laflamme, G Yves; Petit, Yvan

    2013-03-01

    Cable-grip systems are commonly used for greater trochanteric reattachment because they have provided the best fixation performance to date, even though they have a rather high complication rate. A novel reattachment system is proposed with the aim of improving fixation stability. It consists of a Y-shaped fixation plate combined with locking screws and superelastic cables to reduce cable loosening and limit greater trochanter movement. The novel system is compared with a commercially available reattachment system in terms of greater trochanter movement and cable tensions under different greater trochanteric abductor application angles. A factorial design of experiments was used including four independent variables: plate system, cable type, abductor application angle, and femur model. The test procedure included 50 cycles of simultaneous application of an abductor force on the greater trochanter and a hip force on the femoral head. The novel plate reduces the movements of a greater trochanter fragment within a single loading cycle by up to 26%. Permanent degradation of the fixation (accumulated movement based on 50-cycle testing) is reduced by up to 46%. The use of superelastic cables reduces tension loosening by up to 24%; however, this last improvement did not result in a significant reduction of the greater trochanter movement. The novel plate and cables present advantages over the commercially available greater trochanter reattachment system. The plate reduces movements generated by the hip abductor. The superelastic cables reduce cable loosening during cycling. Both of these positive effects could decrease the risks related to greater trochanter non-union. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.

  10. The impact of geography on energy infrastructure costs

    International Nuclear Information System (INIS)

    Zvoleff, Alex; Kocaman, Ayse Selin; Huh, Woonghee Tim; Modi, Vijay

    2009-01-01

    Infrastructure planning for networked infrastructure such as grid electrification (or piped supply of water) has historically been a process of outward network expansion, either by utilities in response to immediate economic opportunity, or in response to a government mandate or subsidy intended to catalyze economic growth. While significant progress has been made in access to grid electricity in Asia, where population densities are greater and rural areas tend to have nucleated settlements, access to grid electricity in Sub-Saharan Africa remains low; a problem generally ascribed to differences in settlement patterns. The discussion, however, has remained qualitative, and hence it has been difficult for planners to understand the differing costs of carrying out grid expansion in one region as opposed to another. This paper describes a methodology to estimate the cost of local-level distribution systems for a least-cost network, and to compute additional information of interest to policymakers, such as the marginal cost of connecting additional households to a grid as a function of the penetration rate. We present several large datasets of household locations developed from satellite imagery, and examine them with our methodology, providing insight into the relationship between settlement pattern and the cost of rural electrification.
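
    One simple way to make such estimates concrete is to approximate the local distribution network as a minimum spanning tree over household locations and price its total length (a sketch under assumed coordinates and a flat $/metre line cost; not the authors' model):

        import numpy as np
        from scipy.sparse.csgraph import minimum_spanning_tree
        from scipy.spatial.distance import pdist, squareform

        rng = np.random.default_rng(0)
        homes = rng.uniform(0, 1_000, size=(50, 2))   # hypothetical household coordinates (m)
        dist = squareform(pdist(homes))               # pairwise distances
        mst = minimum_spanning_tree(dist)             # least-cost connecting network
        wire_m = mst.sum()
        cost_per_m = 8.0                              # assumed $/m of low-voltage line
        total = wire_m * cost_per_m
        print(f"{wire_m:,.0f} m of line, ${total:,.0f} total, ${total/len(homes):,.0f} per household")

    Rerunning this over subsets of households at different penetration rates yields the marginal connection cost as a function of penetration, the quantity the methodology computes, and makes the effect of nucleated versus dispersed settlement patterns directly visible.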

  11. Laboratory cost and utilization containment.

    Science.gov (United States)

    Steiner, J W; Root, J M; White, D C

    1991-01-01

    The authors analyzed laboratory costs and utilization in 3,771 cases of Medicare inpatients admitted to a New England academic medical center ("the Hospital") from October 1, 1989 to September 30, 1990. The data were derived from the Hospital's Decision Resource System comprehensive data base. The authors established a historical reference point for laboratory costs as a percentage of total inpatient costs using 1981-82 Medicare claims data and cost report information. Inpatient laboratory costs were estimated at 9.5% of total inpatient costs for pre-Diagnostic Related Groups (DRGs) Medicare discharges. Using this reference point and adjusting for the Hospital's 1990 case mix, the "expected" laboratory cost was 9.3% of total cost. In fact, the cost averaged 11.5% (i.e., 24% above the expected cost level), and costs represented an even greater percentage of DRG reimbursement at 12.9%. If we regard the reimbursement as a total cost target (to eliminate losses from Medicare), then that 12.9% is 39% above the "expected" laboratory proportion of 9.3%. The Hospital lost an average of $1,091 on each DRG inpatient. The laboratory contributed 29% to this loss per case. Compared to other large teaching hospitals, the Hospital was slightly (3%) above the mean direct cost per on-site test and significantly (58%) above the mean number of inpatient tests per inpatient day. The findings suggest that careful laboratory cost analyses will become increasingly important as the proportion of patients reimbursed in a fixed manner grows. The future may hold a prospective zero-based laboratory budgeting process based on predictable patterns of DRG admissions or other fixed-reimbursement admission and laboratory utilization patterns.
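
    The percentage comparisons in this passage can be verified directly (a simple arithmetic check, not the authors' code):

        expected = 9.3     # "expected" laboratory share of total cost (%)
        actual = 11.5      # observed share of total cost (%)
        vs_reimb = 12.9    # laboratory share of DRG reimbursement (%)
        print(f"{actual / expected - 1:.0%}")    # ~24% above expected
        print(f"{vs_reimb / expected - 1:.0%}")  # ~39% above expected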

  12. The economic costs of energy

    International Nuclear Information System (INIS)

    Brookes, L.G.

    1980-01-01

    At a recent symposium, the economic costs of nuclear power were examined in four lectures which considered: (1) the performance of different types, sizes and ages of nuclear power plants; (2) the comparison between coal and nuclear power costs based on the principle of net effective cash; (3) the capital requirements of a nuclear programme; (4) the comparative costs, now and in the future, of coal-fired and nuclear plants. It is concluded that uncertainties seem to get greater rather than smaller with time, probably due to high and fluctuating world inflation rates and the great uncertainty about world economic performance introduced by the politicising of world oil supplies. (UK)

  13. Public power costs less

    International Nuclear Information System (INIS)

    Moody, D.

    1993-01-01

    The reasons why residential customers of public power utilities paid less for power than private-sector customers are discussed. Residential customers of investor-owned utilities (IOUs) paid average rates that were 28% above those paid by customers of publicly owned systems during 1990. The reasons for this disparity are that management costs faced by public power systems are below those of private power companies, indicating a greater efficiency of management among public power systems, and that customer accounts expenses averaged $33.00 per customer for publicly owned electric utilities compared to $39.00 per customer for private utilities

  14. Estimation of the laser cutting operating cost by support vector regression methodology

    Science.gov (United States)

    Jović, Srđan; Radović, Aleksandar; Šarkoćević, Živče; Petković, Dalibor; Alizamir, Meysam

    2016-09-01

    Laser cutting is a popular manufacturing process utilized to cut various types of materials economically. The operating cost is affected by laser power, cutting speed, assist gas pressure, nozzle diameter and focus point position as well as the workpiece material. In this article, the process factors investigated were: laser power, cutting speed, air pressure and focal point position. The aim of this work is to relate the operating cost to the process parameters mentioned above. CO2 laser cutting of stainless steel of medical grade AISI316L has been investigated. The main goal was to analyze the operating cost through the laser power, cutting speed, air pressure, focal point position and material thickness. Since estimating the laser operating cost is a complex, non-linear task, soft computing optimization algorithms can be used. An intelligent soft computing scheme, support vector regression (SVR), was implemented. The performance of the proposed estimator was confirmed with simulation results. The SVR results were then compared with artificial neural network and genetic programming results. According to the results, a greater improvement in estimation accuracy can be achieved through the SVR compared to other soft computing methodologies. The new optimization methods benefit from the soft computing capabilities of global optimization and multiobjective optimization rather than choosing a starting point by trial and error and combining multiple criteria into a single criterion.
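
    A minimal SVR estimator over the five factors named above might look like the following sketch (scikit-learn; the synthetic data, RBF kernel and hyperparameters are assumptions, not the study's):

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)
        # Columns: laser power (W), cutting speed (m/min), air pressure (bar),
        # focal point position (mm), material thickness (mm)
        X = rng.uniform([1_000, 1.0, 0.5, -2.0, 1.0],
                        [4_000, 10.0, 1.5, 2.0, 6.0], size=(200, 5))
        # Synthetic operating cost ($/h) standing in for measured data
        y = (0.002 * X[:, 0] + 5.0 / X[:, 1] + 3.0 * X[:, 2]
             + 0.5 * np.abs(X[:, 3]) + 1.2 * X[:, 4] + rng.normal(0, 0.2, 200))

        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
        model.fit(X, y)
        print(model.predict([[3_000, 5.0, 1.0, 0.0, 3.0]]))  # cost estimate for one setting

    Scaling the inputs before the kernel, as the pipeline does, matters here because the factors span very different numeric ranges.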

  15. Technical concept for a Greater Confinement Disposal test facility

    International Nuclear Information System (INIS)

    Hunter, P.H.

    1982-01-01

    For the past two years, Ford, Bacon and Davis has been performing technical services for the Department of Energy at the Nevada Test Site on the development of defense low-level waste management concepts, specifically a greater confinement disposal concept with particular application to arid sites. The investigations have included the development of Criteria for Greater Confinement Disposal, NVO-234, published in May 1981, and the draft of the technical concept for Greater Confinement Disposal, with the latest draft published in November 1981. The final draft of the technical concept and design specifications are expected to be published imminently. The document is prerequisite to the actual construction and implementation of the demonstration facility this fiscal year. The GCD Criteria Document, NVO-234, is considered to contain information complementary to and compatible with that being developed for the reserved section 10 CFR 61.51b of the NRC's proposed licensing rule for low-level waste disposal facilities

  16. Can a Costly Intervention Be Cost-effective?

    Science.gov (United States)

    Foster, E. Michael; Jones, Damon

    2009-01-01

    Objectives To examine the cost-effectiveness of the Fast Track intervention, a multi-year, multi-component intervention designed to reduce violence among at-risk children. A previous report documented the favorable effect of intervention on the highest-risk group of ninth-graders diagnosed with conduct disorder, as well as self-reported delinquency. The current report addressed the cost-effectiveness of the intervention for these measures of program impact. Design Costs of the intervention were estimated using program budgets. Incremental cost-effectiveness ratios were computed to determine the cost per unit of improvement in the 3 outcomes measured in the 10th year of the study. Results Examination of the total sample showed that the intervention was not cost-effective at likely levels of policymakers' willingness to pay for the key outcomes. Subsequent analysis of those most at risk, however, showed that the intervention likely was cost-effective given specified willingness-to-pay criteria. Conclusions Results indicate that the intervention is cost-effective for the children at highest risk. From a policy standpoint, this finding is encouraging because such children are likely to generate higher costs for society over their lifetimes. However, substantial barriers to cost-effectiveness remain, such as the ability to effectively identify and recruit such higher-risk children in future implementations. PMID:17088509
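
    An incremental cost-effectiveness ratio of the kind computed here divides the extra cost of the intervention by the extra effect it buys; the sketch below uses invented numbers, not the Fast Track results:

        def icer(delta_cost, delta_effect):
            """Incremental cost per additional unit of effect."""
            return delta_cost / delta_effect

        cost_per_child = 58_000.0        # hypothetical incremental program cost
        cases_averted = 0.48 - 0.27      # hypothetical drop in conduct-disorder risk
        ratio = icer(cost_per_child, cases_averted)
        print(f"${ratio:,.0f} per case averted")
        willingness_to_pay = 750_000.0   # hypothetical policy threshold
        print("cost-effective" if ratio <= willingness_to_pay else "not cost-effective")

    Restricting the denominator to the highest-risk children raises the effect per dollar spent, which is why the intervention can clear a willingness-to-pay threshold for that subgroup while failing it for the full sample.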

  17. Expatriate job performance in Greater China: Does age matter?

    DEFF Research Database (Denmark)

    Selmer, Jan; Lauring, Jakob; Feng, Yunxia

    to expatriates in Chinese societies. It is possible that older business expatriates will receive more respect and be treated with more deference in a Chinese cultural context than their apparently younger colleagues. This may have a positive impact on expatriates' job performance. To empirically test ... this presumption, business expatriates in Greater China were targeted by a survey. Controlling for the potential bias of a number of background variables, results indicate that contextual/managerial performance, including general managerial functions applied to the subsidiary in Greater China, had a positive ...

  18. Abstinence movement in Greater Poland in 1840–1902

    OpenAIRE

    Izabela Krasińska

    2013-01-01

    The article presents the origins and development of the abstinence movement in Greater Poland in the 19th century. The starting date for the research is 1840, which is considered a breakthrough year in the history of the organized abstinence movement in Greater Poland, owing to the establishment in Kórnik of the Association for the Suppression of the Use of Vodka (Towarzystwo ku Przytłumieniu Używania Wódki) in the Grand Duchy of Posen. It was a secular organization that came int...

  19. Wastewater Treatment Costs and Outlays in Organic Petrochemicals: Standards Versus Taxes With Methodology Suggestions for Marginal Cost Pricing and Analysis

    Science.gov (United States)

    Thompson, Russell G.; Singleton, F. D., Jr.

    1986-04-01

    With the methodology recommended by Baumol and Oates, comparable estimates of wastewater treatment costs and industry outlays are developed for effluent standard and effluent tax instruments for pollution abatement in five hypothetical organic petrochemicals (olefins) plants. The computational method uses a nonlinear simulation model for wastewater treatment to estimate the system state inputs for linear programming cost estimation, following a practice developed in a National Science Foundation (Research Applied to National Needs) study at the University of Houston and used to estimate Houston Ship Channel pollution abatement costs for the National Commission on Water Quality. Focusing on best practical and best available technology standards, with effluent taxes adjusted to give nearly equal pollution discharges, shows that average daily treatment costs (and the confidence intervals for treatment cost) would always be less for the effluent tax than for the effluent standard approach. However, industry's total outlay for these treatment costs, plus effluent taxes, would always be greater for the effluent tax approach than the total treatment costs would be for the effluent standard approach. Thus the practical necessity of showing smaller outlays as a prerequisite for a policy change toward efficiency dictates the need to link the economics at the microlevel with that at the macrolevel. Aggregation of the plants into a programming modeling basis for individual sectors and for the economy would provide a sound basis for effective policy reform, because the opportunity costs of the salient regulatory policies would be captured. Then, the government's policymakers would have the informational insights necessary to legislate more efficient environmental policies in light of the wealth distribution effects.
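
    The standards-versus-taxes mechanics can be reproduced with a toy linear program (SciPy; the load, stage costs, capacities and tax rate are invented, not the olefins-plant data). With these numbers the tax case has the lower treatment cost but, once the tax payment is added, the higher total outlay, matching the qualitative finding above:

        import numpy as np
        from scipy.optimize import linprog

        load = 1_000.0                    # raw pollutant load, kg/day
        cost = np.array([1.0, 4.0])       # $/kg removed by cheap and expensive stages
        caps = [(0, 800.0), (0, 200.0)]   # stage capacities, kg/day

        # Effluent standard: discharge <= 150 kg/day, i.e. removal >= 850 kg/day
        std = linprog(cost, A_ub=[[-1.0, -1.0]], b_ub=[-(load - 150.0)], bounds=caps)
        print(f"standard: treatment ${std.fun:,.0f}")

        # Effluent tax t on each kg discharged: minimise c.r + t*(load - r1 - r2);
        # the constant t*load drops out of the objective
        t = 3.0
        tax = linprog(cost - t, bounds=caps)
        removed = tax.x
        treat = float(cost @ removed)
        levy = t * (load - removed.sum())
        print(f"tax: treatment ${treat:,.0f}, tax ${levy:,.0f}, outlay ${treat + levy:,.0f}")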

  20. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  1. [Relating costs to activities in hospitals. Use of internal cost accounting].

    Science.gov (United States)

    Stavem, K

    1995-01-10

    During the last few years hospital cost accounting has become widespread in many countries, in parallel with increasing cost pressure, greater competition and new financing schemes. Cost accounting has been used in the manufacturing industry for many years. Costs can be related to activities and production, e.g. by the costing of procedures, episodes of care and other internally defined cost objectives. Norwegian hospitals have lagged behind in the adoption of cost accounting. They ought to act quickly if they want to be prepared for possible changes in health care financing. The benefits can be considerable to a hospital operating in a rapidly changing health care environment.

  2. Cost restructuring

    International Nuclear Information System (INIS)

    Schmidt, J.A.

    1991-01-01

    This paper reports on the cost restructuring of the petroleum industry. This current decade is likely to be one of the most challenging for the petroleum industry. Though petroleum remains among the world's biggest businesses, news of consolidations, restructuring, and layoffs permeates the oil patch from the Gulf of Mexico to the Arctic Isles. The recessionary economy has accelerated these changes, particularly in the upstream sector. Today, even the best-managed companies are transforming their cost structures, and companies that fail to do likewise probably won't survive as independent companies. Indeed, significant consolidation took place during the 1980s. More consolidations can be expected in this decade for companies that do not adapt to the economic realities of the mature business

  3. Advanced Fuel Cycle Cost Basis

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2009-12-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  4. Advanced Fuel Cycle Cost Basis

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert

    2007-04-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 26 cost modules—24 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, and high-level waste.

  5. Advanced Fuel Cycle Cost Basis

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2008-03-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  6. Adjustment of Business Expatriates in Greater China: A Strategic Perspective

    DEFF Research Database (Denmark)

    Selmer, Jan

    2006-01-01

    Research has found that due to similarities, firms which have gained business experience elsewhere in Greater China may exhibit relatively better performance in mainland China. Hence, the experience of business expatriates could be of strategic importance for the expansion path of their firms ...

  7. College Students with ADHD at Greater Risk for Sleep Disorders

    Science.gov (United States)

    Gaultney, Jane F.

    2014-01-01

    The pediatric literature indicates that children with ADHD are at greater risk for sleep problems, daytime sleepiness, and some sleep disorders than children with no diagnosed disability. It has not been determined whether this pattern holds true among emerging adults, and whether comorbid sleep disorders with ADHD predict GPA. The present study…

  8. Ecology of greater sage-grouse in the Dakotas

    Science.gov (United States)

    Christopher C. Swanson

    2009-01-01

    Greater sage-grouse (Centrocercus urophasianus) populations and the sagebrush (Artemisia spp.) communities that they rely on have dramatically declined from historic levels. Moreover, information regarding sage-grouse annual life-history requirements at the eastern-most extension of sagebrush steppe communities is lacking....

  9. The Easterlin Illusion: Economic growth does go with greater happiness

    NARCIS (Netherlands)

    R. Veenhoven (Ruut); F. Vergunst (Floris)

    2014-01-01

    The 'Easterlin Paradox' holds that economic growth in nations does not buy greater happiness for the average citizen. This thesis was advanced in the 1970s on the basis of the then available data on happiness in nations. Later data have disproved most of the empirical

  10. Job-Sharing at the Greater Victoria Public Library.

    Science.gov (United States)

    Miller, Don

    1978-01-01

    Describes the problems associated with the management of part-time library employees and some solutions afforded by a job sharing arrangement in use at the Greater Victoria Public Library. This is a voluntary work arrangement, changing formerly full-time positions into multiple part-time positions. (JVP)

  11. Radiographic features of tuberculous osteitis in greater trochanter and ischium

    International Nuclear Information System (INIS)

    Hahm, So Hee; Lee, Ye Ri; Kim, Dong Jin; Sung, Ki Jun; Lim, Jong Nam

    1996-01-01

    To evaluate the radiographic features of tuberculous osteitis in the greater trochanter and ischium, and to determine the cause of the lesions, we retrospectively reviewed the plain radiographic findings of 14 patients with histologically proven tuberculous osteitis involving the greater trochanter and ischium. In each case, the following were analyzed: morphology of bone destruction, including cortical erosion; periosteal reaction; and presence or absence of calcific shadows in adjacent soft tissue. On the basis of an analysis of the radiographic features and their correlation with the adjacent anatomy, we attempted to determine the causes. Of the 14 cases evaluated, 12 showed various degrees of extrinsic erosion of the outer cortical bone of the greater trochanter and ischium; in two cases, bone destruction was so severe that the radiographic features of advanced perforated osteomyelitis were simulated. In addition to the findings of bone destruction, these twelve cases showed sequestra or calcific shadows in the adjacent soft tissue. Tuberculous osteitis in the greater trochanter and ischium showed the characteristic findings of chronic extrinsic erosion. On the basis of these findings, we suggest that these lesions result from an extrinsic pathophysiologic cause such as adjacent bursitis

  12. Radiographic features of tuberculous osteitis in the greater trochanter and ischium

    Energy Technology Data Exchange (ETDEWEB)

    Hahm, So Hee; Lee, Ye Ri [Hanil Hospital Affiliated to KEPCO, Seoul (Korea, Republic of); Kim, Dong Jin; Sung, Ki Jun [Yonsei Univ. Wonju College of Medicine, Wonju (Korea, Republic of); Lim, Jong Nam [Konkuk Univ. College of Medicine, Seoul (Korea, Republic of)

    1996-11-01

    To evaluate the radiographic features of tuberculous osteitis in the greater trochanter and ischium, and to determine the cause of the lesions, we retrospectively reviewed the plain radiographic findings of 14 patients with histologically proven tuberculous osteitis involving the greater trochanter and ischium. In each case, the following were analyzed: morphology of bone destruction, including cortical erosion; periosteal reaction; and presence or absence of calcific shadows in adjacent soft tissue. On the basis of an analysis of the radiographic features and their correlation with the adjacent anatomy, we attempted to determine the causes. Of the 14 cases evaluated, 12 showed various degrees of extrinsic erosion of the outer cortical bone of the greater trochanter and ischium; in two cases, bone destruction was so severe that the radiographic features of advanced perforated osteomyelitis were simulated. In addition to the findings of bone destruction, these twelve cases showed sequestra or calcific shadows in the adjacent soft tissue. Tuberculous osteitis in the greater trochanter and ischium showed the characteristic findings of chronic extrinsic erosion. On the basis of these findings we can suggest that these lesions result from an extrinsic pathophysiologic cause such as adjacent bursitis.

  13. Greater Confinement Disposal trench and borehole operations status

    International Nuclear Information System (INIS)

    Harley, J.P. Jr.; Wilhite, E.L.; Jaegge, W.J.

    1987-01-01

    Greater Confinement Disposal (GCD) facilities have been constructed within the operating burial ground at the Savannah River Plant (SRP) to dispose of the higher activity fraction of SRP low-level waste. GCD practices of waste segregation, packaging, emplacement below the root zone, and waste stabilization are being used in the demonstration. 2 refs., 2 figs., 2 tabs

  14. The Mesozoic-Cenozoic tectonic evolution of the Greater Caucasus

    NARCIS (Netherlands)

    Saintot, A.N.; Brunet, M.F.; Yakovlev, F.; Sébrier, M.; Stephenson, R.A.; Ershov, A.V.; Chalot-Prat, F.; McCann, T.

    2006-01-01

    The Greater Caucasus (GC) fold-and-thrust belt lies on the southern deformed edge of the Scythian Platform (SP) and results from the Cenozoic structural inversion of a deep marine Mesozoic basin in response to the northward displacement of the Transcaucasus (lying south of the GC subsequent to the

  15. Introduction. China and the Challenges in Greater Middle East

    DEFF Research Database (Denmark)

    Sørensen, Camilla T. N.; Andersen, Lars Erslev; Jiang, Yang

    2016-01-01

    This collection of short papers is an outcome of an international conference entitled China and the Challenges in Greater Middle East, organized by the Danish Institute for International Studies and Copenhagen University on 10 November 2015. The conference sought answers to the following questions...

  16. On the Occurrence of Standardized Regression Coefficients Greater than One.

    Science.gov (United States)

    Deegan, John, Jr.

    1978-01-01

    It is demonstrated here that standardized regression coefficients greater than one can legitimately occur. Furthermore, the relationship between the occurrence of such coefficients and the extent of multicollinearity present among the set of predictor variables in an equation is examined. Comments on the interpretation of these coefficients are…

  17. The Educational Afterlife of Greater Britain, 1903-1914

    Science.gov (United States)

    Gardner, Philip

    2012-01-01

    Following its late nineteenth-century emergence as an important element within federalist thinking across the British Empire, the idea of Greater Britain lost much of its political force in the years following the Boer War. The concept however continued to retain considerable residual currency in other fields of Imperial debate, including those…

  18. Children's (Pediatric) CT (Computed Tomography)

    Medline Plus

    Full Text Available ... images can be viewed on a computer monitor, printed on film or transferred to a CD or DVD. CT images of internal organs, bones, soft tissue and blood vessels provide greater detail ...

  19. Security Architecture of Cloud Computing

    OpenAIRE

    V.KRISHNA REDDY; Dr. L.S.S.REDDY

    2011-01-01

    Cloud Computing offers services over the Internet with dynamically scalable resources. Cloud Computing services provide benefits to users in terms of cost and ease of use. Cloud Computing services need to address security during the transmission of sensitive data and critical applications to shared and public cloud environments. Cloud environments are scaling up to meet data processing and storage needs. Cloud computing environments have various advantages as well as disadvantages o...

  20. Security in cloud computing

    OpenAIRE

    Moreno Martín, Oriol

    2016-01-01

    Security in Cloud Computing is becoming a challenge for next-generation data centers. This project will focus on investigating new security strategies for Cloud Computing systems. Cloud Computing is a recent paradigm for delivering services over the Internet. Businesses grow drastically because of it, and researchers focus their work on it. Rapid access to flexible and low-cost IT resources in an on-demand fashion allows users to avoid planning ahead for provisioning, and enterprises to save money ...

  1. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given

  2. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  3. Cost-effectiveness analysis of computerized ECG interpretation system in an ambulatory health care organization.

    Science.gov (United States)

    Carel, R S

    1982-04-01

    The cost-effectiveness of a computerized ECG interpretation system in an ambulatory health care organization has been evaluated in comparison with a conventional (manual) system. The automated system was shown to be more cost-effective at a minimum load of 2,500 patients/month. At larger monthly loads an even greater cost-effectiveness was found, the average cost/ECG being about $2. In the manual system the cost/unit is practically independent of patient load. This is primarily due to the fact that 87% of the cost/ECG is attributable to wages and fees of highly trained personnel. In the automated system, on the other hand, the cost/ECG is heavily dependent on examinee load. This is due to the relatively large impact of equipment depreciation on fixed (and total) cost. Utilization of a computer-assisted system leads to marked reduction in cardiologists' interpretation time, substantially shorter turnaround time (of unconfirmed reports), and potential provision of simultaneous service at several remotely located "heart stations."
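    The load dependence described here fits a simple two-parameter model; the parameter values below are assumptions chosen only to reproduce the reported break-even near 2,500 ECGs/month and the roughly $2/ECG figure at higher loads, not data from the study.

        # Hedged sketch of cost per ECG as a function of monthly load.
        def cost_per_ecg_automated(monthly_load, fixed=5000.0, variable=0.80):
            # fixed: assumed monthly equipment depreciation ($)
            # variable: assumed per-ECG consumables and technician time ($)
            return fixed / monthly_load + variable

        def cost_per_ecg_manual(fee=2.80):
            # assumed per-ECG professional fees; essentially load-independent
            return fee

        for load in (1000, 2500, 5000):
            print(load, round(cost_per_ecg_automated(load), 2), cost_per_ecg_manual())
        # With these assumed parameters the curves cross at 2,500 ECGs/month,
        # and the automated cost falls to about $2/ECG and below as load grows.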

  4. Alternatives for packaging and transport of greater-than-class C low-level waste

    International Nuclear Information System (INIS)

    Smith, R.I.

    1990-06-01

    Viable methods for packaging greater-than-class C (GTCC) low-level wastes and for transporting those wastes from the waste generator sites or from an eastern interim storage site to the Yucca Mountain repository site have been identified and evaluated. Estimated costs for packaging and transporting the population of GTCC wastes expected to be accumulated through the year 2040 have been developed for three waste volume scenarios, for two preferred packaging methods for activated metals from reactor operations and from reactor decommissioning, and for two packaging density assumptions for the activated metals from reactor decommissioning. 7 refs. 7 tabs

  5. Activity of the greater padloper, Homopus femoralis (Testudinidae ...

    African Journals Online (AJOL)

    ... or imminent rain, perhaps to avoid avian predators or physiological costs of water and food shortages. Future studies might locate active tortoises in the highest rainfall months, and use telemetry to identify activity patterns throughout the year. Key words: behaviour, ecology, Karoo, morphology, predation, reptile, tortoise.

  6. Forestry-related pathways for the movement of exotic plant pests into and within the greater Caribbean region

    Science.gov (United States)

    Leslie Newton; Heike Meissner; Andrea. Lemay

    2011-01-01

    Forests of the Greater Caribbean Region (GCR) are important ecologically and economically. These unique ecosystems are under increasing pressure from exotic pests, which may cause extensive environmental damage and cost billions of dollars in control programs, lost production, and forest restoration.

  7. Computer graphics in engineering education

    CERN Document Server

    Rogers, David F

    2013-01-01

    Computer Graphics in Engineering Education discusses the use of Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) as an instructional material in engineering education. Each of the nine chapters of this book covers topics and cites examples that are relevant to the relationship of CAD-CAM with engineering education. The first chapter discusses the use of computer graphics in the U.S. Naval Academy, while Chapter 2 covers key issues in instructional computer graphics. This book then discusses low-cost computer graphics in engineering education. Chapter 4 discusses the uniform b

  8. Higher motivation - greater control? The effect of arousal on judgement.

    Science.gov (United States)

    Riemer, Hila; Viswanathan, Madhu

    2013-01-01

    This research examines control over the effect of arousal, a dimension of affect, on judgement. Past research shows that high processing motivation enhances control over the effects of affect on judgement. Isolating and studying arousal as opposed to valence, the other dimension of affect, and its effect on judgement, we identify boundary conditions for past findings. Drawing from the literature on processes by which arousal influences judgement, we demonstrate that the role of motivation is contingent upon the type of judgement task (i.e., memory- versus stimulus-based judgement). In stimulus-based judgement, individuals exert greater control over the effect of arousal on judgement under low compared to high motivation. In contrast, in memory-based judgement individuals exert greater control over the effect of arousal under high compared to low motivation. Theoretical implications and avenues for future research are discussed.

  9. Patient expectations predict greater pain relief with joint arthroplasty.

    Science.gov (United States)

    Gandhi, Rajiv; Davey, John Roderick; Mahomed, Nizar

    2009-08-01

    We examined the relationship between patient expectations of total joint arthroplasty and functional outcomes. We surveyed 1799 patients undergoing primary hip or knee arthroplasty for demographic data and Western Ontario McMaster University Osteoarthritis Index scores at baseline, 3 months, and 1 year of follow-up. Patient expectations were determined with 3 survey questions. The patients with the greatest expectations of surgery were younger, male, and had a lower body mass index. Linear regression modeling showed that a greater expectation of pain relief with surgery independently predicted greater reported pain relief at 1 year of follow-up, adjusted for all relevant covariates (P < .05). Patient expectation of pain relief after joint arthroplasty is an important predictor of outcomes at 1 year.

  10. VMware private cloud computing with vCloud director

    CERN Document Server

    Gallagher, Simon

    2013-01-01

    It's All About Delivering Service with vCloud Director Empowered by virtualization, companies are not just moving into the cloud, they're moving into private clouds for greater security, flexibility, and cost savings. However, this move involves more than just infrastructure. It also represents a different business model and a new way to provide services. In this detailed book, VMware vExpert Simon Gallagher makes sense of private cloud computing for IT administrators. From basic cloud theory and strategies for adoption to practical implementation, he covers all the issues. You'll lea

  11. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
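    The exponential precision cost noted here can be stated as a back-of-the-envelope relation (our illustration, not a formula from the paper): for $b$ bits of precision,

        \[
          N_{\text{analogue}}(b) \propto 2^{b}, \qquad N_{\text{digital}}(b) \propto b,
        \]

    so adding one bit doubles the size of an analogue machine, while a digital register grows by only a single element.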

  12. The thermodynamic efficiency of computations made in cells across the range of life

    Science.gov (United States)

    Kempes, Christopher P.; Wolpert, David; Cohen, Zachary; Pérez-Mercader, Juan

    2017-11-01

    Biological organisms must perform computation as they grow, reproduce and evolve. Moreover, ever since Landauer's bound was proposed, it has been known that all computation has some thermodynamic cost, and that the same computation can be achieved with greater or smaller thermodynamic cost depending on how it is implemented. Accordingly, an important issue concerning the evolution of life is assessing the thermodynamic efficiency of the computations performed by organisms. This issue is interesting both from the perspective of how close life has come to maximally efficient computation (presumably under the pressure of natural selection), and from the practical perspective of what efficiencies we might hope that engineered biological computers might achieve, especially in comparison with current computational systems. Here we show that the computational efficiency of translation, defined as free energy expended per amino acid operation, outperforms the best supercomputers by several orders of magnitude, and is only about an order of magnitude worse than the Landauer bound. However, this efficiency depends strongly on the size and architecture of the cell in question. In particular, we show that the useful efficiency of an amino acid operation, defined as the bulk energy per amino acid polymerization, decreases for increasing bacterial size and converges to the polymerization cost of the ribosome. This cost of the largest bacteria does not change in cells as we progress through the major evolutionary shifts to both single- and multicellular eukaryotes. However, the rates of total computation per unit mass are non-monotonic in bacteria with increasing cell size, and also change across different biological architectures, including the shift from unicellular to multicellular eukaryotes. This article is part of the themed issue 'Reconceptualizing the origins of life'.
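    For scale, the Landauer bound invoked here is a textbook quantity; evaluated at an assumed physiological temperature of 310 K,

        \[
          E_{\min} = k_{B} T \ln 2
          \approx (1.381\times10^{-23}\,\mathrm{J/K})(310\,\mathrm{K})(0.693)
          \approx 3\times10^{-21}\,\mathrm{J\ per\ bit},
        \]

    so an amino acid operation about an order of magnitude above the bound expends free energy on the order of $10^{-20}$ J.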

  13. Ecological specialization and morphological diversification in Greater Antillean boas.

    Science.gov (United States)

    Reynolds, R Graham; Collar, David C; Pasachnik, Stesha A; Niemiller, Matthew L; Puente-Rolón, Alberto R; Revell, Liam J

    2016-08-01

    Colonization of islands can dramatically influence the evolutionary trajectories of organisms, with both deterministic and stochastic processes driving adaptation and diversification. Some island colonists evolve extremely large or small body sizes, presumably in response to unique ecological circumstances present on islands. One example of this phenomenon, the Greater Antillean boas, includes both small (<90 cm) and large (4 m) species occurring on the Greater Antilles and Bahamas, with some islands supporting pairs or trios of body-size divergent species. These boas have been shown to comprise a monophyletic radiation arising from a Miocene dispersal event to the Greater Antilles, though it is not known whether co-occurrence of small and large species is a result of dispersal or in situ evolution. Here, we provide the first comprehensive species phylogeny for this clade combined with morphometric and ecological data to show that small body size evolved repeatedly on separate islands in association with specialization in substrate use. Our results further suggest that microhabitat specialization is linked to increased rates of head shape diversification among specialists. Our findings show that ecological specialization following island colonization promotes morphological diversity through deterministic body size evolution and cranial morphological diversification that is contingent on island- and species-specific factors. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.

  14. Moderate Baseline Vagal Tone Predicts Greater Prosociality in Children

    Science.gov (United States)

    Miller, Jonas G.; Kahle, Sarah; Hastings, Paul D.

    2016-01-01

    Vagal tone is widely believed to be an important physiological aspect of emotion regulation and associated positive behaviors. However, there is inconsistent evidence for relations between children’s baseline vagal tone and their helpful or prosocial responses to others (Hastings & Miller, 2014). Recent work in adults suggests a quadratic association (inverted U-shape curve) between baseline vagal tone and prosociality (Kogan et al., 2014). The present research examined whether this nonlinear association was evident in children. We found consistent evidence for a quadratic relation between vagal tone and prosociality across 3 samples of children using 6 different measures. Compared to low and high vagal tone, moderate vagal tone in early childhood concurrently predicted greater self-reported prosociality (Study 1), observed empathic concern in response to the distress of others and greater generosity toward less fortunate peers (Study 2), and longitudinally predicted greater self-, mother-, and teacher-reported prosociality 5.5 years later in middle childhood (Study 3). Taken together, our findings suggest that moderate vagal tone at rest represents a physiological preparedness or tendency to engage in different forms of prosociality across different contexts. Early moderate vagal tone may reflect an optimal balance of regulation and arousal that helps prepare children to sympathize, comfort, and share with others. PMID:27819463

  15. Abstinence movement in Greater Poland in 1840–1902

    Directory of Open Access Journals (Sweden)

    Izabela Krasińska

    2013-12-01

    Full Text Available The article presents the origins and development of the idea of abstinence in Greater Poland in the 19th century. The start date for the research is 1840, considered a breakthrough year in the history of the organized abstinence movement in Greater Poland, when the Association for the Suppression of the Use of Vodka (Towarzystwo ku Przytłumieniu Używania Wódki) was established in Kórnik, in the Grand Duchy of Posen. It was a secular organization founded on the initiative of Dr. De La Roch, a German surgeon of French origin. As early as 1844, however, the idea of abstinence attracted the interest of the Catholic clergy of Greater Poland, including high-ranking clergy such as Rev. Leon Michał Przyłuski, Archbishop of Gniezno, and Rev. Jan Kanty Dąbrowski, Archbishop of Posen, and later Archbishops Rev. Mieczysław Halka Ledóchowski and Rev. Florian Oksza Stablewski. They were fascinated by the activities of Rev. Jan Nepomucen Fick, parish priest of Piekary Śląskie, and several other priests on whose initiative many church brotherhoods of so-called holy continence were set up in Upper Silesia as early as the first half of 1844. It was due to Bishop Dąbrowski that 100,000 people took vows of abstinence in 1844-1845, becoming members of brotherhoods of abstinence. In turn, on an initiative of Archbishop Przyłuski, the Jesuit missionaries Rev. Karol Bołoz Antoniewicz, Rev. Teofil Baczyński and Rev. Kamil Praszałowicz arrived in Greater Poland from Galicia in 1852 to promote the idea of abstinence. Starting from 1848, they had been helping Silesian clergymen to spread abstinence. Clergymen of Greater Poland were also active in secular abstinence associations. They became involved in the workings of the Association for the Promotion of Abstinence that was set up by Zygmunt Celichowski in Kórnik in 1887, and especially in the Jutrzenka Abstinence Association

  16. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and of the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  17. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  18. ULTRAFISH: generalization of SUPERFISH to m greater than or equal to 1

    International Nuclear Information System (INIS)

    Gluckstern, R.L.; Holsinger, R.F.; Halbach, K.; Minerbo, G.N.

    1982-01-01

    The present version of the SUPERFISH program computes fundamental and higher-order resonant frequencies and the corresponding fields of azimuthally symmetric TE and TM modes (m = 0) in an electromagnetic cavity that is a figure of revolution about a longitudinal axis. We have developed the program ULTRAFISH, which computes the resonant frequencies and fields in such a cavity for azimuthally asymmetric modes (varying as cos(mφ) with m ≥ 1). These modes can no longer be characterized as TE and TM, and they lead to simultaneous equations involving two field components, taken for convenience to be rEφ and rHφ, in terms of which the other four field components are expressed. Several different formulations for solving the equations are being investigated
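    The record does not give ULTRAFISH's actual equations, so the following is only the standard azimuthal ansatz underlying such a formulation: each field component carries a single harmonic, reducing the 3-D cavity problem to a coupled 2-D problem in the (r, z) plane,

        \[
          E_\phi(r,\phi,z) = \tilde{E}_\phi(r,z)\cos(m\phi), \qquad
          H_\phi(r,\phi,z) = \tilde{H}_\phi(r,z)\sin(m\phi),
        \]

    with the two working unknowns $u = r\tilde{E}_\phi$ and $v = r\tilde{H}_\phi$ satisfying coupled equations for $m \ge 1$, and the other four field components recovered algebraically from $u$, $v$, and their derivatives.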

  19. [Cost effectiveness of workplace smoking policies].

    Science.gov (United States)

    Raaijmakers, Tamara; van den Borne, Inge

    2003-01-01

    This study reviews companies' motivations for setting out a smoking control policy, the economic benefits such a policy brings the company, and the costs, broken down by European Union country. The literature on the costs of implementing a workplace smoking policy is reviewed. The main objective of workplace smoking policies is to safeguard employees from environmental tobacco smoke. Other reasons are cutting costs, improving the company image, and reducing absenteeism, occupational accidents, internal quarrels, and the extra costs due to cigarette smoking. Protecting employees against environmental tobacco smoke does not entail higher costs for companies, and the economic advantages are visible. The benefits are by far greater than the costs involved, particularly over the long range, and seem to be greater when smoking at the workplace is completely prohibited and no designated smoking areas are provided.

  20. Costs of fire suppression forces based on cost-aggregation approach

    Science.gov (United States)

    Armando González-Cabán; Charles W. McKetta; Thomas J. Mills

    1984-01-01

    A cost-aggregation approach has been developed for determining the cost of Fire Management Inputs (FMIs), the direct fireline production units (personnel and equipment) used in initial attack and large-fire suppression activities. All components contributing to an FMI are identified, computed, and summed to estimate hourly costs. This approach can be applied to any FMI...
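    The aggregation itself is just an audited sum; a minimal sketch, with entirely illustrative component names and dollar figures:

        # Hedged sketch of the cost-aggregation idea for one FMI.
        def fmi_hourly_cost(components):
            """Hourly cost of a fire management input = sum of its components."""
            return sum(components.values())

        # Assumed components for a hypothetical 20-person hand crew, $/hour:
        hand_crew = {
            "base_wages": 180.0,
            "overtime_and_benefits": 65.0,
            "transport_to_fireline": 40.0,
            "equipment_depreciation": 25.0,
            "supervision_and_overhead": 30.0,
        }
        print(f"Hand crew: ${fmi_hourly_cost(hand_crew):.2f}/hour")  # $340.00/hour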

  1. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  2. A computational model for determining the minimal cost expansion alternatives in transmission systems planning; Um modelo computacional para determinacao de alternativas de expansao de custo minimo em planejamento de sistemas de transmissao

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, L M.V.G.; Pereira, M V.F. [Pontificia Univ. Catolica do Rio de Janeiro, RJ (Brazil); Nunes, A [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1990-12-31

    A computational model for determining an economical transmission expansion plan, based on decomposition techniques, is presented. The algorithm was applied to the Brazilian South System and was able to find an optimal solution with modest computational resources. Two extensions of this methodology are being investigated: a probabilistic version and expansion under financial constraints. (C.G.C.). 4 refs, 7 figs.
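    The record names decomposition but gives no formulation; the structure commonly used for this problem (a hedged sketch, not necessarily the authors' exact model) is a two-level program,

        \[
          \min_{x \in \{0,1\}^{n}} \; c^{\top}x + \alpha(x), \qquad
          \alpha(x) = \min_{y \ge 0} \left\{ d^{\top}y \;:\; \text{operating constraints given the builds } x \right\},
        \]

    where $x$ selects candidate lines at investment cost $c$ and $\alpha(x)$ is the resulting operating cost; the investment master problem and the operation subproblem exchange cuts until a minimal-cost plan is reached.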

  3. Computing in the Clouds

    Science.gov (United States)

    Johnson, Doug

    2010-01-01

    Web-based applications offer teachers, students, and school districts a convenient way to accomplish a wide range of tasks, from accounting to word processing, for free. Cloud computing has the potential to offer staff and students better services at a lower cost than the technology deployment models they're using now. Saving money and improving…

  4. The post-orgasmic prolactin increase following intercourse is greater than following masturbation and suggests greater satiety.

    Science.gov (United States)

    Brody, Stuart; Krüger, Tillmann H C

    2006-03-01

    Research indicates that prolactin increases following orgasm are involved in a feedback loop that serves to decrease arousal through inhibitory central dopaminergic and probably peripheral processes. The magnitude of post-orgasmic prolactin increase is thus a neurohormonal index of sexual satiety. Using data from three studies of men and women engaging in masturbation or penile-vaginal intercourse to orgasm in the laboratory, we report that for both sexes (adjusted for prolactin changes in a non-sexual control condition), the magnitude of prolactin increase following intercourse is 400% greater than that following masturbation. The results are interpreted as an indication of intercourse being more physiologically satisfying than masturbation, and discussed in light of prior research reporting greater physiological and psychological benefits associated with coitus than with any other sexual activities.

  5. Cloud computing for comparative genomics

    Directory of Open Access Journals (Sweden)

    Pivovarov Rimma

    2010-05-01

    Full Text Available Background: Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results: We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions: The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  6. Cloud computing for comparative genomics.

    Science.gov (United States)

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
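    The reported totals imply simple unit costs (assuming, as the abstract suggests but does not state, that all 100 nodes ran for the full 70 hours):

        \[
          \frac{\$6{,}302}{100 \times 70\ \text{node-hours}} \approx \$0.90\ \text{per node-hour},
          \qquad
          \frac{\$6{,}302}{300{,}000\ \text{processes}} \approx \$0.021\ \text{per process}.
        \]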

  7. CERN School of Computing

    CERN Multimedia

    2007-01-01

    The 2007 CERN School of Computing, organised by CERN in collaboration with the University of Split (FESB) will be held from 20 to 31 August 2007 in Dubrovnik, Croatia. It is aimed at postgraduate students and research workers with a few years' experience in scientific physics, computing or related fields. Special themes this year are: GRID Technologies: The Grid track delivers unique theoretical and hands-on education on some of the most advanced GRID topics; Software Technologies: The Software track addresses some of the most relevant modern techniques and tools for large scale distributed software development and handling as well as for computer security; Physics Computing: The Physics Computing track focuses on informatics topics specific to the HEP community. After setting-the-scene lectures, it addresses data acquisition and ROOT. Grants from the European Union Framework Programme 6 (FP6) are available to participants to cover part or all of the cost of the School. More information can be found at...

  8. Computer Augmented Learning; A Survey.

    Science.gov (United States)

    Kindred, J.

    The report contains a description and summary of computer-augmented learning devices and systems. The devices are of two general types: programmed instruction systems, based on the teaching machines pioneered by Pressey and developed by Skinner, and the so-called "docile" systems that permit greater user direction with the computer under student…

  9. Computers for Your Classroom: CAI and CMI.

    Science.gov (United States)

    Thomas, David B.; Bozeman, William C.

    1981-01-01

    The availability of compact, low-cost computer systems provides a means of assisting classroom teachers in the performance of their duties. Computer-assisted instruction (CAI) and computer-managed instruction (CMI) are two applications of computer technology with which school administrators should become familiar. CAI is a teaching medium in which…

  10. Cloud Computing. Technology Briefing. Number 1

    Science.gov (United States)

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  11. Humanitarian information systems and emergencies in the Greater Horn of Africa: logical components and logical linkages.

    Science.gov (United States)

    Maxwell, Daniel; Watkins, Ben

    2003-03-01

    Natural and man-made emergencies are regular occurrences in the Greater Horn of Africa region. The underlying impoverishment of whole populations is increasing, making it more difficult to distinguish between humanitarian crises triggered by shocks and those resulting from chronic poverty. Shocks and hazards can no longer be seen as one-off events that trigger a one-time response. In countries that are both poor and exposed to frequent episodes of debilitating drought or chronic conflict, information needs tend to be different from the straightforward early warning/commodity accounting models of information systems that have proven reliable in past emergencies. This paper describes the interdependent components of a humanitarian information system appropriate for this kind of complex environment, noting the analytical links between the components and operational links to programme and policy. By examining a series of case studies from the Greater Horn region, the paper demonstrates that systems lacking one or more of these components will fail to provide adequate information--and thus incur humanitarian costs. While information always comes with a cost, the price of poor information--or none--is higher. And in situations of chronic vulnerability, in which development interventions are likely to be interspersed with both safety nets and emergency interventions on a recurrent basis, investment in improved information is a good investment from both a humanitarian and a financial viewpoint.

  12. Design to Cost and Life Cycle Cost.

    Science.gov (United States)

    1980-07-01

    MANAGEMENT TASK-ORIENTED COST STRUCTURE (fragment): 2. costs during development; 3. costs during test; 5. costs of construction/infrastructure; 6. costs of training. [...] To achieve cost reduction, effective means of estimating labour and procurement costs are needed. These means must

  13. Sexual predators, energy development, and conservation in greater Yellowstone.

    Science.gov (United States)

    Berger, Joel; Beckmann, Jon P

    2010-06-01

    In the United States, as elsewhere, a growing debate pits national energy policy and homeland security against biological conservation. In rural communities the extraction of fossil fuels is often encouraged because of the employment opportunities it offers, although the concomitant itinerant workforce is often associated with increased wildlife poaching. We explored possible positive and negative factors associated with energy extraction in the Greater Yellowstone Ecosystem (GYE), an area known for its national parks, intact biological diversity, and some of the New World's longest terrestrial migrations. Specifically, we asked whether counties with different economies-recreation (ski), agrarian (ranching or farming), and energy extractive (petroleum)-differed in healthcare (gauged by the abundance of hospital beds) and in the frequency of sexual predators. The absolute and relative frequency of registered sex offenders grew approximately two to three times faster in areas reliant on energy extraction. Healthcare among counties did not differ. The strong conflation of community dishevel, as reflected by in-migrant sexual predators, and ecological decay in Greater Yellowstone is consistent with patterns seen in similar systems from Ecuador to northern Canada, where social and environmental disarray exist around energy boomtowns. In our case, that groups (albeit with different aims) mobilized campaigns to help maintain the quality of rural livelihoods by protecting open space is a positive sign that conservation can matter, especially in the face of rampant and poorly executed energy extraction projects. Our findings further suggest that the public and industry need stronger regulatory action to instill greater vigilance when and where social factors and land conversion impact biological systems.

  14. Practicing more retrieval routes leads to greater memory retention.

    Science.gov (United States)

    Zheng, Jun; Zhang, Wei; Li, Tongtong; Liu, Zhaomin; Luo, Liang

    2016-09-01

    A wealth of research has shown that retrieval practice plays a significant role in improving memory retention. The current study focused on one simple yet rarely examined question: would repeated retrieval using two different retrieval routes or using the same retrieval route twice lead to greater long-term memory retention? Participants elaborately learned 22 Japanese-Chinese translation word pairs using two different mediators. Half an hour after the initial study phase, the participants completed two retrieval sessions using either one mediator (Tm1Tm1) or two different mediators (Tm1Tm2). On the final test, which was performed 1week after the retrieval practice phase, the participants received only the cue with a request to report the mediator (M1 or M2) followed by the target (Experiment 1) or only the mediator (M1 or M2) with a request to report the target (Experiment 2). The results of Experiment 1 indicated that the participants who practiced under the Tm1Tm2 condition exhibited greater target retention than those who practiced under the Tm1Tm1 condition. This difference in performance was due to the significant disadvantage in mediator retrieval and decoding of the unpracticed mediator under the Tm1Tm1 condition. Although mediators were provided to participants on the final test in Experiment 2, decoding of the unpracticed mediators remained less effective than decoding of the practiced mediators. We conclude that practicing multiple retrieval routes leads to greater memory retention than focusing on a single retrieval route. Thus, increasing retrieval variability during repeated retrieval practice indeed significantly improves long-term retention in a delay test. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Taino and African maternal heritage in the Greater Antilles.

    Science.gov (United States)

    Bukhari, Areej; Luis, Javier Rodriguez; Alfonso-Sanchez, Miguel A; Garcia-Bertrand, Ralph; Herrera, Rene J

    2017-12-30

    Notwithstanding the general interest and the geopolitical importance of the island countries in the Greater Antilles, little is known about the specific ancestral Native American and African populations that settled them. In an effort to alleviate this lacuna of information on the genetic constituents of the Greater Antilles, we comprehensively compared the mtDNA compositions of Cuba, Dominican Republic, Haiti, Jamaica and Puerto Rico. To accomplish this, the mtDNA HVRI and HVRII regions, as well as coding diagnostic sites, were assessed in the Haitian general population and compared to data from reference populations. The Taino maternal DNA is prominent in the ex-Spanish colonies (61.3%-22.0%) while it is basically non-existent in the ex-French and ex-English colonies of Haiti (0.0%) and Jamaica (0.5%), respectively. The most abundant Native American mtDNA haplogroups in the Greater Antilles are A2, B2 and C1. The African mtDNA component is almost fixed in Haiti (98.2%) and Jamaica (98.5%), and the frequencies of specific African haplogroups vary considerably among the five island nations. The strong persistence of Taino mtDNA in the ex-Spanish colonies (and especially in Puerto Rico), and its absence in the French and English ex-colonies, is likely the result of different social norms regarding mixed marriages with Taino women during the early years after the first contact with Europeans. In addition, this article reports on the results of an integrative approach based on mtDNA analysis and demographic data that tests the hypothesis of a southward shift in raiding zones along the African west coast during the period encompassing the Transatlantic Slave Trade. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Absorption spectrum of DNA for wavelengths greater than 300 nm

    International Nuclear Information System (INIS)

    Sutherland, J.C.; Griffin, K.P.

    1981-01-01

    Although DNA absorption at wavelengths greater than 300 nm is much weaker than that at shorter wavelengths, this absorption seems to be responsible for much of the biological damage caused by solar radiation of wavelengths less than 320 nm. Accurate measurement of the absorption spectrum of DNA above 300 nm is complicated by turbidity characteristic of concentrated solutions of DNA. We have measured the absorption spectra of DNA from calf thymus, Clostridium perfringens, Escherichia coli, Micrococcus luteus, salmon testis, and human placenta using procedures which separate optical density due to true absorption from that due to turbidity. Above 300 nm, the relative absorption of DNA increases as a function of guanine-cytosine content, presumably because the absorption of guanine is much greater than the absorption of adenine at these wavelengths. This result suggests that the photophysical processes which follow absorption of a long-wavelength photon may, on the average, differ from those induced by shorter-wavelength photons. It may also explain the lower quantum yield for the killing of cells by wavelengths above 300 nm compared to that by shorter wavelengths

  17. Black breast cancer survivors experience greater upper extremity disability.

    Science.gov (United States)

    Dean, Lorraine T; DeMichele, Angela; LeBlanc, Mously; Stephens-Shields, Alisa; Li, Susan Q; Colameco, Chris; Coursey, Morgan; Mao, Jun J

    2015-11-01

    Over one-third of breast cancer survivors experience upper extremity disability. Black women present with factors associated with greater upper extremity disability, including increased body mass index (BMI), more advanced disease stage at diagnosis, and varying treatment type compared with Whites. No prior research has evaluated the relationship between race and upper extremity disability using validated tools and controlling for these factors. Data were drawn from a survey study among 610 women with stage I-III hormone receptor positive breast cancer. The Disabilities of the Arm, Shoulder and Hand short form (QuickDASH) is an 11-item self-administered questionnaire that has been validated for breast cancer survivors to assess global upper extremity function over the past 7 days. Linear regression and mediation analysis estimated the relationships between race, BMI and QuickDASH score, adjusting for demographics and treatment types. Black women (n = 98) had 7.3 points higher average QuickDASH scores than White (n = 512) women (p < 0.05). Higher BMI mediated the association between race and upper extremity disability by 40%. Even several years post-treatment, Black breast cancer survivors had greater upper extremity disability, which was partially mediated by higher BMIs. Close monitoring of high BMI Black women may be an important step in reducing disparities in cancer survivorship. More research is needed on the relationship between race, BMI, and upper extremity disability.

  18. Fast mode decision based on human noticeable luminance difference and rate distortion cost for H.264/AVC

    Science.gov (United States)

    Li, Mian-Shiuan; Chen, Mei-Juan; Tai, Kuang-Han; Sue, Kuen-Liang

    2013-12-01

    This article proposes a fast mode decision algorithm based on the correlation between the just-noticeable-difference (JND) and the rate-distortion cost (RD cost) to reduce the computational complexity of H.264/AVC. First, the relationship between the average RD cost and the number of JND pixels is modeled by Gaussian distributions. The RD cost of the Inter 16 × 16 mode is then compared with the thresholds predicted by these models for fast mode selection. In addition, we use the image content, the residual data, and the JND visual model for horizontal/vertical detection, and then use the result to predict the partitioning of a macroblock. The experimental results show that the proposed algorithm achieves substantial time savings while effectively maintaining rate-distortion performance and visual quality.
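    A minimal sketch of the early-termination test this implies; the Gaussian link between the JND pixel count and the mean RD cost is from the paper, but the function names, the margin parameter k, and the returned labels are illustrative assumptions:

        # Hedged sketch: JND-conditioned threshold test for fast mode decision.
        def predicted_threshold(n_jnd_pixels, mu, sigma, k=1.0):
            # mu/sigma: fitted Gaussian model of average RD cost vs. JND count;
            # k standard deviations added as an assumed safety margin.
            return mu(n_jnd_pixels) + k * sigma(n_jnd_pixels)

        def mode_decision(rd_cost_inter16x16, n_jnd_pixels, mu, sigma, k=1.0):
            if rd_cost_inter16x16 <= predicted_threshold(n_jnd_pixels, mu, sigma, k):
                return "terminate_early"          # accept Inter 16x16, skip the rest
            return "search_smaller_partitions"    # fall back to full mode search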

  19. Ready To Buy a Computer?

    Science.gov (United States)

    Rourke, Martha; Rourke, Patrick

    1974-01-01

    The school district business manager can make sound, cost-conscious decisions in the purchase of computer equipment by developing a list of cost-justified applications for automation, considering the software, writing performance specifications for bidding or negotiating a contract, and choosing the vendor wisely prior to the purchase; and by…

  20. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques that employ matrix factorizations incur a cubic cost
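    The abstract is cut off before the paper's own method, so the following is only a standard low-cost alternative to cubic factorizations: a stochastic (Hutchinson-type) estimator of diag(A^{-1}) that needs just a few dozen linear solves.

        # Hedged sketch: estimate diag(A^{-1}) for a symmetric positive definite A
        # using Rademacher probe vectors and conjugate-gradient solves.
        import numpy as np
        from scipy.sparse.linalg import cg

        def estimate_diag_inverse(A, n_probes=100, seed=0):
            rng = np.random.default_rng(seed)
            n = A.shape[0]
            num = np.zeros(n)
            den = np.zeros(n)
            for _ in range(n_probes):
                v = rng.choice([-1.0, 1.0], size=n)   # random +/-1 probe
                x, info = cg(A, v)                    # solve A x = v iteratively
                num += v * x                          # elementwise products
                den += v * v
            return num / den                          # approximates diag(A^{-1})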